CN113824714A - Vehicle configuration method and system - Google Patents

Vehicle configuration method and system

Info

Publication number
CN113824714A
CN113824714A (application CN202111096734.0A; granted as CN113824714B)
Authority
CN
China
Prior art keywords
user
information
vehicle
real
control data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111096734.0A
Other languages
Chinese (zh)
Other versions
CN113824714B (en)
Inventor
高婧雯 (Gao Jingwen)
吴国斌 (Wu Guobin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Geehy Semiconductor Co Ltd
Original Assignee
Zhuhai Geehy Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Geehy Semiconductor Co Ltd
Priority to CN202111096734.0A (granted as CN113824714B)
Priority to CN202111394591.1A (granted as CN114124528B)
Publication of CN113824714A
Application granted
Publication of CN113824714B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/08 Network architectures or network communication protocols for network security for authentication of entities
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Security & Cryptography (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Mechanical Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Traffic Control Systems (AREA)
  • Lock And Its Accessories (AREA)

Abstract

The present application provides a vehicle configuration system and method. The system comprises a server and a first vehicle. The first vehicle is configured to: perform identity authentication on a first user and, after the authentication succeeds, send the identity information of the first user to the server; and acquire real-time demand information of the first user and send it to the server. The server is configured to: receive the identity information of the first user and acquire first control data of the first user according to the identity information; receive the real-time demand information of the first user; adjust the first control data according to the real-time demand information to obtain second control data; and send the second control data to the first vehicle. The first vehicle is further configured to: receive the second control data sent by the server and configure the driving function components of the first vehicle according to the second control data. With the method and system of the present application, personalized configuration can be performed for a user, improving the user's vehicle experience.

Description

Vehicle configuration method and system
Technical Field
The present application relates to the field of vehicle configuration technologies, and in particular, to a vehicle configuration method and system.
Background
With rising consumption levels, various vehicle usage scenarios have emerged, such as one vehicle shared by multiple people, one person owning multiple vehicles, and shared travel. In these scenarios, how to enable a vehicle to meet the individual requirements of its user, so that the user obtains a better experience, is a problem to be solved.
Disclosure of Invention
The application provides a vehicle configuration method and a vehicle configuration system, which can perform personalized configuration for a user and improve the vehicle using experience of the user.
In a first aspect, an embodiment of the present application provides a vehicle configuration system, where the system includes a server and a first vehicle, wherein:
the first vehicle is configured to: carrying out identity authentication on a first user, and after the identity authentication is successful, sending the identity information of the first user to the server; acquiring real-time demand information of the first user, and sending the real-time demand information to the server; the real-time demand information is used for describing real-time information of preset influence factors;
the server is configured to: receive identity information of the first user, and acquire first control data of the first user according to the identity information of the first user, where the first control data is used for recording configuration information of driving function components in the first vehicle; receive real-time demand information of the first user; adjust the first control data according to the real-time demand information to obtain second control data, where the second control data is used for recording the adjusted configuration information of the driving function components in the first vehicle; and send the second control data to the first vehicle;
the first vehicle is further configured to: receive the second control data sent by the server, and configure the driving function components of the first vehicle according to the second control data.
In the configuration system, the server adjusts the first control data of the first user according to the real-time requirement information of the first user, so that the configuration of the first vehicle is more in line with the personalized requirement of the first user, and the vehicle using experience of the first user is improved.
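As a concrete illustration of this flow, the following Python sketch models the server looking up a user's stored first control data by identity and adjusting it with real-time demand information to produce second control data. The data model, keys, and adjustment rules (`FIRST_CONTROL_DATA`, `emotion`, `exterior_light_lux`) are hypothetical assumptions; the patent does not prescribe any particular representation.

```python
# Hypothetical sketch of the server-side adjustment flow described above.
# All names, keys, and rules are illustrative; the patent fixes no data model.

FIRST_CONTROL_DATA = {
    # identity -> stored configuration of driving function components
    "user-001": {"seat_position": 5, "ac_temperature": 24, "ambient_light": "warm"},
}

def get_first_control_data(identity: str) -> dict:
    """Look up the user's stored first control data by identity information."""
    return dict(FIRST_CONTROL_DATA[identity])

def adjust(first: dict, demand: dict) -> dict:
    """Adjust first control data with real-time demand info -> second control data."""
    second = dict(first)
    if demand.get("emotion") == "angry":             # example rule: calm an angry user
        second["ambient_light"] = "soft_blue"
    if demand.get("exterior_light_lux", 1000) < 50:  # example rule: dark outside
        second["ambient_light_brightness"] = "high"
    return second

def configure_vehicle(identity: str, demand: dict) -> dict:
    """End-to-end: identity -> first control data -> second control data."""
    first = get_first_control_data(identity)
    return adjust(first, demand)
```

The stored data is copied before adjustment, so the user's baseline configuration survives trip-specific overrides.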
In one possible implementation, the first vehicle includes:
the user identification unit is used for carrying out identity authentication on the first user;
the central control unit is used for sending the identity information of the first user to the server after the user identification unit successfully authenticates the identity of the first user;
the user state detection unit is used for acquiring the real-time demand information of the first user after the user identification unit successfully authenticates the identity of the first user, and sending the real-time demand information to the central control unit;
the central control unit is further configured to: sending the real-time demand information to the server; receiving second control data sent by the server; sending a control instruction to a functional domain control unit according to the second control data;
and the function domain control unit is used for configuring corresponding driving function components according to the received control instruction.
In one possible implementation, the user identification unit includes:
the first wireless communication module is used for detecting user equipment of the first user, detecting that the user equipment moves towards the direction of the first vehicle, and acquiring first information of the first user from the user equipment;
and the first authentication module is used for performing identity authentication on the first information acquired by the first wireless communication module.
In one possible implementation, the user identification unit includes: the first wireless MCU chip is provided with the first wireless communication module;
the first wireless MCU is used for: the first wireless communication module detects user equipment of the first user and detects that the user equipment moves towards the direction of the first vehicle, the first wireless communication module acquires first information of the first user from the user equipment, and identity authentication is carried out on the first information acquired by the first wireless communication module.
In one possible implementation, the first wireless communication module is a BLE module, or a UWB module.
In one possible implementation, the user identification unit includes:
the second wireless communication module is used for detecting the user equipment of the first user and acquiring first information of the first user from the user equipment;
and the second authentication module is used for performing identity authentication on the first information acquired by the second wireless communication module.
In one possible implementation, the user identification unit includes: the second wireless MCU chip is provided with the second wireless communication module;
the second wireless MCU chip is used for: detecting user equipment of the first user through the second wireless communication module, acquiring first information of the first user from the user equipment through the second wireless communication module, and performing identity authentication on the first information acquired by the second wireless communication module.
In one possible implementation, the second wireless communication module is an NFC module, or a BLE module, or a UWB module.
In one possible implementation, the real-time demand information includes: picture information and/or video information containing a facial image of the first user, and/or sound information of the first user, and/or finger vein information of the first user, and/or weather information, and/or light intensity information outside the first vehicle.
In a possible implementation manner, a virtual server is provided in the server, and the virtual server is configured to: receive identity information of the first user, and acquire first control data of the first user according to the identity information of the first user, where the first control data is used for recording configuration information of driving function components in the first vehicle; receive real-time demand information of the first user; adjust the first control data according to the real-time demand information to obtain second control data; and send the second control data to the first vehicle.
In one possible implementation, the functional domain control unit includes an MCU chip.
In a second aspect, an embodiment of the present application provides a vehicle configuration method, including:
acquiring identity information of a first user, and acquiring first control data of the first user according to the identity information; the first control data is used for recording configuration information of driving function components in a first vehicle;
acquiring real-time demand information of the first user; the real-time demand information is used for describing real-time information of preset influence factors;
adjusting the first control data according to the real-time demand information to obtain second control data; the second control data is used for recording the adjusted configuration information of the driving function components in the first vehicle;
sending the second control data to the first vehicle, where the sent second control data is used to instruct the first vehicle to configure its driving function components according to the second control data.
In a possible implementation manner, the obtaining first control data of the first user according to the identity information includes:
acquiring a user portrait of the first user according to the identity information; the user portrait is constructed from historical vehicle usage data of the first user;
determining the first control data from the user portrait.
In one possible implementation, the labels of the user portrait include: a first identifier for identifying a first driving function component of a vehicle; and the determining the first control data from the user portrait includes:
searching the labels of the user portrait for a label including the first identifier, and taking the parameter value of the found label as the parameter value corresponding to the first identifier in the first control data.
In one possible implementation, the labels of the user portrait include: a first parameter associated with configuring a second driving function component; and the determining the first control data from the user portrait includes:
calculating, according to the parameter value of the first parameter, the parameter value corresponding to a second identifier in the first control data, where the second identifier is used to identify the second driving function component.
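The two label-based derivations above — direct lookup of a label matching a first identifier, and computing a second component's value from a related first parameter — can be sketched as follows. All label names, identifiers, and the derivation formula are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch: deriving first control data from user-portrait labels.
# Label names, identifiers, and the offset formula are assumptions.

user_portrait = {
    "labels": {
        "seat_position": 5,             # label matching a first identifier directly
        "preferred_cabin_temp": 23.0,   # first parameter used to derive another value
    }
}

def lookup_direct(portrait: dict, identifier: str):
    """Find the label whose name includes the identifier; use its value as-is."""
    for name, value in portrait["labels"].items():
        if identifier in name:
            return value
    return None

def derive_ac_setting(portrait: dict) -> float:
    """Compute a second component's parameter (AC target) from a related label."""
    preferred = portrait["labels"]["preferred_cabin_temp"]
    return preferred + 1.0  # illustrative offset, not a real formula

first_control_data = {
    "seat_position": lookup_direct(user_portrait, "seat_position"),
    "ac_temperature": derive_ac_setting(user_portrait),
}
```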
In one possible implementation, the real-time demand information includes: picture information and/or video information containing a facial image of the first user, and/or sound information of the first user; and the adjusting the first control data according to the real-time demand information includes:
determining the emotion type of the first user according to the picture information and/or the video information and/or the sound information;
adjusting the first control data according to the emotion type of the first user.
In one possible implementation, the real-time demand information includes: finger vein information of the first user; and the adjusting the first control data according to the real-time demand information includes:
determining the health type of the first user according to the finger vein information of the first user;
adjusting the first control data according to the health type of the first user.
In one possible implementation, the real-time demand information includes: weather information and/or light intensity information outside the first vehicle; and the adjusting the first control data according to the real-time demand information includes:
adjusting the first control data according to the weather information and/or the light intensity information.
In one possible implementation manner, the method further includes: and determining that the first user is in an angry state according to the emotion type of the first user, and controlling the first vehicle to display first prompt information.
In one possible implementation manner, the method further includes: and determining that the first user is in an exhausted state according to the health type of the first user, and controlling the first vehicle to display second prompt information.
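The weather- and light-based adjustment described above might be realized with simple rules, as in the sketch below. The specific rules and the 50-lux threshold are assumptions for illustration; the patent does not specify them.

```python
# Hypothetical environment-driven adjustment rules; keys and thresholds are assumptions.

def adjust_for_environment(first: dict, weather: str, exterior_lux: float) -> dict:
    """Adjust first control data using weather and exterior light intensity."""
    second = dict(first)
    if weather == "rain":            # example rule: rain -> wipers on, windows closed
        second["wipers"] = "auto"
        second["windows"] = "closed"
    if exterior_lux < 50.0:          # example rule: dark outside -> headlights on
        second["headlights"] = "on"
    return second
```

For example, a rainy night would both enable the wipers and turn the headlights on, while a clear day leaves the first control data unchanged.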
In a third aspect, an embodiment of the present application provides a vehicle configuration method, including:
carrying out identity authentication on a first user, and sending identity information of the first user to a server after the identity authentication is successful;
acquiring real-time demand information of the first user, and sending the real-time demand information to the server; the real-time demand information is used for describing real-time information of preset influence factors;
receiving second control data sent by the server; the second control data is obtained by adjusting the first control data corresponding to the identity information by the server according to the real-time demand information; the first control data is used for recording configuration information of driving functional components in the first vehicle; the second control data are used for recording adjusted configuration information of a driving function component in the first vehicle;
configuring the driving function components of the first vehicle according to the second control data.
In one possible implementation manner, the authenticating the first user includes:
detecting, based on Bluetooth or UWB, the user equipment of the first user and, upon detecting that the user equipment is moving towards the first vehicle, acquiring first information of the first user from the user equipment;
performing identity authentication on the first information.
In one possible implementation manner, the authenticating the first user includes:
detecting, based on NFC, Bluetooth, or UWB, the user equipment of the first user, and acquiring first information of the first user from the user equipment;
performing identity authentication on the first information.
In a possible implementation manner, the acquiring the real-time demand information of the first user includes:
acquiring picture information and/or video information containing a facial image of the first user; and/or
acquiring sound information of the first user; and/or
acquiring finger vein information of the first user; and/or
acquiring weather information; and/or
acquiring light intensity information outside the first vehicle.
In a fourth aspect, the present application provides a computer program which, when executed by a computer, performs the method of the first aspect.
In a possible design, the program in the fourth aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of a system architecture for a vehicle configuration method according to the present application;
FIG. 2 is a flow chart of one embodiment of a vehicle configuration method of the present application;
FIG. 3 is a flow chart of yet another embodiment of a vehicle configuration method of the present application;
FIG. 4 is a flow chart of yet another embodiment of a vehicle configuration method of the present application;
FIG. 5 is a block diagram of an embodiment of a first vehicle according to the present application;
FIG. 6 is a block diagram of one embodiment of a server of the present application;
FIG. 7 is a block diagram of one embodiment of a vehicle configuration system of the present application;
FIG. 8 is a block diagram of another embodiment of a vehicle configuration system according to the present application.
Detailed Description
The terminology used in the description of the embodiments section of the present application is for the purpose of describing particular embodiments of the present application only and is not intended to be limiting of the present application.
The application provides a vehicle configuration method and a vehicle configuration system, which can perform personalized configuration for a user and improve the vehicle using experience of the user.
In the embodiments of the present application, "user" refers to a vehicle user.
Fig. 1 shows the system architecture to which the vehicle configuration method of the present application is applied, which includes: a server 110, a first vehicle 120, and a user device 130.
The server 110 may be a single server, or may be a server cluster formed by a plurality of servers. Alternatively, the server 110 may be a cloud server.
Alternatively, in order to increase the processing speed of the server 110 on the vehicle data, a virtual server may be provided in the server 110, the virtual server being configured to interact with the vehicle and process the vehicle data.
If the amount of vehicle data that the server 110 needs to process is relatively small, the server 110 may set up a single virtual server to process all the vehicle data.
If the amount of vehicle data that the server 110 needs to process is relatively large, a plurality of virtual servers may be deployed in the server 110, with different virtual servers processing the vehicle data of different vehicles, thereby increasing the processing speed. In this case, the vehicles may be grouped in advance, and each virtual server processes the vehicle data of one group. The embodiments of the present application do not limit the method used to group the vehicles. For example, if the server 110 needs to process the vehicle data of all shared cars of a car-sharing company, and the shared cars are divided into types such as ordinary, business, and VIP, then a virtual server may be provided in the server 110 for each type of shared car, and the server 110 may forward vehicle data to the corresponding virtual server according to the vehicle type: for example, vehicle data from an ordinary shared car is sent to virtual server 1 for processing, vehicle data from a business shared car is sent to virtual server 2 for processing, and so on.
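The per-type routing in the shared-car example might be sketched as follows. The vehicle types, the `VirtualServer` class, and the dispatch table are hypothetical names for illustration only.

```python
# Hypothetical sketch of routing vehicle data to per-type virtual servers,
# following the ordinary/business/VIP shared-car example above.

VEHICLE_TYPE = {"car-1": "ordinary", "car-2": "business", "car-3": "vip"}

class VirtualServer:
    """Stand-in for a virtual server that processes one group's vehicle data."""
    def __init__(self, name: str):
        self.name = name
        self.processed = []

    def process(self, data: dict) -> None:
        self.processed.append(data)

virtual_servers = {t: VirtualServer(t) for t in ("ordinary", "business", "vip")}

def dispatch(vehicle_id: str, data: dict) -> str:
    """Route a vehicle's data to the virtual server for its type."""
    vtype = VEHICLE_TYPE[vehicle_id]
    virtual_servers[vtype].process(data)
    return vtype
```

Grouping by a static type table keeps the dispatch step O(1) per message; other grouping keys (region, fleet) would work the same way.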
The user device 130 is an electronic device of the first user, which may be, for example, a cell phone, a key of a vehicle, etc.
Wireless communication may be performed between the user device 130 and the first vehicle 120. Optionally, the user device 130 and the first vehicle 120 may each be provided with a corresponding wireless communication module to realize the wireless communication between them, such as a BLE module, and/or a UWB module, and/or an NFC module, and/or a Bluetooth module.
The server 110 and the first vehicle 120 may communicate through a wireless communication technology such as LTE or 5G.
Optionally, the server 110 and the user device 130 may also communicate through a wireless communication technology such as LTE or 5G.
FIG. 2 is a flow chart of one embodiment of a vehicle configuration method of the present application, which may be performed by a first vehicle. As shown in fig. 2, the method may include:
step 201: and carrying out identity authentication on the first user, and sending the identity information of the first user to the server after the identity authentication is successful.
Optionally, the identity information of the first user may include: an identity of the first user, the identity for uniquely identifying the first user.
Step 202: acquiring real-time demand information of a first user, and sending the real-time demand information to a server; the real-time demand information is used for describing real-time information of influence factors of the first user driving the first vehicle.
Step 202 may be executed after the first vehicle successfully authenticates the identity of the first user; the execution order between step 202 and the step in step 201 of sending the identity information of the first user to the server is not limited.
Step 203: receiving second control data sent by the server; the second control data is obtained by the server adjusting, according to the real-time demand information, the first control data corresponding to the identity information; the first control data is used for recording configuration information of driving function components in the first vehicle; the second control data is used for recording the adjusted configuration information of the driving function components in the first vehicle.
The driving function components of a vehicle are the software functions and/or hardware components provided by the vehicle that can be operated by a user. Software functions may include, for example, automated driving, a music player, and air-conditioning temperature adjustment; hardware components may include windows, atmosphere lamps, seats, rearview mirrors, and the like.
Step 204: the driving function components of the first vehicle are configured according to the second control data.
In the method shown in fig. 2, the first vehicle sends the real-time demand information of the first user to the server, so that the server can adjust the first control data accordingly, that is, adjust the configuration information of the driving function components for the first user's current trip in the first vehicle. The configuration information thus comes closer to the first user's personalized requirements for this trip, improving the first user's driving experience.
FIG. 3 is a flow chart of one embodiment of a vehicle configuration method of the present application, where the method of FIG. 3 may be performed by a server. As shown in fig. 3, the method may include:
step 301: the identity information of the first user is obtained, and first control data of the first user are obtained according to the identity information.
The first vehicle may send the identity information of the first user to the server after the identity authentication of the first user is successful, so that the server may obtain the identity information of the first user.
Optionally, obtaining the first control data of the first user according to the identity information may include:
acquiring a user portrait of the first user according to the identity information; the user portrait is constructed according to historical vehicle usage data of the first user;
determining the first control data from the user portrait.
Step 302: acquiring real-time demand information of the first user; the real-time demand information is used for describing real-time information of preset influence factors.
The first vehicle can send the real-time demand information of the first user to the server after the identity authentication of the first user is successful, so that the server can obtain the real-time demand information of the first user.
The real-time demand information may include, but is not limited to: picture information and/or video information comprising an image of the face of the first user, and/or sound information of the first user, and/or finger vein information of the first user, and/or weather information, and/or light intensity information outside the first vehicle.
Step 303: and adjusting the first control data according to the real-time demand information to obtain second control data.
Step 304: sending the second control data to the first vehicle, where the sent second control data is used to instruct the first vehicle to configure its driving function components according to the second control data.
In the method shown in fig. 3, the server adjusts the first control data according to the real-time demand information of the first user, that is, it adjusts the configuration information of the driving function components for the first user's current trip in the first vehicle. The configuration information thus comes closer to the first user's personalized requirements for this trip, improving the first user's driving experience.
FIG. 4 is a flow chart of one embodiment of a vehicle configuration method of the present application, as shown in FIG. 4, which may include:
step 401: the first vehicle detects user equipment of a first user and receives first information sent by the user equipment.
Optionally, a wireless communication module may be disposed in the first vehicle. Based on this module, the first vehicle can detect whether any user equipment is approaching it and can establish a wireless connection with that user equipment. The user equipment of the first user may be provided with a wireless communication module corresponding to the one in the first vehicle. In one possible implementation, the first vehicle periodically sends a broadcast signal through its wireless communication module; after the user equipment of the first user approaches the first vehicle, if the user equipment receives the broadcast signal through its own wireless communication module, a wireless connection is established between the user equipment and the first vehicle through the two modules. Once the wireless connection is established, the user equipment and the first vehicle can exchange data; for example, the first vehicle can receive the first information sent by the user equipment.
How the first vehicle detects the user equipment and establishes a wireless connection with it depends on the wireless communication modes supported by the wireless communication modules. For example, a Bluetooth Low Energy (BLE) module, or a Bluetooth module, may be disposed in each of the first vehicle and the user equipment, so that the user equipment can be detected and a wireless connection established based on Bluetooth technology; an Ultra-Wideband (UWB) module may be disposed in each of the first vehicle and the user equipment, so that the user equipment can be detected and a wireless connection established based on UWB technology; or a Near Field Communication (NFC) module may be disposed in each of the first vehicle and the user equipment, so that the user equipment can be detected and a wireless connection established based on NFC technology.
In some scenarios, the first user may carry the user equipment into the signal coverage of the wireless communication module, causing the first vehicle to detect the user equipment, establish a wireless connection, authenticate successfully, and unlock, even though the first user did not intend to unlock the first vehicle; such an unlocking operation is a false unlock. To reduce the probability of a false unlock, after the first vehicle detects the user equipment of the first user in this step, and before it establishes a wireless connection with the user equipment, the first vehicle may also track the motion trajectory of the user equipment (which also represents the motion trajectory of the first user). Only if the trajectory shows that the distance between the user equipment and the first vehicle is gradually decreasing, i.e., the user equipment is moving towards the first vehicle, are the subsequent steps, such as establishing the wireless connection, performed.
To further reduce the probability of false unlocking, the first vehicle may perform the subsequent steps of establishing a wireless connection and the like only after determining both that the distance between the user equipment and the first vehicle is gradually decreasing and that this distance is smaller than a preset threshold value.
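As an illustrative sketch (not part of the patent text), the approach condition described above — distance gradually decreasing and below a preset threshold — could be checked over a series of ranging samples as follows; the threshold value and function name are hypothetical:

```python
from typing import Sequence

APPROACH_THRESHOLD_M = 3.0  # hypothetical preset distance threshold, in meters

def should_establish_connection(distances: Sequence[float],
                                threshold: float = APPROACH_THRESHOLD_M) -> bool:
    """Return True only when successive UE-to-vehicle distance samples are
    strictly decreasing (the user is approaching) and the latest sample is
    below the preset threshold; otherwise the vehicle skips the wireless
    connection step to avoid a false unlocking."""
    if len(distances) < 2:
        return False
    approaching = all(later < earlier
                      for earlier, later in zip(distances, distances[1:]))
    return approaching and distances[-1] < threshold
```

The distance samples would come from the Bluetooth or UWB positioning described below.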
The detection of the motion track of the user equipment can be realized based on positioning technologies such as Bluetooth positioning and UWB positioning. Specifically, if the BLE module or the bluetooth module is respectively arranged in the first vehicle and the user equipment, the first vehicle may position the position of the user equipment relative to the first vehicle by using a bluetooth positioning technology, and the motion trajectory of the user equipment may be determined by continuously positioning the position of the user equipment relative to the first vehicle; similar to the implementation manner based on the bluetooth positioning technology, if the UWB modules are respectively disposed in the first vehicle and the user equipment, the first vehicle may determine the motion track of the user equipment by using the UWB positioning technology.
Step 402: the first vehicle performs identity authentication on the first information; if the authentication is successful, step 403 is executed.
Optionally, the first information may include identity information of the first user, where the identity information is used to uniquely identify the first user.
Optionally, in order to improve the security of the identity authentication between the first vehicle and the user equipment, the user equipment may encrypt the identity information used for the identity authentication, and the first information may be data obtained by encrypting the identity information of the first user.
In one possible implementation manner, the first vehicle may store identity authentication information of a user having a right to use the first vehicle, and the first vehicle may perform identity authentication on the received first information based on the stored identity authentication information.
In another possible implementation manner, the first vehicle may send the received first information and the device identifier of the first vehicle to the server, and the server performs identity authentication on the first information and sends an identity authentication result to the first vehicle.
It should be noted that, if the first information is encrypted data, the vehicle or the server may need to decrypt the first information before performing identity authentication. The encryption and decryption methods are not limited in the embodiments of the present application.
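Since the patent leaves the encryption and decryption methods open, the sketch below stands in an HMAC tag for the unspecified protection of the identity information; the shared key, helper names and tag format are all assumptions for illustration:

```python
import hashlib
import hmac

SHARED_KEY = b"demo-provisioned-key"  # hypothetical key shared by UE and vehicle

def make_first_information(user_id: str, key: bytes = SHARED_KEY) -> bytes:
    """User-equipment side: bind the identity information to an HMAC tag."""
    tag = hmac.new(key, user_id.encode(), hashlib.sha256).hexdigest()
    return f"{user_id}:{tag}".encode()

def authenticate_first_information(first_info: bytes, authorized_users: set,
                                   key: bytes = SHARED_KEY) -> bool:
    """Vehicle (or server) side: verify the tag, then check that the user
    has the right to use the first vehicle."""
    user_id, _, tag = first_info.decode().partition(":")
    expected = hmac.new(key, user_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected) and user_id in authorized_users
```

A real deployment would use a proper key-provisioning and encryption scheme; this only illustrates the verify-then-authorize flow.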
If the first vehicle fails to authenticate the first information, a rejection message can be sent to the user equipment, and the user equipment can display a prompt interface for prompting the user that the authentication fails.
Step 403: the first vehicle controls unlocking of the vehicle door.
Step 404: the first vehicle sends a first request message to the server, the first request message comprising: identity information of the first user.
The identity information of the first user may include: identification information of the first user.
Step 405: the server searches for first control data corresponding to the identity information.
The first control data is used to record configuration information for the driving function components in the first vehicle, for example: whether the air conditioner is on or off, the adjustment position of the seat, the adjustment position of the rearview mirrors, and so on.
Each time the first user drives a vehicle, the driven vehicle may send the usage data generated while the first user uses the vehicle to the server; this data is referred to as vehicle usage data. The server may store the vehicle usage data of the first user as historical vehicle usage data of the first user, construct a user portrait of the first user based on this historical vehicle usage data, and determine the first control data of the first user based on the user portrait.
The user portrait is a set of tags obtained by abstracting each piece of concrete information about the user into a tag. In the user portrait of the first user constructed from the vehicle usage data, the tags may include identifiers of individual driving function components, or parameters associated with configuring the driving function components. For example, the tags of the user portrait constructed in the embodiments of the present application may include, but are not limited to: air conditioner, left rearview mirror, right rearview mirror, vehicle window, music player, vehicle-mounted refrigerator, indoor brightness, and so on; tags such as the air conditioner, the left rearview mirror and the right rearview mirror correspond to driving function components, while indoor brightness is a parameter related to configuring the atmosphere lamps.
The vehicle usage data may include: operational behavior information of the vehicle, and/or travel state information of the vehicle, and the like. The operation behavior information of the vehicle may include configuration operation of the driving function part by the user during use of the vehicle, operation time, information configured for the driving function part, and the like. Taking an air conditioner as an example, the operation behavior information may include: the air conditioner is turned on or off, and the turning-on time or the turning-off time of the air conditioner, the temperature set by the air conditioner, the starting time and the ending time for setting the temperature, the wind power set by the air conditioner, the starting time and the ending time for setting the wind power, and the like. The running state information of the vehicle may include: the vehicle's travel route, start travel time, end travel time, etc.
When the user portrait of the first user is constructed from the historical vehicle usage data, for each tag included in the user portrait, the information corresponding to that tag can be extracted from the historical vehicle usage data, and the user's preferred setting for the information represented by the tag calculated accordingly, yielding the parameter value of that tag in the user portrait. For example, for the tag "air conditioner", the temperature and wind level the user set on each trip may be obtained from the historical vehicle usage data, along with the corresponding start and end times. From these, the user's preferred air-conditioner settings and the time at which the user habitually turns on the air conditioner after starting the car (which may be represented, for example, as a delay relative to the start time of each trip) can be calculated, thereby obtaining the parameter value of the tag "air conditioner" in the user portrait.
Based on the above description, an example user portrait of a first user might include:
air conditioner: 22 degrees, wind level 2; indoor brightness: xx; left rearview mirror: position 1; right rearview mirror: position 2; sunroof: open, with the sunshade open; vehicle window: closed; radio: off; music player: on, equalizer parameters XXX; vehicle-mounted refrigerator: 4 degrees; average vehicle speed: 53; maximum vehicle speed: 130; steering wheel quick-turn resistance: xxx; automatic driving function: off; and so on. Tags such as the air conditioner, the left rearview mirror and the right rearview mirror correspond to driving function components, while indoor brightness is a parameter related to setting the atmosphere lamps.
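A minimal sketch of the tag-value aggregation described above, for the "air conditioner" tag; the field names and the simple averaging rule are illustrative assumptions, not the patent's algorithm:

```python
from statistics import mean

def build_air_conditioner_tag(history: list) -> dict:
    """Aggregate per-trip air-conditioner records from the historical vehicle
    usage data into one user-portrait tag value: the preferred temperature,
    preferred wind level, and the habitual delay (seconds after starting the
    car) before the user turns the air conditioner on."""
    return {
        "temperature": round(mean(trip["temp"] for trip in history), 1),
        "wind_level": round(mean(trip["wind"] for trip in history)),
        "on_delay_s": round(mean(trip["on_delay_s"] for trip in history)),
    }
```

The server would compute one such value per tag, across all tags in the portrait.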
It should be noted that, as the first user continues to use vehicles, the historical vehicle usage data of the first user stored on the server continues to grow. Accordingly, after the user portrait of the first user is constructed, the server may optimize it based on a self-learning algorithm using the vehicle usage data received each time, so that the user portrait comes ever closer to the first user's preferred vehicle settings.
The server may determine the first control data of the first user based on the user portrait of the first user. This may specifically include:
if the tags of the user portrait include: a first identifier used to identify a first driving function component of the vehicle, then determining the first control data from the user portrait may include: searching the user portrait tags for the parameter value corresponding to the first identifier, and taking the found parameter value as the parameter value corresponding to the first identifier in the first control data; and/or,
if the tags of the user portrait include: a first parameter associated with configuring a second driving function component, then determining the first control data from the user portrait may include: calculating the parameter value corresponding to a second identifier in the first control data according to the parameter value of the first parameter, where the second identifier is used to identify the second driving function component.
Here, the first driving function component and the second driving function component may each be any driving function component of the first vehicle.
For example, from the user portrait of the first user in the foregoing example, the server may determine the first control data of the first user as: air conditioner: 22 degrees, wind level 2; atmosphere lamp 1: on; atmosphere lamp 2: off; left rearview mirror: position 1; right rearview mirror: position 2; sunroof: open, with the sunshade open; vehicle window: closed; radio: off; music player: on, equalizer parameters XXX; vehicle-mounted refrigerator: 4 degrees; steering wheel quick-turn resistance: xxx; and so on.
In the first control data above, indoor brightness is a parameter related to configuring the atmosphere lamps, and whether atmosphere lamps 1 and 2 are turned on or off is determined based on the indoor brightness in the user portrait. For example, if the indoor brightness in the user portrait is bright, it may be determined that atmosphere lamps 1 and 2 are both turned on; if the indoor brightness in the user portrait is dark, it may be determined that both are turned off; and so on.
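The mapping from portrait tags to first control data — direct copies for component tags, plus a derived value for the atmosphere lamps — could look like the following sketch (tag names and the brightness rule are assumptions for illustration):

```python
def portrait_to_control_data(portrait: dict) -> dict:
    """Copy component tags straight into the first control data, and derive
    the atmosphere-lamp states from the 'indoor_brightness' parameter tag."""
    control = {tag: value for tag, value in portrait.items()
               if tag != "indoor_brightness"}
    lamps_on = portrait.get("indoor_brightness") == "bright"
    control["atmosphere_lamp_1"] = "on" if lamps_on else "off"
    control["atmosphere_lamp_2"] = "on" if lamps_on else "off"
    return control
```

Component tags pass through unchanged; only parameter tags like indoor brightness need a derivation rule.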
Step 406: the first vehicle detects that the first user is seated in the main driving seat, acquires the real-time demand information of the first user, and sends it to the server.
The real-time demand information may include, but is not limited to: video information of the first user, and/or sound information of the first user, and/or finger vein information of the first user, and/or weather information, and/or light intensity information outside the first vehicle.
The finger vein information may include: heart rate, and/or blood pressure, etc.
After the door of the first vehicle is unlocked, the first user enters the first vehicle and sits in the main driving seat to drive the first vehicle. The first vehicle can be provided with a camera for acquiring picture information and/or video information of a first user; and/or a microphone can be arranged in the first vehicle and used for collecting the voice information of the first user; and/or a finger vein recognition unit can be arranged in the first vehicle and used for collecting finger vein information of the first user; and/or, the first vehicle may be provided with a wireless communication module, and weather information is acquired from a network in a manner of LTE, 5G, and the like; and/or a light sensor can be arranged in the first vehicle and used for acquiring the light intensity outside the first vehicle.
The camera may be disposed in front of and above the driver's seat in the first vehicle to capture pictures and/or video images including the facial image of the first user; the microphone may be disposed near the driver's seat to collect the sound signal of the first user; the finger vein recognition unit may be disposed on the steering wheel, specifically on the part of the steering wheel usually touched by the driver, to detect the finger vein information of the first user, such as heart rate and blood pressure.
Optionally, for the picture information, video information, sound information, finger vein information, etc. of the first user, the first vehicle may acquire this information within a period of time when the first user first begins driving the vehicle; in that case the server determines the second control data for when the first user initially uses the first vehicle, and configures the first vehicle accordingly. Alternatively, the information may be acquired periodically or continuously while the first user uses the first vehicle, so that the server can re-determine the second control data when the real-time demand information of the first user changes, and reconfigure the first vehicle.
Step 407: the server adjusts the parameter values of the driving function components in the first control data according to the real-time demand information, obtaining the second control data for the first vehicle.
If the real-time demand information includes video information, the server may input the video information into a preset first model to obtain the emotion type of the user corresponding to the video information. The first model detects the user's emotion type from the user's facial images in the video information; the emotion types may include, but are not limited to: happy, angry, depressed, sad, tired, and so on.
The first model may be obtained by pre-training. The training samples may be video segments containing a user's facial images, each labeled with an emotion type; a large number of such samples are input into the original model to train it, yielding the first model. The input of the first model is a video segment and the output is an emotion type.
If the real-time demand information includes picture information, the server may input the picture information into a preset second model to obtain the emotion type of the user corresponding to the picture information. The second model detects the user's emotion type from the user's facial image in the picture information; the emotion types may include, but are not limited to: happy, angry, depressed, sad, tired, and so on.
The second model may be obtained by pre-training. The training samples may be picture information containing users' facial images, each labeled with an emotion type; a large number of such samples are input into the original model to train it, yielding the second model. The input of the second model is picture information and the output is an emotion type. Optionally, the picture information may also be a video frame from the video information.
If the real-time demand information includes sound information, the server may input the sound data into a preset third model to obtain the emotion type of the user corresponding to the sound data. The third model detects the user's emotion type from the sound data; the emotion types may include, but are not limited to: happy, angry, depressed, sad, tired, and so on.
The third model may be obtained by pre-training. The training samples may be sound segments, each labeled with an emotion type; a large number of such samples are input into a preset original model to train it, yielding the third model. The input of the third model is a sound segment and the output is an emotion type.
Alternatively, the original model used to train the model may be a neural network model.
It should be noted that, if the real-time demand information includes both video information and sound information, the emotion type corresponding to each may be determined in the manner above and the first control data adjusted accordingly. Alternatively, an overall emotion type of the first user may be calculated from the emotion type corresponding to the video information and the emotion type corresponding to the sound information, and the first control data adjusted according to the calculated emotion type; for example, a weight or priority may be set for each emotion type, and the emotion type of the first user determined according to those weights or priorities. If the real-time demand information includes picture information and sound information, or picture information, video information and sound information, the implementation is analogous to the case of video information plus sound information and is not repeated here.
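One way to realize the priority-based fusion just mentioned is sketched below; the priority values assigned to each emotion type are purely illustrative assumptions:

```python
# Hypothetical priorities: a higher value means the emotion dominates the fusion.
EMOTION_PRIORITY = {"angry": 3, "tired": 2, "depressed": 2, "sad": 1, "happy": 0}

def fuse_emotion_types(per_modality: dict) -> str:
    """Given per-modality results (e.g. {'video': 'happy', 'sound': 'angry'}),
    return the highest-priority emotion type as the first user's emotion."""
    return max(per_modality.values(),
               key=lambda emotion: EMOTION_PRIORITY.get(emotion, 0))
```

A weighted-average scheme over emotion scores would be an equally valid reading of the text.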
If the real-time demand information includes finger vein information, the health type of the user may be determined from it; health types may include, but are not limited to: healthy and abnormal. For example, if the user's heart rate exceeds a preset threshold, or the systolic ("high") blood pressure exceeds a maximum threshold, or the diastolic ("low") blood pressure falls below a minimum threshold, the user's health type is judged to be abnormal; otherwise, the user is judged to be healthy.
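The threshold check described above can be sketched as follows; all three threshold values are hypothetical, not values from the patent:

```python
HEART_RATE_MAX = 100   # hypothetical preset heart-rate threshold (bpm)
SYSTOLIC_MAX = 140     # hypothetical maximum "high pressure" threshold (mmHg)
DIASTOLIC_MIN = 60     # hypothetical minimum "low pressure" threshold (mmHg)

def health_type(heart_rate: float, systolic: float, diastolic: float) -> str:
    """Classify the finger-vein readings as 'abnormal' if any value crosses
    its preset threshold, and as 'healthy' otherwise."""
    if (heart_rate > HEART_RATE_MAX
            or systolic > SYSTOLIC_MAX
            or diastolic < DIASTOLIC_MIN):
        return "abnormal"
    return "healthy"
```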
Optionally, when the parameter values of the driving function components in the first control data are adjusted according to the real-time demand information, each kind of real-time demand information may affect the parameter values of some or all driving function components, and different kinds of real-time demand information may affect the parameter values of the same or different driving function components. For example:
the user's emotion type can influence the parameter values of driving function components such as the steering wheel quick-turn resistance and the music player (e.g., on/off state, volume and equalizer). For example, if the user's emotion type is calm, the steering wheel quick-turn resistance need not be adjusted; if the emotion type is angry or excited, the quick-turn resistance may be increased to prevent the user from turning the steering wheel too quickly under intense emotion and causing danger; if the emotion type is depressed or tired, the music player may be turned on to play songs that ease the user's mood; and so on;
the user's health type may affect the parameter values of driving function components such as the steering wheel quick-turn resistance, the music player (e.g., volume and equalizer), and the automatic driving function. For example, if the user's health type is abnormal, the steering wheel quick-turn resistance may be increased and the vehicle speed reduced, to help ensure the user's driving safety.
The light intensity can affect the parameter values of the atmosphere lamps, the front lighting lamps of the vehicle, and the like. For example, if atmosphere lamp 1 and atmosphere lamp 2 were determined to be on based on the indoor brightness, then, with reference to the light intensity outside the first vehicle, their parameter values may be adjusted to off when the outside light is strong.
The weather information can influence the parameter values of driving function components such as the air conditioner, the sunroof and the vehicle windows. For example, if the sunroof and windows are open in the first control data but the weather information indicates heavy rain, the parameter values of the sunroof and the windows in the first control data may be adjusted to closed.
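The environment-driven adjustments in the last two examples could be applied to the first control data as in the sketch below (key names and condition values are illustrative assumptions):

```python
def adjust_for_environment(control: dict, weather: str,
                           outside_light: str) -> dict:
    """Return a copy of the first control data with weather- and
    light-driven overrides applied, yielding (part of) the second
    control data."""
    adjusted = dict(control)
    if weather == "heavy_rain":          # close sunroof and windows in rain
        adjusted["sunroof"] = "closed"
        adjusted["window"] = "closed"
    if outside_light == "strong":        # strong exterior light: lamps off
        adjusted["atmosphere_lamp_1"] = "off"
        adjusted["atmosphere_lamp_2"] = "off"
    return adjusted
```

Working on a copy keeps the first control data intact for future re-adjustments.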
It should be noted that, because the first user's identity authentication succeeded in the foregoing steps, in step 406 the first vehicle assumes that the user seated in the main driving seat is the first user. However, an unauthorized user might steal the first information of the first user, pass identity authentication, and unlock the first vehicle. To improve the security of the first vehicle, after the server receives the real-time demand information in this step, if it includes video information of the user, the server may determine whether the facial image of the driver in the video information is consistent with the preset facial image of the first user. If they are consistent, the driver is the first user, i.e. the first user is driving the first vehicle, and the server adjusts the parameter values of the driving function components in the first control data according to the real-time demand information. If they are not consistent, the driver is not the first user, and the server may send a warning message to the user device of the first user prompting that the first user's information has been stolen, that the first vehicle is being used without authorization, and so on, thereby improving the security of both the first user's information and the first vehicle.
Optionally, in addition to identifying an unauthorized user by comparing facial images, if the real-time demand information includes sound information, the server may identify an unauthorized user by comparing the voiceprint in the sound information with the preset voiceprint of the first user.
Optionally, in addition to collecting the real-time demand information, the first vehicle may include a fingerprint identification module for collecting the driver's fingerprint, so that the driver's fingerprint information can be sent to the server in step 406 and the server can additionally identify an unauthorized user through fingerprint comparison before executing this step.
Optionally, in addition to collecting the real-time demand information, the first vehicle may include an iris recognition module for collecting the driver's iris, so that the driver's iris information can be sent to the server in step 406 and the server can additionally identify an unauthorized user through iris comparison before executing this step.
It should be noted that two or more of the camera, the microphone, the fingerprint identification module and the iris identification module may be disposed in the first vehicle, so that two or more kinds of the video information, sound information, fingerprint information and iris information are sent to the server. The server may then identify an unauthorized user from this information: once any one kind of information fails its comparison, the driver may be identified as an unauthorized user and the reminder sent to the user device of the first user.
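The "any mismatch flags an unauthorized user" rule can be sketched as a simple all-modalities check; the modality names and template representation are hypothetical:

```python
def verify_driver(provided: dict, enrolled: dict) -> bool:
    """Accept the driver only if every provided biometric modality (face,
    voiceprint, fingerprint, iris, ...) matches the enrolled template of
    the first user; a single mismatch identifies an unauthorized user."""
    return all(enrolled.get(modality) == sample
               for modality, sample in provided.items())
```

Real biometric matching would score similarity against a threshold rather than compare values for equality; the structure of the decision is what matters here.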
It should be noted that the second control data in this step is, in effect, the vehicle personalization information tailored to the actual needs of the first user.
Step 408: the server sends the second control data to the first vehicle.
Step 409: the first vehicle configures the driving function components of the first vehicle according to the second control data.
As shown in fig. 5, based on the foregoing vehicle configuration method, the present application provides an implementation structure of a first vehicle. The first vehicle 500 may include: a user identification unit 510, a central control unit 520, a user status detection unit 530, and a functional domain control unit 540, which may be configured to execute the methods provided in the foregoing method embodiments to implement the configuration of the driving function components in the first vehicle.
The user identification unit 510 is configured to perform identity authentication on a user, and send an identity authentication result to the central control unit 520.
The central control unit 520 is configured to receive an identity authentication result, and send the identity information of the first user to the server when the identity authentication result is successful.
Specifically, the user identification unit 510 is configured to perform identity authentication on the first user, and the central control unit 520 is configured to send the identity information of the first user to the server after the user identification unit successfully authenticates the first user.
the user state detection unit 530 is used for acquiring the real-time requirement information of the first user and sending the real-time requirement information to the central control unit; specifically, the user status detection unit 530 may collect real-time requirement information of the user through a camera, and/or a microphone, and/or a finger vein recognition unit, etc.
The central control unit 520 is also configured to: send the real-time demand information to the server; receive the second control data sent by the server; and send control instructions to the functional domain control unit according to the second control data.
and a functional domain control unit 540, configured to receive a control instruction of the central control unit, and configure a corresponding driving function component according to the received control instruction.
Alternatively, the user identification unit 510 may include:
the first wireless communication module is used for detecting user equipment of a first user, detecting that the user equipment moves towards the direction of a first vehicle, and acquiring first information of the first user from the user equipment;
and the first authentication module is used for performing identity authentication on the first information acquired by the first wireless communication module.
Optionally, the user identification unit comprises: the first wireless MCU chip is internally provided with a first wireless communication module;
The first wireless MCU chip is configured to: detect the user equipment of the first user through the first wireless communication module, detect that the user equipment is moving in the direction of the first vehicle, acquire the first information of the first user from the user equipment through the first wireless communication module, and perform identity authentication on the first information acquired by the first wireless communication module.
Optionally, the first wireless communication module may be a BLE module, or, a UWB module.
Alternatively, the user identification unit 510 may include:
the second wireless communication module is used for detecting the user equipment of the first user and acquiring first information of the first user from the user equipment;
and the second authentication module is used for performing identity authentication on the first information acquired by the second wireless communication module.
Optionally, the user identification unit comprises: the second wireless MCU chip is internally provided with a second wireless communication module;
the second wireless MCU chip is used for: the user equipment of the first user is detected through the second wireless communication module, the first information of the first user is acquired from the user equipment through the second wireless communication module, and identity authentication is carried out on the first information acquired by the second wireless communication module.
Alternatively, the second wireless communication module may be an NFC module, or a BLE module, or a UWB module.
Optionally, the central control unit may be further configured to: receive an indication message sent by the server when the server detects that the first user is in a tired state or an angry state, and display prompt information corresponding to the tired state or the angry state.
Alternatively, the functional domain control unit 540 may be implemented by an MCU chip: either a general MCU chip (which here generally means an MCU chip without a wireless transmission function) or a wireless MCU chip; the embodiments of the present application are not limited in this respect.
Optionally, the functional domain control unit 540 may also detect whether a user is seated in the main driving seat of the vehicle according to the detection data of the driving function components; accordingly, the user status detection unit 530 may acquire the real-time demand information of the first user after the functional domain control unit 540 detects that a user is seated in the main driving seat, and send it to the central control unit, which then sends it to the server.
Optionally, a mobile networking module may be disposed in the central control unit 520, the mobile networking module may support LTE, 5G, and other communication technologies, and the central control unit 520 may communicate with the server through the mobile networking module.
Fig. 6 is a schematic structural diagram of a server provided in the embodiment of the present application, and as shown in fig. 6, a server 600 may include:
the first obtaining unit 610 is configured to obtain identity information of a first user, and obtain first control data of the first user according to the identity information; the first operation data is used for recording configuration information of a driving function component in the first vehicle;
a second obtaining unit 620, configured to obtain the real-time demand information of the first user; the real-time demand information is used to describe real-time information about factors influencing the first user's driving of the first vehicle;
an adjusting unit 630, configured to adjust the first control data according to the real-time demand information to obtain second control data; the second control data is used to record the adjusted configuration information of the driving function components in the first vehicle;
a sending unit 640, configured to send the second control data to the first vehicle, where the second control data is used to instruct the first vehicle to configure its driving function components accordingly.
Optionally, the first obtaining unit 610 may specifically be configured to: acquire the user portrait of the first user according to the identity information, the user portrait being constructed from the historical vehicle usage data of the first user; and determine the first control data from the user portrait.
Optionally, the tags of the user portrait include: a first identifier for identifying a first driving function component of the vehicle; the first obtaining unit 610 may specifically be configured to: search the tags of the user portrait for a tag including the first identifier, and take the parameter value corresponding to the found tag as the parameter value corresponding to the first identifier in the first control data.
Optionally, the tags of the user portrait include: a first parameter associated with configuring a second driving function component; the first obtaining unit 610 may specifically be configured to: calculate a parameter value corresponding to a second identifier in the first control data according to the parameter value of the first parameter, where the second identifier is used for identifying the second driving function component.
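A minimal sketch of the two cases above — direct tag lookup for a first identifier, and calculation of a second identifier's parameter value from a first parameter. The tag names and the mirror-angle formula are invented for illustration and do not appear in this application:

```python
# Hypothetical derivation of first control data from user-portrait tags.

def control_data_from_portrait(portrait_tags):
    control = {}
    # Case 1: a tag directly identifies a driving function component (first
    # identifier); its parameter value is copied into the control data as-is.
    if "seat_position" in portrait_tags:
        control["seat_position"] = portrait_tags["seat_position"]
    # Case 2: a tag holds a first parameter associated with a component (second
    # identifier); the component's parameter value is calculated from it.
    if "driver_height_cm" in portrait_tags:
        control["mirror_angle_deg"] = round(
            10 + 0.05 * portrait_tags["driver_height_cm"], 1)
    return control
```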
Optionally, the real-time requirement information includes: picture information and/or video information containing a facial image of the first user, and/or sound information of the first user; the adjusting unit 630 may specifically be configured to:
determine the emotion type of the first user according to the picture information and/or the video information and/or the sound information; and
adjust the first control data according to the emotion type of the first user.
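As a sketch of such an adjustment, the rules below map an already-determined emotion type onto control-data fields. The emotion categories, field names, and values are assumptions chosen for illustration, not values specified in this application:

```python
# Hypothetical emotion-based adjustment of the first control data.

def adjust_for_emotion(first_control_data, emotion_type):
    second = dict(first_control_data)
    if emotion_type == "angry":
        second["music"] = "calm_playlist"
        # cap the assisted-driving speed setting while the driver is angry
        second["speed_limit_kmh"] = min(second.get("speed_limit_kmh", 120), 100)
    elif emotion_type == "fatigued":
        # cooler cabin and seat ventilation to counter drowsiness
        second["ac_temperature_c"] = max(second.get("ac_temperature_c", 24) - 2, 18)
        second["seat_ventilation"] = "on"
    return second
```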
Optionally, the real-time requirement information includes: finger vein information of the first user; the adjusting unit 630 may specifically be configured to:
determine the health type of the first user according to the finger vein information of the first user; and
adjust the first control data according to the health type of the first user.
Optionally, the real-time requirement information includes: weather information, and/or light intensity information outside the first vehicle; the adjusting unit 630 may specifically be configured to:
adjust the first control data according to the weather information and/or the light intensity information.
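The weather- and light-based adjustment can be sketched as follows; the lux threshold and component names are illustrative assumptions only:

```python
# Hypothetical environment-based adjustment of the first control data.

def adjust_for_environment(first_control_data, weather=None, light_lux=None):
    second = dict(first_control_data)
    if weather in ("rain", "snow"):
        second["wipers"] = "on"
        # fog lights only in the heavier-obstruction case
        second["fog_lights"] = "on" if weather == "snow" else second.get("fog_lights", "off")
    if light_lux is not None and light_lux < 400:   # dusk, tunnel, heavy overcast
        second["headlights"] = "on"
        second["dashboard_brightness"] = "night"
    return second
```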
Optionally, the server 600 may further include:
a display unit, configured to: determine that the first user is in an angry state according to the emotion type of the first user, and control the first vehicle to display first prompt information; and/or determine that the first user is in a fatigued state according to the emotion type of the first user, and control the first vehicle to display second prompt information.
It should be understood that the above division of the modules in the vehicle and the server is merely a division of logical functions; in actual implementation, the modules may be wholly or partially integrated into one physical entity, or may be physically separate. These modules may all be implemented in the form of software invoked by a processing element, or all in the form of hardware, or some in the form of software invoked by a processing element and some in the form of hardware. For example, the configuration module may be a separately disposed processing element, or may be integrated into a chip of the electronic device; the other modules are implemented similarly. In addition, all or some of the modules may be integrated together or implemented independently. In implementation, each step of the above method, or each of the above modules, may be completed by an integrated logic circuit of hardware in a processor element or by an instruction in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as: one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), one or more Field Programmable Gate Arrays (FPGAs), etc. For another example, these modules may be integrated together and implemented in the form of a System-on-a-Chip (SoC).
As shown in fig. 7, the present application further provides a vehicle configuration system, the system comprising: a server 710 and a first vehicle 720; the first vehicle 720 and the server 710 may be used to implement the vehicle configuration method shown in fig. 2 to 4. Alternatively, the server may be implemented by the structure shown in fig. 6 described above, and the first vehicle may be implemented by the structure shown in fig. 5 described above.
Optionally, as shown in fig. 8, the vehicle configuration system of the present application may further include: the user equipment 730. The user device 730 is configured to cooperate with the first vehicle 720 to implement the vehicle configuration method shown in fig. 2-4.
The application also provides a server, which includes a storage medium and a central processing unit, the storage medium may be a non-volatile storage medium, a computer executable program is stored in the storage medium, and the central processing unit is connected with the non-volatile storage medium and executes the computer executable program to implement the method provided by the embodiment shown in fig. 2 to fig. 4 of the application.
The application also provides a server, which comprises a memory and a processor, wherein the processor is used for realizing the method provided by the embodiment shown in fig. 2 to fig. 4 of the application.
The application also provides a first vehicle, which comprises a storage medium and a central processing unit, wherein the storage medium can be a nonvolatile storage medium, a computer executable program is stored in the storage medium, and the central processing unit is connected with the nonvolatile storage medium and executes the computer executable program to realize the method provided by the embodiment shown in fig. 2 to 4 of the application.
An embodiment of the present application further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is enabled to execute the method provided by the embodiment shown in fig. 2 to 4 of the present application.
Embodiments of the present application further provide a computer program product, which includes a computer program, and when the computer program runs on a computer, the computer executes the method provided in the embodiments shown in fig. 2 to 4 of the present application.
In the embodiments of the present application, "at least one" means one or more, and "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of the following" and similar expressions refer to any combination of these items, including any combination of singular or plural items. For example, at least one of a, b, and c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
Those of ordinary skill in the art will appreciate that the various units and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or a combination of the two. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, any function, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present application, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present disclosure, and all the changes or substitutions should be covered by the protection scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (23)

1. A vehicle configuration system, characterized in that the system comprises: a server and a first vehicle; wherein,
the first vehicle is configured to: carrying out identity authentication on a first user, and after the identity authentication is successful, sending the identity information of the first user to the server; acquiring real-time demand information of the first user, and sending the real-time demand information to the server; the real-time demand information is used for describing real-time information of preset influence factors;
the server is configured to: receive identity information of the first user, and acquire first control data of the first user according to the identity information of the first user; the first control data is used for recording configuration information of driving function components in the first vehicle; receive real-time demand information of the first user; adjust the first control data according to the real-time demand information to obtain second control data; the second control data is used for recording the adjusted configuration information of a driving function component in the first vehicle; and send the second control data to the first vehicle;
the first vehicle is further configured to: receive the second control data sent by the server, and configure a driving function component of the first vehicle according to the second control data.
2. The system of claim 1, wherein the first vehicle comprises:
the user identification unit is used for carrying out identity authentication on the first user;
the central control unit is used for sending the identity information of the first user to the server after the user identification unit successfully authenticates the identity of the first user;
the user state detection unit is used for acquiring the real-time demand information of the first user after the user identification unit successfully authenticates the identity of the first user, and sending the real-time demand information to the central control unit;
the central control unit is further configured to: sending the real-time demand information to the server; receiving second control data sent by the server; sending a control instruction to a functional domain control unit according to the second control data;
and the function domain control unit is used for configuring corresponding driving function components according to the received control instruction.
3. The system of claim 2, wherein the subscriber identification unit comprises:
the first wireless communication module is used for detecting user equipment of the first user and, upon detecting that the user equipment moves towards the first vehicle, acquiring first information of the first user from the user equipment;
and the first authentication module is used for performing identity authentication on the first information acquired by the first wireless communication module.
4. The system of claim 3, wherein the subscriber identification unit comprises: the first wireless MCU chip is provided with the first wireless communication module;
the first wireless MCU chip is used for: detecting the user equipment of the first user through the first wireless communication module, acquiring first information of the first user from the user equipment through the first wireless communication module upon detecting that the user equipment moves towards the first vehicle, and performing identity authentication on the first information acquired by the first wireless communication module.
5. The system according to claim 3 or 4, wherein the first wireless communication module is a BLE module or a UWB module.
6. The system of claim 2, wherein the subscriber identification unit comprises:
the second wireless communication module is used for detecting the user equipment of the first user and acquiring first information of the first user from the user equipment;
and the second authentication module is used for performing identity authentication on the first information acquired by the second wireless communication module.
7. The system of claim 6, wherein the subscriber identification unit comprises: the second wireless MCU chip is provided with the second wireless communication module;
the second wireless MCU chip is used for: detecting user equipment of the first user through the second wireless communication module, acquiring first information of the first user from the user equipment through the second wireless communication module, and performing identity authentication on the first information acquired by the second wireless communication module.
8. The system according to claim 6 or 7, wherein the second wireless communication module is an NFC module, or a BLE module, or a UWB module.
9. The system of any one of claims 1 to 8, wherein the real-time demand information comprises: picture information and/or video information comprising a facial image of the first user, and/or sound information of the first user, and/or finger vein information of the first user, and/or weather information, and/or light intensity information outside the first vehicle.
10. The system according to any one of claims 1 to 9, wherein a virtual server is provided in the server, and the virtual server is configured to: receive identity information of the first user, and acquire first control data of the first user according to the identity information of the first user; the first control data is used for recording configuration information of driving function components in the first vehicle; receive real-time demand information of the first user; adjust the first control data according to the real-time demand information to obtain second control data; and send the second control data to the first vehicle.
11. The system according to any one of claims 1 to 10, wherein the functional domain control unit comprises an MCU chip.
12. A vehicle configuration method, characterized by comprising:
acquiring identity information of a first user, and acquiring first control data of the first user according to the identity information; the first control data is used for recording configuration information of a driving function component in a first vehicle;
acquiring real-time demand information of the first user; the real-time demand information is used for describing real-time information of preset influence factors;
adjusting the first control data according to the real-time demand information to obtain second control data; the second control data is used for recording the adjusted configuration information of a driving function component in the first vehicle;
sending the second control data to the first vehicle, wherein the sent second control data is used to instruct the first vehicle to configure a driving function component according to the second control data.
13. The method of claim 12, wherein obtaining the first manipulation data of the first user according to the identity information comprises:
acquiring a user portrait of the first user according to the identity information; the user representation is constructed from historical vehicle usage data of the first user;
determining the first manipulation data from the user representation.
14. The method of claim 13, wherein the tags of the user portrait comprise: a first identifier for identifying a first driving function component of a vehicle; and the determining the first control data from the user portrait comprises:
searching the tags of the user portrait for a tag comprising the first identifier, and taking the parameter value corresponding to the found tag as the parameter value corresponding to the first identifier in the first control data.
15. The method of claim 13, wherein the tags of the user portrait comprise: a first parameter associated with configuring a second driving function component; and the determining the first control data from the user portrait comprises:
calculating a parameter value corresponding to a second identifier in the first control data according to the parameter value of the first parameter, wherein the second identifier is used for identifying the second driving function component.
16. The method according to any one of claims 12 to 15, wherein the real-time demand information comprises: picture information and/or video information containing a facial image of the first user, and/or sound information of the first user; the adjusting the first control data according to the real-time requirement information includes:
determining the emotion type of the first user according to the picture information and/or the video information and/or the sound information;
adjusting the first control data according to the emotion type of the first user.
17. The method according to any one of claims 12 to 15, wherein the real-time demand information comprises: finger vein information of the first user; the adjusting the first control data according to the real-time requirement information includes:
determining the health type of the first user according to the finger vein information of the first user;
adjusting the first control data according to the health type of the first user.
18. The method according to any one of claims 12 to 15, wherein the real-time demand information comprises: weather information, and/or light intensity information outside the first vehicle; the adjusting the first control data according to the real-time requirement information includes:
adjusting the first control data according to the weather information and/or the light intensity information.
19. The method of claim 16, further comprising:
determining that the first user is in an angry state according to the emotion type of the first user, and controlling the first vehicle to display first prompt information; and/or,
determining that the first user is in a fatigued state according to the emotion type of the first user, and controlling the first vehicle to display second prompt information.
20. A vehicle configuration method, characterized by comprising:
carrying out identity authentication on a first user, and sending identity information of the first user to a server after the identity authentication is successful;
acquiring real-time demand information of the first user, and sending the real-time demand information to the server; the real-time demand information is used for describing real-time information of preset influence factors;
receiving second control data sent by the server; the second control data is obtained by the server by adjusting, according to the real-time demand information, first control data corresponding to the identity information; the first control data is used for recording configuration information of a driving function component in a first vehicle; and the second control data is used for recording the adjusted configuration information of the driving function component in the first vehicle;
configuring a driving function component of the first vehicle according to the second control data.
21. The method of claim 20, wherein authenticating the first user comprises:
detecting user equipment of the first user based on a Bluetooth mode or a UWB mode, and upon detecting that the user equipment moves towards the first vehicle, acquiring first information of the first user from the user equipment;
and performing identity authentication on the first information.
22. The method of claim 20, wherein authenticating the first user comprises:
detecting user equipment of the first user based on an NFC mode, a Bluetooth mode or a UWB mode, and acquiring first information of the first user from the user equipment;
and performing identity authentication on the first information.
23. The method according to any one of claims 20 to 22, wherein obtaining real-time demand information of the first user comprises:
acquiring picture information and/or video information containing the facial image of the first user; and/or,
acquiring sound information of the first user; and/or,
acquiring finger vein information of the first user; and/or,
acquiring weather information; and/or,
acquiring light intensity information outside the first vehicle.
CN202111096734.0A 2021-09-17 2021-09-17 Vehicle configuration method and system Active CN113824714B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111096734.0A CN113824714B (en) 2021-09-17 2021-09-17 Vehicle configuration method and system
CN202111394591.1A CN114124528B (en) 2021-09-17 2021-09-17 Wireless MCU and vehicle configuration system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111096734.0A CN113824714B (en) 2021-09-17 2021-09-17 Vehicle configuration method and system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202111394591.1A Division CN114124528B (en) 2021-09-17 2021-09-17 Wireless MCU and vehicle configuration system

Publications (2)

Publication Number Publication Date
CN113824714A true CN113824714A (en) 2021-12-21
CN113824714B CN113824714B (en) 2022-11-25

Family

ID=78914905

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111096734.0A Active CN113824714B (en) 2021-09-17 2021-09-17 Vehicle configuration method and system
CN202111394591.1A Active CN114124528B (en) 2021-09-17 2021-09-17 Wireless MCU and vehicle configuration system

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202111394591.1A Active CN114124528B (en) 2021-09-17 2021-09-17 Wireless MCU and vehicle configuration system

Country Status (1)

Country Link
CN (2) CN113824714B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115273865A (en) * 2022-07-26 2022-11-01 中国第一汽车股份有限公司 Intelligent voice interaction method, device, equipment and storage medium

Citations (7)

Publication number Priority date Publication date Assignee Title
CN105235615A (en) * 2015-10-27 2016-01-13 浙江吉利控股集团有限公司 Vehicle control system based on face recognition
CN107963080A (en) * 2017-12-05 2018-04-27 李国强 A kind of vehicle Automatic adjustment method and system
CN108437993A (en) * 2018-04-28 2018-08-24 广东轻工职业技术学院 A kind of road anger vehicle drive automatically control system and method
KR102005040B1 (en) * 2019-02-28 2019-07-29 송혜선 Vehicle quick starting Control System by Using Face Perception Data and Smart Terminal and Method thereof
CN110356363A (en) * 2018-04-09 2019-10-22 杭州海康汽车技术有限公司 A kind of driver identity authentication method, device, system and server
CN110435573A (en) * 2019-06-28 2019-11-12 北京汽车集团有限公司 Control method for vehicle and device
CN110673503A (en) * 2019-10-31 2020-01-10 重庆长安汽车股份有限公司 Intelligent household equipment control method and device, cloud server and computer readable storage medium

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
KR101745443B1 (en) * 2015-11-27 2017-06-12 주식회사 우현디지털 Authentication system for driver of vehicle
CN107316363A (en) * 2017-07-05 2017-11-03 奇瑞汽车股份有限公司 A kind of automobile intelligent interacted system based on biological identification technology
CN108819900A (en) * 2018-06-04 2018-11-16 上海商汤智能科技有限公司 Control method for vehicle and system, vehicle intelligent system, electronic equipment, medium
CN109572705B (en) * 2018-12-11 2020-07-28 武汉格罗夫氢能汽车有限公司 Driver emotion management method and device and storage device
CN111619304A (en) * 2019-02-28 2020-09-04 上海博泰悦臻电子设备制造有限公司 Control method and system of vehicle air conditioner and vehicle
CN112116735A (en) * 2019-06-20 2020-12-22 华为技术有限公司 Intelligent lock unlocking method and related equipment

Patent Citations (7)

Publication number Priority date Publication date Assignee Title
CN105235615A (en) * 2015-10-27 2016-01-13 浙江吉利控股集团有限公司 Vehicle control system based on face recognition
CN107963080A (en) * 2017-12-05 2018-04-27 李国强 A kind of vehicle Automatic adjustment method and system
CN110356363A (en) * 2018-04-09 2019-10-22 杭州海康汽车技术有限公司 A kind of driver identity authentication method, device, system and server
CN108437993A (en) * 2018-04-28 2018-08-24 广东轻工职业技术学院 A kind of road anger vehicle drive automatically control system and method
KR102005040B1 (en) * 2019-02-28 2019-07-29 송혜선 Vehicle quick starting Control System by Using Face Perception Data and Smart Terminal and Method thereof
CN110435573A (en) * 2019-06-28 2019-11-12 北京汽车集团有限公司 Control method for vehicle and device
CN110673503A (en) * 2019-10-31 2020-01-10 重庆长安汽车股份有限公司 Intelligent household equipment control method and device, cloud server and computer readable storage medium

Cited By (1)

Publication number Priority date Publication date Assignee Title
CN115273865A (en) * 2022-07-26 2022-11-01 中国第一汽车股份有限公司 Intelligent voice interaction method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN114124528B (en) 2024-01-23
CN114124528A (en) 2022-03-01
CN113824714B (en) 2022-11-25

Similar Documents

Publication Publication Date Title
US11465631B2 (en) Personalization system and method for a vehicle based on spatial locations of occupants' body portions
CN112622917B (en) System and method for authenticating an occupant of a vehicle
US10040423B2 (en) Vehicle with wearable for identifying one or more vehicle occupants
US10328824B2 (en) Vehicle operations based on biometric fingerprint analysis
US9928670B2 (en) Method and system for access control monitoring
RU2699168C2 (en) Object detection for vehicles
US7439849B2 (en) User welcoming system for an automobile
WO2018000999A1 (en) On-board system and control method for vehicle facility
US20220379846A1 (en) Multi-modal context based vehicle management
US10576934B2 (en) Decentralized cloud-based authentication for autonomous vehicles
CN106600762A (en) Method and system for controlling vehicle door
CN113824714B (en) Vehicle configuration method and system
CN108657186A (en) Intelligent driving cabin exchange method and device
US11667265B2 (en) Activating a security mode for a vehicle based on driver identification
US20220410841A1 (en) Unlocking vehicle doors with facial recognition
KR20210020367A (en) Apparatus and Method for Authenticating Biometric Information using Certification Score
CN111902864A (en) Method for operating a sound output device of a motor vehicle, speech analysis and control device, motor vehicle and server device outside the motor vehicle
CN109606314B (en) Automobile control method and automobile
KR102642242B1 (en) Vehicle and controlling method of vehicle
US11447101B2 (en) Point-of-interest-based anti-vehicle theft systems and processes for using the same
US11465587B2 (en) Vehicular key fob device
US20230154454A1 (en) Methods and apparatus for training a classification device
CN109606307B (en) Automobile control device and automobile
CN109501716B (en) Automobile control device and automobile
CN116142111A (en) Method, device, equipment and storage medium for controlling contextual model in vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant