CN112525214B - Interaction method and device of map card, vehicle and readable medium - Google Patents


Info

Publication number
CN112525214B
Authority
CN
China
Prior art keywords
map card
card object
map
vehicle
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011332251.1A
Other languages
Chinese (zh)
Other versions
CN112525214A (en)
Inventor
邓帅
简驾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Xiaopeng Motors Technology Co Ltd
Guangzhou Chengxingzhidong Automotive Technology Co., Ltd
Original Assignee
Guangzhou Xiaopeng Motors Technology Co Ltd
Guangzhou Chengxingzhidong Automotive Technology Co., Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xiaopeng Motors Technology Co Ltd, Guangzhou Chengxingzhidong Automotive Technology Co., Ltd
Priority to CN202011332251.1A
Publication of CN112525214A
Application granted
Publication of CN112525214B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3641 Personalized guidance, e.g. limited guidance on previously travelled routes
    • G01C21/3647 Guidance involving output of stored or live camera images or video streams
    • G01C21/3652 Guidance using non-audiovisual output, e.g. tactile, haptic or electric stimuli
    • G01C21/3667 Display of a road map
    • G01C21/367 Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
    • G01C21/3691 Retrieval, searching and output of information related to real-time traffic, weather, or environmental conditions
    • G01C21/3694 Output thereof on a road map
    • G01C21/3697 Output of additional, non-guidance related information, e.g. low fuel level

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Ecology (AREA)
  • Environmental & Geological Engineering (AREA)
  • Environmental Sciences (AREA)
  • Navigation (AREA)

Abstract

Embodiments of the invention provide a map card interaction method and device, a vehicle, and a readable medium, applied to an intelligent cabin of a vehicle. The intelligent cabin has a vehicle-mounted display component that displays a virtual map interface. The method comprises: acquiring scene information and state information of the vehicle; extracting a candidate map card object adapted to the scene information and the state information of the vehicle, the candidate map card object being provided with an interaction control; displaying the candidate map card object in the virtual map interface; and, in response to an operation on the interaction control, triggering the corresponding candidate map card object. A map card related to the user's current scene can thus be displayed according to that specific scene, and the card carries an application function corresponding to the scene that the user can invoke, which improves the relevance of the map card to the user's scene and gives a better user experience.

Description

Interaction method and device of map card, vehicle and readable medium
Technical Field
The present invention relates to the field of information interaction, and in particular, to a map card interaction method, a map card pushing device, a vehicle, and a readable medium.
Background
In recent years, the main driving force behind the rapid development of the automobile industry has gradually shifted from product and technology push on the supply side to continuously growing customer demand, and the development of vehicle-mounted operating systems has in turn driven the development of the electronic cockpit. The electronic cockpit is becoming increasingly integrated with the in-vehicle human-machine interaction system. In current vehicle-mounted systems on the market, map navigation is without doubt one of the most important and most frequently used functions, so in most vehicle-mounted system designs the entry point for launching the map is placed in a prominent, shallow-level area that the driver can reach most easily. However, once the map is triggered, the map card that is displayed is usually an unchanging icon, an icon with text, or a card showing fixed information, and serves only as an entry to an application function. It is not adapted to the specific scene, its relevance to the user's current scene is poor, it is not user-friendly, and the user experience is poor.
Disclosure of Invention
In view of the foregoing, embodiments of the present invention are directed to providing a map card interaction method and a corresponding map card interaction device that overcome or at least partially solve the foregoing problems.
In order to solve the above problems, an embodiment of the present invention discloses an interaction method of a map card, which is applied to an intelligent cabin of a vehicle, the intelligent cabin having a vehicle-mounted display part, the vehicle-mounted display part displaying a virtual map interface, the method comprising:
Acquiring scene information and state information of the vehicle;
Extracting an alternative map card object adapted to the scene information and the state information of the vehicle; the alternative map card object is provided with an interaction control;
displaying the candidate map card object in the virtual map interface;
And responding to the operation of the interaction control, and triggering corresponding alternative map card objects.
Optionally, the candidate map card objects include a general map card object and a special map card object, and the step of extracting the candidate map card object adapted to the scene information and the state information of the vehicle includes:
judging whether to select the special map card object based on the state information;
if yes, extracting a target special map card object from the special map card object;
and if not, extracting a target normal map card object from the normal map card objects based on the scene information.
Optionally, the state information of the vehicle includes navigation state information and shielding information, the special map card object includes a navigation card object and a return map card object, and the step of extracting the target special map card object from the special map card object includes:
Judging whether the vehicle is in a navigation state or not based on the navigation state information;
if yes, determining the target special map card object as a navigation card object;
If not, judging whether the virtual map interface is blocked by a preset control window or not based on the blocking information;
if the virtual map interface is blocked by a preset control window, determining that the target special map card object is a return map card object;
And extracting the target special map card object.
Optionally, the scene information includes a current time and a current vehicle position, and the step of extracting a target normal map card object from the normal map card objects based on the scene information includes:
judging whether the current time belongs to preset reminding time or not;
if yes, judging whether the current vehicle position belongs to a preset reminding position or not;
If the current vehicle position belongs to a preset reminding position, determining a normal map card object corresponding to the reminding time and the reminding position as a target normal map card object;
extracting the target normal map card object.
Optionally, the candidate map card object is a navigation card object, and the step of triggering the corresponding candidate map card object in response to the operation on the interaction control includes:
And responding to the operation of the interaction control, and exiting the navigation state.
Optionally, the method further comprises:
Receiving user-defined time and user-defined position input by a user;
And generating a first alternative map card object by adopting the custom time and the custom position.
Optionally, the intelligent cabin is connected with a mobile terminal, and the method further comprises:
receiving schedule information sent by the mobile terminal; the schedule information comprises schedule time and reminding item text;
And generating a second alternative map card object by adopting the schedule information.
The embodiment of the invention also discloses an interaction device of the map card, which is applied to an intelligent cabin of a vehicle, wherein the intelligent cabin is provided with a vehicle-mounted display part, the vehicle-mounted display part displays a virtual map interface, and the device comprises:
the acquisition module is used for acquiring scene information and state information of the vehicle;
An alternative map card object extraction module for extracting an alternative map card object adapted to the scene information and the state information of the vehicle; the alternative map card object is provided with an interaction control;
the map card candidate object display module is used for displaying the map card candidate object in the virtual map interface;
And the alternative map card object triggering module is used for responding to the operation of the interaction control and triggering the corresponding alternative map card object.
The embodiment of the invention also discloses a vehicle, which comprises:
One or more processors; and
One or more machine readable media having instructions stored thereon, which when executed by the one or more processors, cause the vehicle to perform one or more methods as described above.
The embodiment of the invention also discloses one or more machine-readable media having instructions stored thereon which, when executed by one or more processors, cause the processors to perform one or more of the methods described above.
The embodiment of the invention has the following advantages:
In the embodiment of the invention, scene information and state information of the vehicle are acquired; a candidate map card object adapted to the scene information and the state information is extracted, the candidate map card object being provided with an interaction control; the candidate map card object is displayed in the virtual map interface; and the corresponding candidate map card object is triggered in response to an operation on the interaction control. A map card related to the user's current scene can therefore be displayed according to that specific scene, and the card also carries an application function corresponding to the scene which the user can choose to use. This greatly improves the relevance of the map card to the user's scene and gives a better user experience.
Drawings
FIG. 1 is a flow chart of steps of an interactive method embodiment of a map card of the present invention;
FIG. 2 is a flow chart of steps of another embodiment of a map card interaction method of the present invention;
FIG. 3 is a schematic diagram of a virtual map interface displaying a candidate map card object in accordance with the present invention;
FIG. 4 is a schematic diagram of a virtual map interface displaying a candidate map card object in accordance with the present invention;
FIG. 5 is a schematic diagram of a virtual map interface displaying a candidate map card object in accordance with the present invention;
FIG. 6 is a schematic diagram of a virtual map interface displaying a candidate map card object in accordance with the present invention;
FIG. 7 is a schematic diagram of a virtual map interface displaying a candidate map card object in accordance with the present invention;
FIG. 8 is a schematic diagram of a virtual map interface displaying a candidate map card object in accordance with the present invention;
FIG. 9 is a block diagram of an interaction device embodiment of a map card of the present invention.
Detailed Description
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description. It will be apparent that the described embodiments are some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In recent years, the main driving force behind the development of the vehicle industry has gradually shifted from product and technology push on the supply side to continuously growing user demand, and users' perception of the vehicle is gradually changing from a simple means of transport to a living space. As the space the user experiences directly, the cockpit becomes the experience core of human-vehicle interaction as its degree of intelligence increases, and a more intelligent cockpit can provide the user with a better driving and riding experience.
The intelligent cabin electronic system can refer to a whole set of systems consisting of a central control system, a full liquid-crystal instrument cluster, a head-up display, an entertainment system, an intelligent sound system, an Internet-of-Vehicles module, a streaming-media rearview mirror, a telematics system and the like.
The intelligent cabin system can be a system which takes the cockpit area controller as a center, realizes the functions of the intelligent cabin electronic system on a unified software and hardware platform, and is integrated with intelligent interaction, intelligent scene and personalized service. The intelligent cabin system can form the basis of human-vehicle interaction and interconnection of vehicles and the outside.
The usage scenario of the intelligent cabin system may generally cover the entire scenario of the user's use of the vehicle. Specifically, it can include time scenes before the user uses the vehicle, while the user is using the vehicle and after the user has used the vehicle, and can also include space scenes covering the driver, the front passenger, rear passengers, and related people or objects outside the vehicle.
Compared with the past instruction type interaction, the human-computer interaction in the intelligent cabin can be combined with the use scene of the vehicle and the user, and the intelligent effect which meets the requirements of the user better is achieved based on the basic technologies such as image recognition, voice recognition and environment perception.
In the embodiment of the invention, scene information and state information of the vehicle are acquired; a candidate map card object adapted to the scene information and the state information is extracted, the candidate map card object being provided with an interaction control; the candidate map card object is displayed in the virtual map interface; and the corresponding candidate map card object is triggered in response to an operation on the interaction control. A map card related to the user's current scene can therefore be displayed according to that specific scene, and the card also carries an application function corresponding to the scene which the user can choose to use. This greatly improves the relevance of the map card to the user's scene and gives a better user experience.
Referring to fig. 1, there is shown a flow chart of steps of an interactive method embodiment of a map card of the present invention, applied to an intelligent cabin of a vehicle, the intelligent cabin having an on-board display unit displaying a virtual map interface, and specifically may include the steps of:
step 101, acquiring scene information and state information of the vehicle;
for a user driving a vehicle, the navigation function of the vehicle is one of the most commonly used functions, and the basis of the navigation function is a virtual map.
After a user enters a vehicle and starts the intelligent cabin, the intelligent cabin can acquire the state information of the current vehicle and the scene information related to the scene where the user is currently located, and the intelligent cabin can judge the scene where the user is currently located through the scene information and the state information of the vehicle.
Step 102, extracting an alternative map card object adapted to the scene information and the state information of the vehicle; the alternative map card object is provided with an interaction control;
A map card can be an information window that is displayed on the virtual map interface and provides specific information and application functions for the user. Different map cards are configured with different guide text and application functions. One or more interaction controls are arranged on a map card, each interaction control can correspond to one application function, and the user can trigger an interaction control, for example by tapping it, to quickly start the corresponding application function. Therefore, after the scene information and the state information of the vehicle are acquired, a candidate map card adapted to the scene information and the state information of the vehicle can be selected from the candidate map cards stored in the intelligent cabin.
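A minimal data-model sketch of the entities described above (map card, interaction control, scene information and vehicle state information). All type and field names are illustrative assumptions, not identifiers from the patent or any real in-vehicle SDK.

```kotlin
import java.time.LocalDateTime

data class GeoPoint(val lat: Double, val lon: Double)

data class InteractionControl(
    val label: String,          // guide text shown on the control
    val onTrigger: () -> Unit   // application function started when the control is tapped
)

// A map card: an information window shown on the virtual map interface that
// carries guide text plus one or more interaction controls.
data class MapCardObject(
    val guideText: String,
    val controls: List<InteractionControl>,
    val isSpecial: Boolean = false   // special cards (navigation / return) vs normal cards
)

// Scene information and vehicle state information as described in steps 101 and 102.
data class SceneInfo(
    val currentTime: LocalDateTime,
    val currentPosition: GeoPoint
)

data class VehicleState(
    val isNavigating: Boolean,        // navigation state information
    val mapOccludedByWindow: Boolean  // shielding (occlusion) information
)
```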
Step 103, displaying the candidate map card object in the virtual map interface;
after the intelligent cabin extracts the candidate map card objects, the candidate map card objects are displayed in the virtual map interface, and a user can view the candidate map card in the virtual map interface.
And step 104, responding to the operation of the interaction control, and triggering a corresponding alternative map card object.
When the user views the alternative map card, the application function set by the alternative map card can be triggered through the triggering operation of the interaction control. For example, the application function corresponding to the interaction control in the alternative map card is a function of searching the surrounding parking lot, after clicking the interaction control, the intelligent cabin starts searching the surrounding parking lot, and the searching result is displayed for the user in the virtual map interface for the user to select.
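A hedged sketch of step 104 under the data model above: tapping an interaction control simply runs the application function bound to it, illustrated with the parking-lot example. searchNearbyParking and showResultsOnMap are hypothetical placeholders standing in for whatever search and rendering services the cabin actually provides.

```kotlin
fun searchNearbyParking(pos: GeoPoint): List<String> =
    emptyList()   // placeholder: query parking lots around the given position

fun showResultsOnMap(results: List<String>) {
    // placeholder: render the search results in the virtual map interface
}

fun parkingSearchCard(vehiclePos: GeoPoint): MapCardObject =
    MapCardObject(
        guideText = "Where would you like to go?",
        controls = listOf(
            InteractionControl("Find nearby parking") {
                showResultsOnMap(searchNearbyParking(vehiclePos))
            }
        )
    )

// Step 104: a tap on a control runs the application function bound to it.
fun onControlTapped(control: InteractionControl) = control.onTrigger()
```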
In the embodiment of the invention, scene information and state information of the vehicle are acquired; a candidate map card object adapted to the scene information and the state information is extracted, the candidate map card object being provided with an interaction control; the candidate map card object is displayed in the virtual map interface; and the corresponding candidate map card object is triggered in response to an operation on the interaction control. A map card related to the user's current scene can therefore be displayed according to that specific scene, and the card also carries an application function corresponding to the scene which the user can choose to use. This greatly improves the relevance of the map card to the user's scene and gives a better user experience.
Referring to fig. 2, there is shown a flow chart of steps of another interactive method embodiment of a map card of the present invention, applied to an intelligent cabin of a vehicle, the intelligent cabin having an on-board display unit displaying a virtual map interface, which may specifically include the steps of:
step 201, acquiring scene information and state information of the vehicle;
In the embodiment of the present invention, step 201 is similar to step 101 in the previous embodiment, and the detailed description will refer to step 101 in the previous embodiment, and will not be repeated here.
Step 202, judging whether to select the special map card object based on the state information;
Specifically, the candidate map card objects may include a general map card object and a special map card object, where the general map card object is related to a daily scene in which the user is located, such as a commute, a dinner, etc., and the special map card object is related to a state in which the vehicle is currently located, such as a navigation state.
And when the alternative map card is selected, the special map card object is selected with a priority greater than that of the normal map card object, namely, if the alternative map card object comprises the special map card object and the normal map card object, the special map card object is selected preferentially. Therefore, it is possible to determine whether the status information satisfies the requirement of extracting the special map card object.
Step 203, if yes, extracting a target special map card object from the special map card objects;
specifically, different special map card objects may be displayed in different vehicle states, and a target special map card object corresponding to the current vehicle state may be selected from a plurality of special map card objects.
In an alternative embodiment of the present invention, the status information of the vehicle includes navigation status information and shielding information, the special map card object includes a navigation card object and a return map card object, and the step 203 includes the following sub-steps:
Judging whether the vehicle is in a navigation state or not based on the navigation state information;
if yes, determining the target special map card object as a navigation card object;
If not, judging whether the virtual map interface is blocked by a preset control window or not based on the blocking information;
if the virtual map interface is blocked by a preset control window, determining that the target special map card object is a return map card object;
And extracting the target special map card object.
Because the virtual map interface uses the virtual map as a desktop, other control windows, such as a pop-up window set by the system or a Bluetooth settings pop-up, may be displayed above the virtual map and will block it while they are shown. It can therefore be detected whether a control window is currently displayed above the virtual map, shielding information is obtained, and whether the virtual map is blocked is judged according to the shielding information. Specifically, if the shielding information indicates that a control window is displayed above the virtual map, the virtual map interface is determined to be blocked, and the special map card object to be extracted can be determined to be a return map card object. The return map card object can contain the guide text "return to map" and a corresponding interaction control, and the user can trigger the interaction control to close all displayed control windows and return directly to the virtual map interface. If the virtual map interface is not blocked by a control window, whether the vehicle has started the navigation function can be judged according to the navigation state information; if the vehicle is in the navigation state, a navigation card object is provided for the user, and the current position, the destination, the distance to the destination and driving guidance text can be displayed in the navigation card object. The target special map card object determined to be displayed to the user is then extracted.
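A sketch of the special-card selection, following the ordering in the preceding paragraph and in claim 2 (occlusion checked first, then the navigation state). It reuses the types from the earlier sketch; the guide texts are illustrative assumptions.

```kotlin
fun selectSpecialCard(state: VehicleState): MapCardObject? = when {
    state.mapOccludedByWindow -> MapCardObject(
        guideText = "Return to map",
        controls = listOf(InteractionControl("Return") {
            // close the displayed control windows and go back to the virtual map interface
        }),
        isSpecial = true
    )
    state.isNavigating -> MapCardObject(
        guideText = "Navigating",   // a real card would show destination, remaining distance, guidance text
        controls = listOf(InteractionControl("Exit navigation") {
            // leave the navigation state
        }),
        isSpecial = true
    )
    else -> null   // no special card applies; fall back to the normal map cards
}
```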
Step 204, if not, extracting a target normal map card object from the normal map card objects based on the scene information;
when the state information of the vehicle does not meet the extraction condition of the special map card object, the user can be provided with the normal map card object, and therefore the target normal map card object corresponding to the scene in which the user is currently located is extracted from the plurality of normal map card objects based on the scene information.
In an alternative embodiment of the present invention, the scene information includes a current time and a current vehicle position, and the step 204 includes the following sub-steps:
judging whether the current time belongs to preset reminding time or not;
if yes, judging whether the current vehicle position belongs to a preset reminding position or not;
If the current vehicle position belongs to a preset reminding position, determining a normal map card object corresponding to the reminding time and the reminding position as a target normal map card object;
extracting the target normal map card object.
Specifically, the scene in which the user is located may be, for example, preparing to go to work in the morning, preparing to go home from work in the evening, or preparing to eat lunch at noon, and different normal map card objects can be provided for different scenes. Therefore, it is judged whether the current time belongs to a reminding time set in the intelligent cabin, for example 7:00-9:00 on working days. If so, it is judged whether the current vehicle position belongs to the reminding position corresponding to that reminding time, for example within 200 meters of home for the 7:00-9:00 working-day window. If the vehicle position also belongs to the reminding position, the normal map card object corresponding to the reminding time 7:00-9:00 on working days and the reminding position within 200 meters of home can be taken as the normal map card object to be displayed to the user, and that normal map card object is extracted.
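A sketch of the normal-card match described above: the current time must fall inside a preset reminding window and the vehicle must be inside the reminding area. The ReminderRule type, its field names and the distance approximation are assumptions for illustration, reusing the types from the earlier sketches.

```kotlin
import java.time.DayOfWeek
import java.time.LocalTime
import kotlin.math.cos
import kotlin.math.sqrt

// A reminder rule pairs a reminding time window and a reminding position
// (anchor point plus radius) with the normal map card to show when both match.
data class ReminderRule(
    val days: Set<DayOfWeek>,
    val start: LocalTime,
    val end: LocalTime,
    val anchor: GeoPoint,
    val radiusMeters: Double,
    val card: MapCardObject
)

fun selectNormalCard(scene: SceneInfo, rules: List<ReminderRule>): MapCardObject? =
    rules.firstOrNull { rule ->
        scene.currentTime.dayOfWeek in rule.days &&
            scene.currentTime.toLocalTime() in rule.start..rule.end &&
            distanceMeters(scene.currentPosition, rule.anchor) <= rule.radiusMeters
    }?.card

// Rough equirectangular distance; adequate for radius checks of a few hundred metres.
fun distanceMeters(a: GeoPoint, b: GeoPoint): Double {
    val r = 6_371_000.0
    val dLat = Math.toRadians(b.lat - a.lat)
    val dLon = Math.toRadians(b.lon - a.lon) * cos(Math.toRadians((a.lat + b.lat) / 2))
    return r * sqrt(dLat * dLat + dLon * dLon)
}
```

Under this sketch, the 7:00-9:00 working-day window anchored within 200 meters of home described above would be one ReminderRule whose card carries the "navigate to the company" control.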
Step 205, displaying the candidate map card object in the virtual map interface;
After the candidate map card object is extracted, it is displayed in the virtual map interface. Specifically, to make it easy for the user to observe the virtual map interface, the candidate map card object may be displayed on the left side or the right side of the virtual map interface, and the user can view the displayed candidate map card object. Taking the normal map card object corresponding to a reminding time of 7:00-9:00 and a reminding position within 200 meters of home as an example, this normal map card object corresponds to a going-to-work scene, so it can display the guide text "navigate to the company", an interaction control for starting the navigation function, and an interaction control for searching for a nearby parking lot. It should be noted that when neither the scene information nor the state information of the vehicle meets the condition for extracting a candidate map card object, a default map card object can be displayed in the virtual map interface, which shows guide text set by default in the intelligent cabin, such as "Hello, where would you like to go?", and an interaction control for searching for a nearby parking lot.
And step 206, triggering corresponding alternative map card objects in response to the operation of the interaction control.
Specifically, the functions triggered by the interaction controls set in different candidate map cards are different. Taking the normal map card object corresponding to the reminding time 7:00-9:00 on working days and the reminding position within 200 meters of home as an example, this normal map card object is provided with two interaction controls: one, when triggered, directly opens the navigation function with the company as the destination, and the other is used for searching for nearby parking lots. Other candidate map card objects can be provided with interaction controls for other functions. For example, the candidate map card object provided when the user leaves work can, once its control is triggered, enter a navigation state with home as the destination, and the candidate map card object provided when the user prepares to eat lunch can, once its control is triggered, search for nearby restaurants and display them. The user triggers the function set in a candidate map card object by triggering its interaction control.
In an alternative embodiment of the present invention, the candidate map card object is a navigation card object, and the step 206 further includes the following sub-steps:
And responding to the operation of the interaction control, and exiting the navigation state.
When the vehicle is in the navigation state, the displayed alternative map card object becomes a navigation card object, an interaction control for exiting the navigation state is provided in the navigation card object, and the user directly triggers the interaction control to exit the navigation state.
In an alternative embodiment of the present invention, the method further comprises:
Receiving user-defined time and user-defined position input by a user;
And generating a first alternative map card object by adopting the custom time and the custom position.
In order to meet the personal requirements of different users, the intelligent cabin can also accept a user-defined time and a user-defined position input by the user, and then generate a candidate map card object using the user-defined time and position. When the current time matches the user-defined time, the generated candidate map card object can be extracted and displayed. An interaction control for entering navigation can be arranged in this candidate map card object; after the user triggers it, the vehicle directly enters a navigation state with the user-defined position as the destination, realizing a one-touch navigation function.
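A sketch of generating the "first candidate map card object" from a user-defined time and position, reusing the earlier types. startNavigationTo and the card text are hypothetical placeholders for the one-touch navigation behavior described above.

```kotlin
data class CustomCardRule(
    val customTime: java.time.LocalTime,
    val customPosition: GeoPoint,
    val card: MapCardObject
)

fun startNavigationTo(destination: GeoPoint) {
    // placeholder: switch the vehicle into a navigation state with this destination
}

fun buildCustomCard(customTime: java.time.LocalTime, customPosition: GeoPoint): CustomCardRule =
    CustomCardRule(
        customTime = customTime,
        customPosition = customPosition,
        card = MapCardObject(
            guideText = "Navigate to your saved place",
            controls = listOf(
                InteractionControl("Go") { startNavigationTo(customPosition) }
            )
        )
    )
```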
In an alternative embodiment of the present invention, the intelligent cabin is connected with a mobile terminal, and the method further comprises:
receiving schedule information sent by the mobile terminal; the schedule information comprises schedule time and reminding item text;
And generating a second alternative map card object by adopting the schedule information.
The intelligent cabin can be connected over a wireless link with a mobile terminal used by the user, and the user can set schedule information on the mobile terminal, for example schedule time: Tuesday 16:40; reminder item: pick up the child from school. After the schedule information is sent to the intelligent cabin, the intelligent cabin generates a candidate map card object from the schedule information, so that when the current time reaches the schedule time, a candidate map card object containing the reminder item is displayed in the virtual map interface.
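A sketch of the "second candidate map card object" generated from schedule information pushed by the paired mobile terminal, again reusing the earlier types. The ScheduleInfo field names are assumptions mirroring the schedule time and reminder text described above.

```kotlin
data class ScheduleInfo(
    val scheduleTime: java.time.LocalDateTime,   // e.g. Tuesday 16:40
    val reminderText: String                     // e.g. "Pick up the child from school"
)

fun buildScheduleCard(info: ScheduleInfo): MapCardObject =
    MapCardObject(
        guideText = info.reminderText,
        controls = listOf(
            InteractionControl("OK") {
                // dismiss the reminder or open a related function, as configured
            }
        )
    )

// The cabin would display the generated card once the current time reaches the schedule time.
fun isScheduleDue(info: ScheduleInfo, now: java.time.LocalDateTime): Boolean =
    !now.isBefore(info.scheduleTime)
```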
In the embodiment of the invention, scene information and state information of the vehicle are acquired; whether a special map card object should be selected is judged based on the state information; if so, a target special map card object is extracted from the special map card objects, and if not, a target normal map card object is extracted from the normal map card objects based on the scene information; the candidate map card object is displayed in the virtual map interface; and the corresponding candidate map card object is triggered in response to an operation on the interaction control. The scenes to which map cards correspond are thus more varied, a map card related to the user's current scene can be displayed according to that specific scene, and the intelligent cabin can judge the user's intention so as to provide a map card that meets the user's needs, giving a better user experience.
To help those skilled in the art further understand the invention, the invention is described below by way of application scenario examples.
Scene one
At 10:20 on a working day, a user enters the driver's seat of the intelligent cabin and starts the intelligent cabin. The intelligent cabin obtains the current scene information and the state information of the vehicle, determines from the state information that the vehicle is not in a navigation state and that no control window is blocking the virtual map interface, finds no normal map card object corresponding to the current scene information, and therefore decides to display the default candidate map card object to the user. FIG. 3 is a schematic diagram of the virtual map interface, where 301 is the virtual map interface, 302 is the displayed default candidate map card object carrying the guide text "Good morning, where would you like to go?", and 303 is an interaction control for searching surrounding parking lots; the user can search for parking lots around the vehicle by tapping this interaction control and select one for parking.
Scene two
At 10:20 on a working day, a user enters the driver's seat of the intelligent cabin and starts the intelligent cabin, and the user opens a wireless-network selection window on the vehicle-mounted display component. The intelligent cabin obtains the current scene information and the state information of the vehicle, determines from the state information that a wireless-network selection window is blocking the virtual map interface, and concludes that the user may need to return to the virtual map interface, so a return map card object is displayed to the user. FIG. 4 is a schematic diagram of the virtual map interface displaying the return map card object, where 401 is the wireless-network selection window blocking the virtual map interface, 402 is the displayed return map card object carrying the guide text "return to desktop", and 403 and 404 are interaction controls identical to those set in the candidate map card object displayed when the virtual map interface is not blocked; the user can close the wireless-network selection window and return to the virtual map interface by tapping the return map card object.
Scene three
At 10:20 on a working day, a user enters the driver's seat of the intelligent cabin and starts the intelligent cabin, the user starts the navigation function, and the vehicle enters the navigation state. The intelligent cabin obtains the current scene information and the state information of the vehicle, determines from the state information that the vehicle is in the navigation state and that no control window is blocking the virtual map interface, and therefore displays a navigation card object to the user. FIG. 5 is a schematic diagram of the virtual map interface displaying the navigation card object, where 501 is the displayed navigation card object, on which the navigation destination, remaining distance and the like are displayed, and 502 is an interaction control for exiting navigation; the user can take the vehicle out of the navigation state by tapping this interaction control.
Scene four
At 8:20 on a working day, the user walks to the vehicle parked 150 meters from home, enters the driver's seat of the intelligent cabin and starts the intelligent cabin. The intelligent cabin obtains the current scene information and the state information of the vehicle, determines from the state information that the vehicle is not in a navigation state and that no control window is blocking the virtual map interface, determines that the current time of 8:20 on a working day falls within the working-day reminding time of 6:30 to 9:30, and determines that the current vehicle position, 150 meters from home, falls within the reminding position of 0 to 200 meters from home. It therefore judges that the user is probably preparing to go to work and extracts the corresponding normal map card object. FIG. 6 is a schematic diagram of the virtual map interface displaying the normal map card object, where 601 is the displayed normal map card object, on which the guide text "navigate to the company" is displayed, 602 is an interaction control for starting navigation with the company as the destination, and 603 is an interaction control for searching for a nearby parking lot; the user can quickly open the navigation or parking-lot search function by tapping the corresponding interaction control.
Scene five
At 12:20 on a working day, the user comes out of the company and walks to the vehicle parked 200 meters from the company, enters the driver's seat of the intelligent cabin and starts the intelligent cabin. The intelligent cabin obtains the current scene information and the state information of the vehicle, determines from the state information that the vehicle is not in a navigation state and that no control window is blocking the virtual map interface, determines that the current time of 12:20 on a working day falls within the working-day reminding time of 11:30 to 13:30, and determines that the current vehicle position, 200 meters from the company, falls within the reminding position of 0 to 500 meters from the company. It therefore judges that the user probably needs to eat lunch and extracts the corresponding normal map card object. FIG. 7 is a schematic diagram of the virtual map interface, where 701 is the displayed normal map card object, on which the guide text "time for lunch" is displayed, 702 is an interaction control for searching for nearby restaurants, and 703 is an interaction control for searching for nearby parking lots; the user can quickly open the restaurant search or parking-lot search function by tapping the corresponding interaction control.
Scene six
At 18:20 on a working day, the user comes out of the company and walks to the vehicle parked 200 meters from the company, enters the driver's seat of the intelligent cabin and starts the intelligent cabin. The intelligent cabin obtains the current scene information and the state information of the vehicle, determines from the state information that the vehicle is not in a navigation state and that no control window is blocking the virtual map interface, determines that the current time of 18:20 on a working day falls within the working-day reminding time of 18:00 to 20:00, and determines that the current vehicle position, 200 meters from the company, falls within the reminding position of 0 to 500 meters from the company. It therefore judges that the user probably wants to go home after work and extracts the corresponding normal map card object. FIG. 8 is a schematic diagram of the virtual map interface displaying the normal map card object, where 801 is the displayed normal map card object, on which the guide text "navigate home" is displayed, 802 is an interaction control for starting navigation with home as the destination, and 803 is an interaction control for searching for a nearby parking lot; the user can quickly open the navigation or parking-lot search function by tapping the corresponding interaction control.
In the embodiment of the invention, different candidate map card objects are automatically displayed for the user according to whether the vehicle is in a navigation state, whether the virtual map interface is blocked, and the current time and position of the vehicle, so that the user can conveniently select the function to be used. The operation is simple, convenient and targeted, the efficiency with which the user completes everyday tasks is improved, and the user experience is greatly improved.
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred embodiments, and that the acts are not necessarily required by the embodiments of the invention.
Referring to fig. 9, there is shown a block diagram of an interaction device embodiment of a map card of the present invention, applied to an intelligent cabin of a vehicle, the intelligent cabin having a vehicle-mounted display component that displays a virtual map interface, and the device may include the following modules:
an acquisition module 901, configured to acquire scene information and status information of the vehicle;
an alternative map card object extraction module 902 for extracting an alternative map card object adapted to the scene information and the state information of the vehicle; the alternative map card object is provided with an interaction control;
an alternative map card object display module 903, configured to display the alternative map card object in the virtual map interface;
the alternative map card object triggering module 904 is used for triggering corresponding alternative map card objects in response to the operation of the interaction control.
In an embodiment of the present invention, the candidate map card objects include a general map card object and a special map card object, and the candidate map card object extraction module 902 includes:
A special map card object judging sub-module for judging whether to select the special map card object based on the state information;
The target special map card object extraction sub-module is used for extracting a target special map card object from the special map card object if the special map card object is selected;
And the target general map card object extraction sub-module is used for extracting the target general map card object from the general map card object based on the scene information if the special map card object is not selected.
In an embodiment of the present invention, the state information of the vehicle includes navigation state information and shielding information, the special map card object includes a navigation card object and a return map card object, and the target special map card object extraction submodule includes:
The shielding judging unit is used for judging whether the virtual map interface is shielded by a preset control window or not based on the shielding information;
The return map card object determining unit is used for determining that the target special map card object is a return map card object if the virtual map interface is blocked by a preset control window;
the navigation state judging unit is used for judging whether the vehicle is in a navigation state or not based on the navigation state information if the virtual map interface is not blocked by a preset control window;
the navigation card object determining unit is used for determining that the target special map card object is a navigation card object if the vehicle is in a navigation state;
and the first extraction unit is used for extracting the target special map card object.
In one embodiment of the present invention, the scene information includes a current time and a current vehicle position, and the target general map card object extraction submodule includes:
The current time judging unit is used for judging whether the current time belongs to preset reminding time or not;
the current vehicle position judging unit is used for judging whether the current vehicle position belongs to a preset reminding position or not if the current time belongs to the preset reminding time;
A target normal map card object determining unit, configured to determine a normal map card object corresponding to the reminding time and the reminding position as a target normal map card object if the current vehicle position belongs to a preset reminding position;
and the second extraction unit is used for extracting the target general map card object.
In an embodiment of the present invention, the candidate map card object is a navigation card object, and the candidate map card object triggering module 904 includes:
and the exit sub-module is used for responding to the operation of the interaction control and exiting the navigation state.
In an embodiment of the invention, the apparatus further comprises:
The first receiving module is used for receiving the user-defined time and the user-defined position input by the user;
And the first alternative map card object generation module is used for generating a first alternative map card object by adopting the custom time and the custom position.
In an embodiment of the invention, the apparatus further comprises:
the second receiving module is used for receiving schedule information sent by the mobile terminal; the schedule information comprises schedule time and reminding item text;
and the second alternative map card object generation module is used for generating a second alternative map card object by adopting the schedule information.
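A sketch of how the modules described above could be wired together, reusing the selection functions from the earlier code blocks. The class name and the display callback are assumptions; the callback stands in for the vehicle-mounted display component.

```kotlin
class MapCardInteractionDevice(
    private val display: (MapCardObject) -> Unit,
    private val rules: List<ReminderRule>
) {
    // Acquisition, extraction and display: special cards take priority over
    // normal cards, and a default card is used when nothing matches.
    fun refresh(scene: SceneInfo, state: VehicleState) {
        val card = selectSpecialCard(state)
            ?: selectNormalCard(scene, rules)
            ?: defaultCard()
        display(card)
    }

    // Trigger module: run the application function bound to the tapped control.
    fun onControlTapped(card: MapCardObject, controlIndex: Int) =
        card.controls[controlIndex].onTrigger()

    private fun defaultCard() = MapCardObject(
        guideText = "Hello, where would you like to go?",
        controls = listOf(InteractionControl("Find nearby parking") {
            // placeholder: search parking lots around the current position
        })
    )
}
```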
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
The embodiment of the invention also discloses a vehicle, which comprises:
One or more processors; and
One or more machine readable media having instructions stored thereon, which when executed by the one or more processors, cause the vehicle to perform one or more methods as described above.
The embodiment of the invention also discloses one or more machine-readable media having instructions stored thereon which, when executed by one or more processors, cause the processors to perform one or more of the methods described above.
In this specification, each embodiment is described in a progressive manner, each embodiment focuses on its differences from the other embodiments, and for identical or similar parts between the embodiments reference may be made to one another.
It will be apparent to those skilled in the art that embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the invention may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it is further noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or terminal device that comprises the element.
The map card interaction method, map card interaction device, vehicle and readable medium provided by the present invention have been described above in detail, and specific examples have been used herein to illustrate the principles and embodiments of the present invention; the above description of the examples is only intended to help understand the method and core idea of the present invention. Meanwhile, for those skilled in the art, there will be changes in the specific embodiments and application scope according to the ideas of the present invention. In summary, the contents of this specification should not be construed as limiting the present invention.

Claims (8)

1. An interaction method of map cards, applied to an intelligent cabin of a vehicle, the intelligent cabin having an on-board display component, the on-board display component displaying a virtual map interface, the method comprising:
acquiring scene information and state information of the vehicle;
extracting a candidate map card object adapted to the scene information and the state information of the vehicle; the candidate map card object is provided with an interaction control; the candidate map card object is further configured with an application function corresponding to the scene information; the candidate map card objects comprise normal map card objects and special map card objects;
displaying the candidate map card object in the virtual map interface;
triggering the corresponding candidate map card object in response to an operation on the interaction control;
wherein the step of extracting the candidate map card object adapted to the scene information and the state information of the vehicle comprises:
judging, based on the state information, whether to select a special map card object;
if so, extracting a target special map card object from the special map card objects;
if not, extracting a target normal map card object from the normal map card objects based on the scene information;
the method further comprising:
receiving a custom time and a custom position input by a user; and
generating a first candidate map card object using the custom time and the custom position.
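
The patent text itself contains no source code. As a minimal sketch of the two-branch selection flow recited in claim 1, the Python fragment below shows one way a cockpit application might decide between a special and a normal map card object; every name in it (MapCardObject, select_candidate_card, the dictionary keys) is hypothetical and not taken from the patent.

from dataclasses import dataclass

@dataclass
class MapCardObject:
    """A card shown on the virtual map interface (hypothetical model)."""
    kind: str         # "special" or "normal"
    name: str         # e.g. "navigation", "return", "morning_commute"
    interaction: str  # label of the card's interaction control

def select_candidate_card(scene_info: dict, state_info: dict) -> MapCardObject:
    """Extract the candidate map card adapted to the scene and vehicle state.

    Mirrors the branch in claim 1: if the vehicle state calls for a special
    card (active navigation or an occluded map), pick a special card;
    otherwise pick a normal card from the scene information.
    """
    if state_info.get("map_occluded") or state_info.get("navigating"):
        name = "return" if state_info.get("map_occluded") else "navigation"
        return MapCardObject(kind="special", name=name, interaction="tap")
    return MapCardObject(kind="normal",
                         name=scene_info.get("reminder", "default"),
                         interaction="tap")

# Example: the vehicle is navigating, so the navigation card is selected.
print(select_candidate_card(scene_info={"reminder": "morning_commute"},
                            state_info={"navigating": True, "map_occluded": False}))
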
2. The method of claim 1, wherein the state information of the vehicle comprises navigation state information and occlusion information, the special map card objects comprise a navigation card object and a return map card object, and the step of extracting the target special map card object from the special map card objects comprises:
judging, based on the occlusion information, whether the virtual map interface is occluded by a preset control window;
if so, determining that the target special map card object is the return map card object;
if not, judging, based on the navigation state information, whether the vehicle is in a navigation state;
if the vehicle is in the navigation state, determining that the target special map card object is the navigation card object; and
extracting the target special map card object.
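
A sketch of the special-card branch from claim 2, with the same caveat that the naming is invented: the occlusion check takes priority over the navigation check, so a map covered by a preset control window yields the return map card even while navigating.

from typing import Optional

def extract_special_card(occluded_by_window: bool, navigating: bool) -> Optional[str]:
    """Choose the target special map card in the order given by claim 2."""
    if occluded_by_window:
        return "return_map_card"   # map hidden behind a preset control window
    if navigating:
        return "navigation_card"   # vehicle is in a navigation state
    return None                    # no special card applies

# Occlusion wins over navigation; a covered map still shows the return card.
assert extract_special_card(occluded_by_window=True, navigating=True) == "return_map_card"
assert extract_special_card(occluded_by_window=False, navigating=True) == "navigation_card"
assert extract_special_card(occluded_by_window=False, navigating=False) is None
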
3. The method of claim 1, wherein the scene information comprises a current time and a current vehicle position, and the step of extracting a target normal map card object from the normal map card objects based on the scene information comprises:
judging whether the current time belongs to a preset reminding time;
if so, judging whether the current vehicle position belongs to a preset reminding position;
if the current vehicle position belongs to the preset reminding position, determining that the normal map card object corresponding to the reminding time and the reminding position is the target normal map card object; and
extracting the target normal map card object.
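
A sketch of the normal-card matching from claim 3, again with invented names and a deliberately crude distance check: a reminder-style card becomes the target normal card only when the current time falls inside the preset reminding window and the current vehicle position is close enough to the preset reminding position.

from datetime import datetime, time
from math import hypot
from typing import Optional, Tuple

def match_normal_card(now: datetime,
                      position: Tuple[float, float],
                      reminder_window: Tuple[time, time],
                      reminder_position: Tuple[float, float],
                      radius_km: float = 1.0) -> Optional[str]:
    """Return the target normal map card when both time and position match."""
    start, end = reminder_window
    if not (start <= now.time() <= end):
        return None                       # current time outside the reminding time
    # Rough flat-earth distance in kilometres; a real system would use a geodesic formula.
    dx = (position[0] - reminder_position[0]) * 111.0
    dy = (position[1] - reminder_position[1]) * 111.0
    if hypot(dx, dy) > radius_km:
        return None                       # vehicle not at the reminding position
    return "reminder_map_card"            # both conditions met

# Example: 08:15 near the preset position matches an 08:00-09:00 reminder.
print(match_normal_card(datetime(2020, 11, 24, 8, 15),
                        (23.1291, 113.2644),
                        (time(8, 0), time(9, 0)),
                        (23.1300, 113.2650)))
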
4. The method of claim 2, wherein the candidate map card object is the navigation card object, and the step of triggering the corresponding candidate map card object in response to the operation on the interaction control comprises:
exiting the navigation state in response to the operation on the interaction control.
5. The method of claim 1, wherein the intelligent cabin is connected to a mobile terminal, and the method further comprises:
receiving schedule information sent by the mobile terminal, the schedule information comprising a schedule time and reminder item text; and
generating a second candidate map card object using the schedule information.
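
Claims 1 and 5 describe two sources of cards beyond the scene/state matching: a first candidate map card built from a user-defined time and position, and a second candidate map card built from schedule information sent by a connected mobile terminal. A small hypothetical sketch of both constructors:

from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class CandidateCard:
    title: str
    when: datetime
    where: Optional[str] = None

def card_from_custom_input(custom_time: datetime, custom_position: str) -> CandidateCard:
    """First candidate map card, from user-defined time and position (claim 1)."""
    return CandidateCard(title=f"Go to {custom_position}",
                         when=custom_time, where=custom_position)

def card_from_schedule(schedule_time: datetime, reminder_text: str) -> CandidateCard:
    """Second candidate map card, from schedule info sent by the mobile terminal (claim 5)."""
    return CandidateCard(title=reminder_text, when=schedule_time)

print(card_from_custom_input(datetime(2020, 11, 24, 18, 30), "office parking lot"))
print(card_from_schedule(datetime(2020, 11, 25, 9, 0), "Project review meeting"))
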
6. An interaction device for map cards, applied to an intelligent cabin of a vehicle, the intelligent cabin having an on-board display component, the on-board display component displaying a virtual map interface, wherein the device comprises:
an acquisition module, configured to acquire scene information and state information of the vehicle;
a candidate map card object extraction module, configured to extract a candidate map card object adapted to the scene information and the state information of the vehicle; the candidate map card object is provided with an interaction control; the candidate map card object is further configured with an application function corresponding to the scene information; the candidate map card objects comprise normal map card objects and special map card objects;
a candidate map card object display module, configured to display the candidate map card object in the virtual map interface;
a candidate map card object triggering module, configured to trigger the corresponding candidate map card object in response to an operation on the interaction control;
wherein the candidate map card object extraction module comprises:
a special map card object judging sub-module, configured to judge, based on the state information, whether to select a special map card object;
a target special map card object extraction sub-module, configured to extract a target special map card object from the special map card objects if the special map card object is selected;
a target normal map card object extraction sub-module, configured to extract a target normal map card object from the normal map card objects based on the scene information if the special map card object is not selected;
the device further comprising:
a first receiving module, configured to receive a custom time and a custom position input by a user; and
a first candidate map card object generation module, configured to generate a first candidate map card object using the custom time and the custom position.
7. A vehicle, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon which, when executed by the one or more processors, cause the vehicle to perform the method of one or more of claims 1-5.
8. One or more machine-readable media having instructions stored thereon which, when executed by one or more processors, cause the processors to perform the method of one or more of claims 1-5.
CN202011332251.1A 2020-11-24 2020-11-24 Interaction method and device of map card, vehicle and readable medium Active CN112525214B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011332251.1A CN112525214B (en) 2020-11-24 2020-11-24 Interaction method and device of map card, vehicle and readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011332251.1A CN112525214B (en) 2020-11-24 2020-11-24 Interaction method and device of map card, vehicle and readable medium

Publications (2)

Publication Number Publication Date
CN112525214A (en) 2021-03-19
CN112525214B (en) 2024-05-28

Family

ID=74993460

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011332251.1A Active CN112525214B (en) 2020-11-24 2020-11-24 Interaction method and device of map card, vehicle and readable medium

Country Status (1)

Country Link
CN (1) CN112525214B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113419800B (en) * 2021-06-11 2023-03-24 北京字跳网络技术有限公司 Interaction method, device, medium and electronic equipment
CN113602090A (en) * 2021-08-03 2021-11-05 岚图汽车科技有限公司 Vehicle control method, device and system
CN114374765B (en) * 2021-12-16 2024-04-19 浙江零跑科技股份有限公司 Method for realizing mobile phone schedule information reminding by intelligent cabin
CN114895814A (en) * 2022-06-16 2022-08-12 广州小鹏汽车科技有限公司 Interaction method of vehicle-mounted system, vehicle and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103442072A (en) * 2013-09-02 2013-12-11 百度在线网络技术(北京)有限公司 Method and device for pushing traffic information related to user schedules
CN108267142A (en) * 2016-12-30 2018-07-10 上海博泰悦臻电子设备制造有限公司 A kind of navigation display method based on address card, system and a kind of vehicle device
CN108845736A (en) * 2018-06-12 2018-11-20 苏州思必驰信息科技有限公司 Exchange method and system for vehicle-mounted voice system
CN110095133A (en) * 2019-04-30 2019-08-06 广州小鹏汽车科技有限公司 Road conditions based reminding method, device, vehicle, computer equipment and its storage medium
WO2020186897A1 (en) * 2019-03-18 2020-09-24 北京无限光场科技有限公司 Information processing method and apparatus
CN111722905A (en) * 2020-06-28 2020-09-29 广州小鹏车联网科技有限公司 Interaction method, information processing method, vehicle and server
CN111722825A (en) * 2020-06-28 2020-09-29 广州小鹏车联网科技有限公司 Interaction method, information processing method, vehicle and server
CN111768779A (en) * 2020-06-28 2020-10-13 广州小鹏车联网科技有限公司 Interaction method, information processing method, vehicle and server

Also Published As

Publication number Publication date
CN112525214A (en) 2021-03-19

Similar Documents

Publication Publication Date Title
CN112525214B (en) Interaction method and device of map card, vehicle and readable medium
CN104838335B (en) Use the interaction and management of the equipment of gaze detection
US9656690B2 (en) System and method for using gestures in autonomous parking
US10209853B2 (en) System and method for dialog-enabled context-dependent and user-centric content presentation
EP2223046B1 (en) Multimode user interface of a driver assistance system for inputting and presentation of information
CN110211586A (en) Voice interactive method, device, vehicle and machine readable media
CN113715811B (en) Parking method, parking device, vehicle, and computer-readable storage medium
US11820228B2 (en) Control system and method using in-vehicle gesture input
US20180081507A1 (en) User interface for accessing a set of functions, procedures, and computer readable storage medium for providing a user interface for access to a set of functions
CN113302664A (en) Multimodal user interface for a vehicle
WO2014070872A2 (en) System and method for multimodal interaction with reduced distraction in operating vehicles
WO2022062491A1 (en) Vehicle-mounted smart hardware control method based on smart cockpit, and smart cockpit
CN112590807B (en) Vehicle control card interaction method and device for vehicle components
CN111722905A (en) Interaction method, information processing method, vehicle and server
CN111722825A (en) Interaction method, information processing method, vehicle and server
KR102286569B1 (en) Smart car see-through display control system and method thereof
CN112667084A (en) Control method and device for vehicle-mounted display screen, electronic equipment and storage medium
CN111324202A (en) Interaction method, device, equipment and storage medium
EP4086580A1 (en) Voice interaction method, apparatus and system, vehicle, and storage medium
CN116204253A (en) Voice assistant display method and related device
CN113961114A (en) Theme replacement method and device, electronic equipment and storage medium
CN113353065A (en) Interaction method and device based on automatic driving
WO2020200557A1 (en) Method and apparatus for interaction with an environment object in the surroundings of a vehicle
CN104019826A (en) Automatic navigation method and system based on touch control
WO2023153314A1 (en) In-vehicle equipment control device and in-vehicle equipment control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant