CN113450799A - Vehicle-mounted schedule management method, system, terminal and storage medium - Google Patents

Vehicle-mounted schedule management method, system, terminal and storage medium

Info

Publication number
CN113450799A
Authority
CN
China
Prior art keywords: schedule, vehicle, user, management method, terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110733775.XA
Other languages
Chinese (zh)
Inventor
姜杨阳
李志刚
李振龙
于振勇
孟庆贺
汤祺
刘思琪
于昊
王荫南
马文峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FAW Bestune Car Co Ltd
Original Assignee
FAW Bestune Car Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2021-06-30
Filing date
2021-06-30
Publication date
2021-09-28
Application filed by FAW Bestune Car Co Ltd
Priority to CN202110733775.XA
Publication of CN113450799A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00: Speech recognition
    • G10L15/22: Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00: Administration; Management
    • G06Q10/10: Office automation; Time management
    • G06Q10/109: Time management, e.g. calendars, reminders, meetings or time accounting
    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00: Speech recognition
    • G10L15/22: Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/223: Execution procedure of a spoken command

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Physics & Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Physics & Mathematics (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Economics (AREA)
  • Data Mining & Analysis (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a vehicle-mounted schedule management method, system, terminal and storage medium, belonging to the technical field of intelligent driving vehicles. The method comprises the following steps: when a user wake-up request is received, acquiring the voice data in the user wake-up request; converting the voice data into semantic data; determining a schedule-related intent from the semantic data; and executing the schedule operation corresponding to the user wake-up request according to the schedule-related intent. Based on the need to create and manage schedules in the user's in-vehicle scenario, the method realizes a voice-based schedule creation mode better suited to vehicle-mounted interaction. It also takes the strong association between in-vehicle schedules and navigation into account: the navigation function is integrated into the schedule creation process, so that navigation can be initiated directly when a schedule reminder is given, improving driving safety and the convenience of the interaction experience for the user.

Description

Vehicle-mounted schedule management method, system, terminal and storage medium
Technical Field
The invention discloses a vehicle-mounted schedule management method, a system, a terminal and a storage medium, and belongs to the technical field of intelligent vehicle driving.
Background
With the rapid development of Internet-of-Vehicles technology, the vehicle is gradually becoming a third space beyond the home and the workplace, and vehicle-mounted interaction is becoming increasingly intelligent. The traditional way of managing schedules through a mobile phone is unfriendly inside a vehicle. With the wide application of speech recognition technology in vehicle-mounted scenarios, combining voice, schedule management and navigation improves the safety and convenience of the driving experience.
Disclosure of Invention
The invention aims to solve the problems that manually adding a schedule involves complicated operations and that managing schedules through a mobile phone is unfriendly inside a vehicle, and provides a vehicle-mounted schedule management method, system, terminal and storage medium.
The above aim of the invention is achieved by the following technical solutions:
according to a first aspect of embodiments of the present invention, there is provided an onboard schedule management method, the method including:
when a user awakening request is received, acquiring voice data in the user awakening request;
converting the voice data into semantic data;
determining a schedule-related intent from the semantic data;
and executing the schedule operation corresponding to the user awakening request according to the schedule related intention.
Preferably, the determining the schedule-related intention through the semantic data includes:
determining whether the schedule-related intention is determined by the semantic data:
if yes, executing the next step;
and if not, forwarding to other processing flows for processing.
Preferably, the schedule-related intents include: a schedule query intent and a schedule creation intent.
Preferably, when the schedule-related intention is a schedule query intention:
judging whether an unexpired schedule exists:
if yes, executing the next step;
if not, finishing the judgment;
judging whether the schedule is due:
if yes, carrying out schedule reminding and executing the next step;
if not, finishing the judgment;
judging whether the schedule is associated with the destination:
if so, automatically initiating navigation;
otherwise, ending the judgment.
Preferably, when the schedule-related intention is a schedule creation intention:
judging whether the schedule creating condition is met:
if yes, executing the next step;
if not, finishing the judgment;
and creating a schedule through the user wake-up request.
Preferably, after the schedule is created, the method further comprises:
judging whether a schedule-related destination is added:
if yes, adding the destination;
otherwise, ending the judgment.
According to a second aspect of the embodiments of the present invention, there is provided an in-vehicle schedule management system, the system including:
an acquisition module, configured to acquire voice data in a user awakening request when the user awakening request is received;
the conversion module is used for converting the voice data into semantic data;
the judging module is used for determining the schedule related intention according to the semantic data;
and an execution module, configured to execute the schedule operation corresponding to the user awakening request according to the schedule-related intention.
Preferably, the determining module is configured to:
determining whether the schedule-related intention is determined by the semantic data:
if yes, executing the next step;
and if not, forwarding to other processing flows for processing.
According to a third aspect of the embodiments of the present invention, there is provided a terminal, including:
one or more processors;
a memory for storing instructions executable by the one or more processors;
wherein the one or more processors are configured to:
the method of the first aspect of the embodiments of the present invention is performed.
According to a fourth aspect of embodiments of the present invention, there is provided a non-transitory computer-readable storage medium, wherein instructions, when executed by a processor of a terminal, enable the terminal to perform the method of the first aspect of embodiments of the present invention.
According to a fifth aspect of embodiments of the present invention, there is provided an application program product, which, when running on a terminal, causes the terminal to perform the method of the first aspect of embodiments of the present invention.
Compared with the prior art, the invention has the beneficial effects that:
the patent provides a vehicle-mounted schedule management method, a system, a terminal and a storage medium, based on the requirements of schedule creation and management in a vehicle-mounted scene of a user, a voice schedule creation mode more conforming to vehicle-mounted interaction is realized, so that the user can complete schedule creation in a more convenient voice input mode instead of a manual input mode in the driving process, meanwhile, the strong association of vehicle-mounted scene schedule and navigation is considered, the navigation function is integrated into the schedule creation process, navigation can be directly initiated while schedule reminding is carried out, and the driving safety and convenient interaction experience of the user are improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a flow chart illustrating a method for on-board schedule management according to an exemplary embodiment;
FIG. 2 is a block diagram schematically illustrating the structure of an in-vehicle schedule management system according to an exemplary embodiment;
fig. 3 is a schematic block diagram of a terminal structure shown in accordance with an example embodiment.
Detailed Description
The technical solutions of the present invention will be described clearly and completely below with reference to the accompanying drawings. It should be understood that the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments herein without creative effort shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted" and "connected" are to be construed broadly, e.g., as a fixed connection, a removable connection or an integral connection; as a mechanical or an electrical connection; as a direct connection, an indirect connection through an intervening medium, or internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific circumstances.
The embodiment of the invention provides a vehicle-mounted schedule management method which is implemented by a terminal. The terminal may be a smart phone, a desktop computer, a notebook computer or the like, and at least comprises a CPU (Central Processing Unit), a voice acquisition device and the like.
Example one
Fig. 1 is a flowchart illustrating an in-vehicle schedule management method for use in a terminal according to an exemplary embodiment, the method including the steps of:
step 101, when a user wake-up request is received, acquiring voice data in the user wake-up request, wherein the specific contents include:
After the schedule management software is started, the system waits for the user to wake it up and provide voice input; when a user wake-up request is received, the voice data in the user wake-up request is acquired.
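For illustration only, this waiting step can be sketched in Python as below; the wake-word detector and the utterance recorder are assumed callables, since the patent does not name a concrete speech engine.

def wait_for_wake_up(detect_wake_word, record_utterance) -> bytes:
    """Block until the user wakes the system up, then capture the spoken request."""
    while True:
        if detect_wake_word():         # e.g. a keyword-spotting engine fires
            return record_utterance()  # raw voice data of the user wake-up request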
Step 102, converting the voice data into semantic data, wherein the specific content comprises:
the synchronous cloud finishes semantic understanding, and the cloud returns semantic data understanding results to the system.
Step 103, determining the schedule related intention according to the semantic data, wherein the specific contents are as follows:
determining whether the schedule-related intention is determined by the semantic data:
if yes, executing the next step;
and if not, forwarding to other processing flows for processing.
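This dispatch can be read as the short sketch below; the intent labels and the dictionary layout of the semantic data are assumptions made for illustration.

# Assumed labels for the two schedule-related intents handled in step 104.
SCHEDULE_INTENTS = {"schedule_query", "schedule_create"}

def route_intent(semantic_data: dict):
    """Return the schedule-related intent, or None if the request is not schedule-related."""
    intent = semantic_data.get("intent")
    if intent in SCHEDULE_INTENTS:
        return intent   # handled by the schedule operation of step 104
    return None         # forwarded to other processing flows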
Step 104, executing the schedule operation corresponding to the user awakening request according to the schedule-related intention, wherein the specific contents are as follows:
the schedule-related intents include: query intent and create intent.
When the schedule-related intention is a schedule query intention:
judging whether an unexpired schedule exists:
if yes, executing the next step;
if not, finishing the judgment;
judging whether the schedule is due:
if yes, carrying out schedule reminding and executing the next step;
if not, finishing the judgment;
judging whether the schedule is associated with the destination:
if so, automatically initiating navigation;
otherwise, ending the judgment.
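Read as code, this query branch forms the decision chain sketched below; the Schedule structure, the 30-minute reminder window and the remind/start_navigation helpers are assumptions made for illustration and are not fixed by the patent.

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class Schedule:
    title: str
    due_time: datetime
    destination: Optional[str] = None   # optional place associated with the schedule

def handle_query_intent(schedules: List[Schedule], now: datetime,
                        remind, start_navigation) -> None:
    """Walk the decision chain of the schedule query intention."""
    pending = [s for s in schedules if s.due_time >= now]   # unexpired schedules
    if not pending:
        return                                              # no schedule, end of judgment
    for schedule in pending:
        # "Due" is read here as falling inside a reminder window; the 30-minute
        # lead time is an assumed value, not one fixed by the patent.
        if schedule.due_time - now > timedelta(minutes=30):
            continue                                        # not due yet
        remind(schedule)                                    # schedule reminder
        if schedule.destination:                            # destination associated with the schedule
            start_navigation(schedule.destination)          # automatically initiate navigation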
When the schedule-related intention is a schedule creation intention:
judging whether the schedule creating condition is met:
if yes, executing the next step;
if not, finishing the judgment;
and creating a schedule through the user wake-up request.
Judging whether a schedule-related destination is added:
if yes, adding destinations related to schedules;
otherwise, ending the judgment.
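A corresponding sketch of the creation branch, again under assumed slot names (time, event, destination) that the patent does not specify.

from typing import Optional

def handle_create_intent(semantic_data: dict, schedules: list) -> Optional[dict]:
    """Create a schedule from the understood utterance if the creation condition is met."""
    slots = semantic_data.get("slots", {})
    # Assumed creation condition: the utterance carries at least a time and an event.
    if "time" not in slots or "event" not in slots:
        return None                              # condition not met, end of judgment
    schedule = {"time": slots["time"], "event": slots["event"]}
    if "destination" in slots:                   # a schedule-related destination was mentioned
        schedule["destination"] = slots["destination"]
    schedules.append(schedule)                   # store the newly created schedule
    return schedule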
Example two
In an exemplary embodiment, there is also provided an in-vehicle schedule management system, as shown in fig. 2, including:
an obtaining module 210, configured to obtain, when a user wake-up request is received, voice data in the user wake-up request;
a conversion module 220, configured to convert the voice data into semantic data;
a judging module 230, configured to determine the schedule-related intention according to the semantic data;
and an execution module 240, configured to execute the schedule operation corresponding to the user wake-up request according to the schedule-related intention.
Preferably, the determining module is configured to:
determining whether the schedule-related intention is determined by the semantic data:
if yes, executing the next step;
and if not, forwarding to other processing flows for processing.
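Purely as an illustration of how the four modules of Fig. 2 cooperate (the class and method names are not taken from the patent), the pipeline can be sketched as follows.

class VehicleScheduleSystem:
    """Sketch of the acquisition / conversion / judging / execution pipeline."""

    def __init__(self, acquire, convert, judge, execute):
        self.acquire = acquire      # acquisition module 210
        self.convert = convert      # conversion module 220
        self.judge = judge          # judging module 230
        self.execute = execute      # execution module 240

    def on_wake_up_request(self, request) -> None:
        voice_data = self.acquire(request)          # obtain the voice data
        semantic_data = self.convert(voice_data)    # voice data -> semantic data
        intent = self.judge(semantic_data)          # schedule-related intent, or None
        if intent is not None:
            self.execute(intent, semantic_data)     # perform the schedule operation
        # otherwise the request is forwarded to other processing flows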
EXAMPLE III
Fig. 3 is a block diagram of a terminal according to an embodiment of the present application, where the terminal may be the terminal in the foregoing embodiment. The terminal 300 may be a portable mobile terminal such as: smart phones, tablet computers. The terminal 300 may also be referred to by other names such as user equipment, portable terminal, etc.
Generally, the terminal 300 includes: a processor 301 and a memory 302.
The processor 301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 301 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 301 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 301 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 302 may include one or more computer-readable storage media, which may be tangible and non-transitory. Memory 302 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 302 is used to store at least one instruction for execution by processor 301 to implement the on-board schedule management methods provided herein.
In some embodiments, the terminal 300 may further include: a peripheral interface 303 and at least one peripheral. Specifically, the peripheral device includes: at least one of radio frequency circuitry 304, touch display screen 305, camera 306, audio circuitry 307, positioning components 308, and power supply 309.
The peripheral interface 303 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 301 and the memory 302. In some embodiments, processor 301, memory 302, and peripheral interface 303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 301, the memory 302 and the peripheral interface 303 may be implemented on a separate chip or circuit board, which is not limited by the embodiment.
The Radio Frequency circuit 304 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 304 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 304 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 304 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 304 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 304 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The touch display screen 305 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. Touch display screen 305 also has the ability to capture touch signals on or over the surface of touch display screen 305. The touch signal may be input to the processor 301 as a control signal for processing. The touch screen display 305 is used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the touch display screen 305 may be one, providing the front panel of the terminal 300; in other embodiments, the touch display screen 305 may be at least two, respectively disposed on different surfaces of the terminal 300 or in a folded design; in still other embodiments, the touch display 305 may be a flexible display disposed on a curved surface or on a folded surface of the terminal 300. Even more, the touch screen display 305 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The touch Display screen 305 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera assembly 306 is used to capture images or video. Optionally, camera assembly 306 includes a front camera and a rear camera. Generally, a front camera is used for realizing video call or self-shooting, and a rear camera is used for realizing shooting of pictures or videos. In some embodiments, the number of the rear cameras is at least two, and each of the rear cameras is any one of a main camera, a depth-of-field camera and a wide-angle camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize a panoramic shooting function and a VR (Virtual Reality) shooting function. In some embodiments, camera assembly 306 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
Audio circuit 307 is used to provide an audio interface between the user and terminal 300. Audio circuitry 307 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 301 for processing or inputting the electric signals to the radio frequency circuit 304 to realize voice communication. The microphones may be provided in plural numbers, respectively, at different portions of the terminal 300 for the purpose of stereo sound collection or noise reduction. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 301 or the radio frequency circuitry 304 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 307 may also include a headphone jack.
The positioning component 308 is used to locate the current geographic location of the terminal 300 to implement navigation or LBS (Location Based Service). The positioning component 308 may be a positioning component based on the GPS (Global Positioning System) of the United States, the Beidou system of China, or the Galileo system of the European Union.
The power supply 309 is used to supply power to the various components in the terminal 300. The power source 309 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When the power source 309 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 300 also includes one or more sensors 310. The one or more sensors 310 include, but are not limited to: acceleration sensor 311, gyro sensor 312, pressure sensor 313, fingerprint sensor 314, optical sensor 315, and proximity sensor 316.
The acceleration sensor 311 may detect the magnitude of acceleration in three coordinate axes of a coordinate system established with the terminal 300. For example, the acceleration sensor 311 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 301 may control the touch display screen 305 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 311. The acceleration sensor 311 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 312 may detect a body direction and a rotation angle of the terminal 300, and the gyro sensor 312 may cooperate with the acceleration sensor 311 to acquire a 3D (3Dimensions, three-dimensional) motion of the user with respect to the terminal 300. The processor 301 may implement the following functions according to the data collected by the gyro sensor 312: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 313 may be disposed on a side bezel of the terminal 300 and/or an underlying layer of the touch display screen 305. When the pressure sensor 313 is disposed at the side frame of the terminal 300, a user's grip signal of the terminal 300 can be detected, and left-right hand recognition or shortcut operation can be performed according to the grip signal. When the pressure sensor 313 is disposed at the lower layer of the touch display screen 305, the operability control on the UI interface can be controlled according to the pressure operation of the user on the touch display screen 305. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 314 is used for collecting a fingerprint of a user to identify the identity of the user according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, processor 301 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 314 may be disposed on the front, back, or side of the terminal 300. When a physical button or a vendor Logo is provided on the terminal 300, the fingerprint sensor 314 may be integrated with the physical button or the vendor Logo.
The optical sensor 315 is used to collect the ambient light intensity. In one embodiment, the processor 301 may control the display brightness of the touch screen display 305 based on the ambient light intensity collected by the optical sensor 315. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 305 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 305 is turned down. In another embodiment, the processor 301 may also dynamically adjust the shooting parameters of the camera head assembly 306 according to the ambient light intensity collected by the optical sensor 315.
A proximity sensor 316, also known as a distance sensor, is typically provided on the front face of the terminal 300. The proximity sensor 316 is used to collect the distance between the user and the front surface of the terminal 300. In one embodiment, when the proximity sensor 316 detects that the distance between the user and the front surface of the terminal 300 gradually decreases, the processor 301 controls the touch display screen 305 to switch from the bright-screen state to the screen-off state; when the proximity sensor 316 detects that the distance between the user and the front surface of the terminal 300 gradually increases, the processor 301 controls the touch display screen 305 to switch from the screen-off state to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 3 is not intended to be limiting of terminal 300 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
Example four
In an exemplary embodiment, there is also provided a computer-readable storage medium on which a computer program is stored, which when executed by a processor, implements the in-vehicle schedule management method as provided in all inventive embodiments of the present application.
Any combination of one or more computer-readable media may be employed. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
EXAMPLE five
In an exemplary embodiment, an application program product is also provided, which includes one or more instructions executable by the processor 301 of the apparatus to perform the onboard schedule management method.
While embodiments of the invention have been disclosed above, they are not limited to the applications set forth in the specification and the embodiments, and can be applied in all kinds of fields suitable for the present invention. Additional modifications will readily occur to those skilled in the art. The invention is therefore not limited to the exact details and illustrations described and shown herein, but falls within the scope of the appended claims and their equivalents.

Claims (10)

1. A vehicle-mounted schedule management method is characterized by comprising the following steps:
when a user awakening request is received, acquiring voice data in the user awakening request;
converting the voice data into semantic data;
determining a schedule-related intent from the semantic data;
and executing the schedule operation corresponding to the user awakening request according to the schedule related intention.
2. The vehicle-mounted schedule management method according to claim 1, wherein the determining a schedule-related intent from the semantic data comprises:
determining whether the schedule-related intention is determined by the semantic data:
if yes, executing the next step;
and if not, forwarding to other processing flows for processing.
3. The vehicle-mounted schedule management method according to claim 1, wherein the schedule-related intents comprise: a schedule query intent and a schedule creation intent.
4. The vehicle-mounted schedule management method according to claim 3, wherein when the schedule-related intention is a schedule query intention;
judging whether an unexpired schedule exists:
if yes, executing the next step;
if not, finishing the judgment;
judging whether the schedule is due:
if yes, carrying out schedule reminding and executing the next step;
if not, finishing the judgment;
judging whether the schedule is associated with the destination:
if so, automatically initiating navigation;
otherwise, ending the judgment.
5. The vehicle-mounted schedule management method according to claim 4, wherein when the schedule-related intention is a schedule creation intention;
judging whether the schedule creating condition is met:
if yes, executing the next step;
if not, finishing the judgment;
and creating a schedule through the user wake-up request.
6. The vehicle-mounted schedule management method according to claim 5, further comprising, after creating a schedule:
judging whether a schedule-related destination is added:
if yes, adding the destination;
otherwise, ending the judgment.
7. A vehicle-mounted schedule management system, characterized in that the system comprises:
an acquisition module, configured to acquire voice data in a user awakening request when the user awakening request is received;
the conversion module is used for converting the voice data into semantic data;
the judging module is used for determining the schedule related intention according to the semantic data;
and an execution module, configured to execute the schedule operation corresponding to the user awakening request according to the schedule-related intention.
8. The vehicle-mounted schedule management system of claim 7, wherein the judging module is configured to:
determining whether the schedule-related intention is determined by the semantic data:
if yes, executing the next step;
and if not, forwarding to other processing flows for processing.
9. A terminal, comprising:
one or more processors;
a memory for storing the one or more processor-executable instructions;
wherein the one or more processors are configured to:
perform the vehicle-mounted schedule management method according to any one of claims 1 to 6.
10. A non-transitory computer-readable storage medium, wherein instructions in the storage medium, when executed by a processor of a terminal, enable the terminal to perform the vehicle-mounted schedule management method according to any one of claims 1 to 6.
CN202110733775.XA 2021-06-30 2021-06-30 Vehicle-mounted schedule management method, system, terminal and storage medium Pending CN113450799A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110733775.XA CN113450799A (en) 2021-06-30 2021-06-30 Vehicle-mounted schedule management method, system, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110733775.XA CN113450799A (en) 2021-06-30 2021-06-30 Vehicle-mounted schedule management method, system, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN113450799A (en) 2021-09-28

Family

ID=77814372

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110733775.XA Pending CN113450799A (en) 2021-06-30 2021-06-30 Vehicle-mounted schedule management method, system, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113450799A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106600223A (en) * 2016-12-09 2017-04-26 奇酷互联网络科技(深圳)有限公司 Schedule creation method and device
CN112085463A (en) * 2020-08-11 2020-12-15 广州汽车集团股份有限公司 Vehicle-mounted voice schedule management device and method and vehicle-mounted terminal

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication
Application publication date: 20210928