WO2023125431A1 - A detection method and device - Google Patents

A detection method and device

Info

Publication number
WO2023125431A1
WO2023125431A1 · PCT/CN2022/141989
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
user
information
vehicle
time
Prior art date
Application number
PCT/CN2022/141989
Other languages
English (en)
French (fr)
Inventor
于金正
高翔宇
解文博
薛波
卓晓燕
陈维
詹舒飞
朱智超
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2023125431A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/145 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/145 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B5/14546 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue for measuring analytes not otherwise provided for, e.g. ions, cytochromes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/48 Biological material, e.g. blood, urine; Haemocytometers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30268 Vehicle interior

Definitions

  • the present application relates to the technical field of sensors, in particular to a detection method and device.
  • This application provides a detection method and device that, when the user faces problems related to driving and travel (such as drunk driving, fatigued driving, difficulty charging an electric vehicle, or belongings left behind in a vehicle), can offer travel tips or services to resolve those problems and improve the user experience.
  • the present application provides a detection method, including: acquiring physiological information parameters, a blood alcohol concentration parameter, and a collection time parameter indicating when the blood alcohol concentration parameter was collected; determining a predicted sobering-up time based on the physiological information parameters, the blood alcohol concentration parameter, and the collection time parameter, where the predicted sobering-up time indicates the time point at which the user's blood alcohol concentration falls below a threshold blood alcohol concentration; and displaying the predicted sobering-up time.
  • through the detection method provided by this application, the user can determine how long it takes to sober up after drinking, which helps prevent the user from driving while drunk and causing loss of life and property to themselves or others.
  • the physiological information parameters include one or more of weight, height, age, gender, sleep time, and sleep quality.
  • determining the predicted sobering-up time specifically includes: determining the predicted sobering-up time through an alcohol prediction model based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter.
  • before acquiring the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter, the method further includes: receiving a first input; acquiring these parameters then specifically includes: acquiring the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter in response to the first input.
  • before determining the predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter, the method further includes: receiving a second input; determining the predicted sobering-up time then specifically includes: in response to the second input, determining the predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter.
  • the predicted sobering-up time is determined by: acquiring alcohol intake parameters; and determining the predicted sobering-up time based on the physiological information parameters, the alcohol intake parameters, the blood alcohol concentration parameter, and the collection time parameter.
  • acquiring the alcohol intake parameters specifically includes: capturing an image of the consumed drink's container through a camera, and determining the alcohol intake parameters based on the container image.
  • the alcohol intake parameters include an alcohol concentration parameter and a drink volume parameter
  • the alcohol concentration parameter indicates the alcohol content of the drink consumed by the user
  • the drink volume parameter indicates the volume of the alcoholic drink consumed by the user.
  • the predicted sobering-up time is determined by: obtaining a predicted alcohol absorption rate and a predicted alcohol metabolism rate through the alcohol prediction model based on the alcohol intake parameters and the physiological information parameters; obtaining the correspondence between blood alcohol concentration and time based on the physiological information parameters, the alcohol intake parameters, the predicted alcohol absorption rate, and the predicted alcohol metabolism rate; and determining the predicted sobering-up time based on the blood alcohol concentration parameter, the collection time parameter, and the correspondence between blood alcohol concentration and time.
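The sobering-up prediction described above can be illustrated with a minimal sketch. It assumes a simple linear (Widmark-style) elimination model with a fixed hourly elimination rate `beta`; the function name, parameters, and default values are illustrative assumptions and are not part of the patent's alcohol prediction model, which also conditions on physiological parameters such as weight, age, and sleep quality.

```python
def predict_sober_time(bac, collection_hour, beta=0.015, threshold=0.02):
    """Estimate the hour at which blood alcohol concentration (BAC, in %)
    falls below `threshold`, assuming linear elimination of `beta` % per hour.

    `bac` is the measured BAC and `collection_hour` the time (in hours)
    at which it was collected.
    """
    if bac <= threshold:
        return collection_hour  # already below the threshold
    return collection_hour + (bac - threshold) / beta
```

For example, a BAC of 0.08% collected at hour 22 would, under these assumed constants, yield a predicted sobering-up time of about hour 26, i.e. four more hours of elimination.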
  • the present application provides another detection method, including: acquiring the user's behavior data; determining the user's fatigue level before driving based on the behavior data; determining the user's first recommended driving duration based on the fatigue level before driving; and displaying the first recommended driving duration.
  • users can obtain a recommended driving duration before driving and use it to determine how long they can drive. It can be understood that a recommended driving duration of zero indicates that the user is not fit to drive. In this way, the user can avoid fatigued driving and the harm it may cause to the life and property of the user or others.
  • acquiring the user's behavior data specifically includes: acquiring the user's travel time, and acquiring the user's behavior data at a first moment before the travel time, where the difference between the first moment and the travel time is a preset duration.
  • acquiring the user's travel time specifically includes: acquiring the user's schedule information, which includes one or more of ticket information, meeting information, and calendar information, and determining the user's travel time based on the schedule information.
  • the method further includes: acquiring the user's physical state data while the vehicle is being driven; determining the user's fatigue level during driving based on the physical state data; determining the user's final fatigue level based on the fatigue level during driving; determining a second recommended driving duration based on the final fatigue level; and displaying the second recommended driving duration.
  • the user can obtain the recommended driving time during driving to avoid fatigue driving.
  • determining the user's fatigue level during driving based on the user's physical state data specifically includes: determining the fatigue level during driving through a second fatigue model based on the physical state data, where the second fatigue model is trained on the user's historical physical state data.
  • determining the user's fatigue level during driving specifically includes: acquiring the user's in-vehicle driving data while the vehicle is being driven, and determining the fatigue level during driving based on the user's physical state data and the in-vehicle driving data.
  • determining the user's fatigue level during driving based on the user's physical state data and in-vehicle driving data specifically includes: determining the second fatigue model based on the physical state data, and determining the fatigue level during driving through the second fatigue model based on the in-vehicle driving data and the physical state data.
  • acquiring the user's behavior data specifically includes: acquiring the user's user data, which includes one or more of exercise duration, exercise intensity, and sleep duration, and determining the user's behavior data based on the user data.
  • determining the user's fatigue level before driving based on the user's behavior data specifically includes: determining the fatigue level before driving through a first fatigue model based on the behavior data, where the first fatigue model is trained on the user's historical behavior data.
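The pre-driving fatigue assessment above can be sketched as a toy rule that maps behavior data (sleep and exercise) to a fatigue level and then to a recommended driving duration. This is a hand-written stand-in for the trained first fatigue model; the scoring rule, weights, and thresholds are illustrative assumptions, not the patent's model.

```python
def fatigue_level(sleep_hours, exercise_hours):
    """Toy fatigue score in [0, 1]; higher means more fatigued.
    A real first fatigue model would be trained on historical behavior data."""
    sleep_deficit = max(0.0, 8.0 - sleep_hours) / 8.0   # shortfall vs. 8 h of sleep
    exertion = min(exercise_hours / 4.0, 1.0)           # cap heavy exercise at 4 h
    return min(1.0, 0.7 * sleep_deficit + 0.3 * exertion)

def recommended_driving_hours(level, max_hours=4.0):
    """Recommended driving duration; zero signals the user should not drive."""
    if level >= 0.8:
        return 0.0
    return round(max_hours * (1.0 - level), 1)
```

A fully rested user (8 h of sleep, no exertion) gets the full recommended duration, while a highly fatigued user gets zero, matching the zero-means-unfit interpretation described above.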
  • the present application provides another detection method, applied to a first communication system that includes a first electronic device and a second electronic device. The method includes: the first electronic device detects a passenger's boarding operation and obtains an in-car image from before the passenger boarded; the first electronic device establishes a communication connection with the second electronic device; the first electronic device detects the passenger's alighting operation and obtains an in-car image from after the passenger alighted; when the first electronic device determines, based on the in-car images from before boarding and after alighting, that the passenger's belongings are left in the car, it broadcasts first missing-item prompt information, which is used to remind the passenger that an item is left in the car; the first electronic device sends item-missing indication information to the second electronic device through the communication connection; and the second electronic device displays second missing-item prompt information based on the item-missing indication information, the second missing-item prompt information likewise reminding the passenger that the item is left in the car.
  • both the driver and the passenger can receive a reminder, preventing the passenger's belongings from being left in the car. Likewise, this avoids the time that the passenger and the driver would otherwise spend on retrieving the item.
  • the second electronic device is an electronic device with the strongest signal among all electronic devices detected by the first electronic device.
  • the method further includes: the first electronic device sends the movement information of the first electronic device to the second electronic device through the communication connection ;
  • when the second electronic device determines, based on the motion information of the first electronic device, that the motion state of the first electronic device is the same as that of the second electronic device, the second electronic device sends a confirmation success signaling to the first electronic device;
  • the first electronic device maintains the communication connection with the second electronic device after receiving the confirmation success signaling.
  • the first electronic device and the second electronic device are electronic devices in the same vehicle, so that passengers can receive the second missing prompt information.
  • the second electronic device determining, based on the motion information of the first electronic device, that the motion state of the first electronic device is the same as that of the second electronic device specifically includes: when the second electronic device determines N consecutive times that the motion information of the first electronic device is the same as that of the second electronic device, it determines that the motion state of the first electronic device is the same as the motion state of the second electronic device, where N is a positive integer.
  • the second electronic device determining, based on the motion information of the first electronic device, that the motion state of the first electronic device is the same as that of the second electronic device specifically includes: among M judgments of whether the motion information of the first electronic device and that of the second electronic device are the same, the motion information is the same at least N times, whereupon the second electronic device determines that the motion state of the first electronic device is the same as the motion state of the second electronic device, where N is less than or equal to M, and M and N are positive integers.
  • the motion information of the first electronic device is the same as the motion information of the second electronic device.
  • the method further includes: the second electronic device sends movement information of the second electronic device to the first electronic device through the communication connection;
  • when the first electronic device determines, based on the motion information of the second electronic device, that the motion state of the first electronic device is the same as that of the second electronic device, the first electronic device sends a confirmation success signaling to the second electronic device;
  • the second electronic device receives the confirmation success signaling and maintains the communication connection with the first electronic device.
  • the method further includes: when the second electronic device determines that the motion state of the first electronic device is different from the motion state of the second electronic device based on the motion information of the first electronic device, the second electronic device The device disconnects the communication connection with the first electronic device.
  • disconnecting the communication connection between the second electronic device and the first electronic device specifically includes: the second electronic device sends a confirmation failure signaling to the first electronic device; the first electronic device receives the confirmation failure signaling and disconnects the communication connection with the second electronic device.
  • the method further includes: the second electronic device broadcasts a communication connection request.
  • establishing a communication connection between the first electronic device and the second electronic device specifically includes: the second electronic device broadcasts a communication connection request; the first electronic device receives the communication connection request from the second electronic device and sends a communication connection response to the second electronic device; the second electronic device receives the communication connection response from the first electronic device and establishes a communication connection with the first electronic device.
  • establishing a communication connection between the first electronic device and the second electronic device specifically includes: after the first electronic device detects that a passenger has sat down in the car, the first electronic device receives the communication connection request broadcast by the second electronic device, sends a communication connection response to the second electronic device, and establishes a communication connection with the second electronic device.
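The motion-state matching used above to decide whether the two devices are in the same vehicle can be sketched as follows. Motion information is simplified here to a stream of scalar samples (for example, accelerometer magnitudes); the sample representation, tolerance, and default N are illustrative assumptions.

```python
def same_vehicle(motion_a, motion_b, n_required=3, tol=0.5):
    """Return True once the two devices' motion samples match
    `n_required` consecutive times, as in the N-consecutive-match rule."""
    streak = 0
    for a, b in zip(motion_a, motion_b):
        if abs(a - b) <= tol:
            streak += 1
            if streak >= n_required:
                return True
        else:
            streak = 0  # a mismatch resets the run of consecutive matches
    return False
```

When the function returns False, the devices' motion states differ, which in the method above would lead the second electronic device to send a confirmation failure signaling and disconnect.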
  • the present application provides another detection method, applied to a second communication system that includes a first electronic device, a server, and a charging device. The method includes: the first electronic device receives charging information of one or more charging stations; the first electronic device displays one or more charging station options based on the charging information, the options including a first charging station option; after receiving an input on the first charging station option, the first electronic device displays first navigation information, which indicates the route from the position of the first electronic device to the first charging station corresponding to the first charging station option; after the first electronic device arrives at the first charging station, the server obtains the parking position information of the first electronic device within the first charging station; the server sends the parking position information to the charging device; and after the charging device arrives at the location indicated by the parking position information, it charges the first electronic device.
  • the user can quickly obtain the charging service provided by the server, and the charging device can find the first electronic device by itself, and charge the first electronic device, reducing the user's charging operations.
  • the first electronic device displaying one or more charging station options based on the charging information of one or more charging stations specifically includes: the first electronic device determines the one or more charging station options based on the charging information of the one or more charging stations and the charging information of the first electronic device.
  • each of the one or more charging station options includes a charging price, a charging time, and an arrival distance
  • the charging price indicates the cost required to fully charge the first electronic device
  • the charging time indicates the time required for the first electronic device to be fully charged
  • the arrival distance indicates the distance between the first electronic device and the charging station corresponding to the charging station option.
  • the first electronic device receiving the charging information of one or more charging stations sent by the server specifically includes: when the first electronic device detects a to-be-charged scene, the first electronic device receives the charging information of one or more charging stations sent by the server.
  • the to-be-charged scene includes a low-battery scene and a parking-lot scene, where the low-battery scene is a scene in which the power of the first electronic device is below a preset power threshold, and the parking-lot scene is a scene in which the distance between the first electronic device and a nearby parking location is less than a specified distance threshold.
  • the first electronic device receiving the charging information of one or more charging stations sent by the server specifically includes: the first electronic device obtains the user's destination information, which includes the destination address and the route to the destination; after the first electronic device determines, according to the route to the destination, that its remaining power is lower than the power it would consume travelling to the destination address, the first electronic device receives the charging information of one or more charging stations sent by the server.
  • the server sending the parking position information to the charging device specifically includes: the server sends a start-charging request to the first electronic device; the first electronic device receives the start-charging request and displays a start-charging control; after the first electronic device receives a fourth input on the start-charging control, it sends a start-charging response to the server in response to the fourth input; and the server sends the parking position information to the charging device after receiving the start-charging response.
  • the communication system further includes a second electronic device, and after the charging device arrives at the location in the first charging station indicated by the parking position information and charges the first electronic device, the method further includes: sending vehicle charging information, which includes the electric quantity of the first electronic device, to the second electronic device; after receiving the vehicle charging information, the second electronic device displays vehicle charging prompt information, which is used to remind the user of the electric quantity of the first electronic device.
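The charging-station selection step can be sketched as a simple weighted ranking over the charging information (charging price, charging time, and arrival distance) that the server sends. The weighting scheme and field names are illustrative assumptions, not part of the claimed method.

```python
def rank_stations(stations, w_price=0.5, w_time=0.3, w_dist=0.2):
    """Order charging-station options by a weighted cost of price,
    charging time, and arrival distance (each normalised to its maximum);
    the cheapest overall option comes first."""
    def normalised(key):
        top = max(s[key] for s in stations) or 1  # avoid division by zero
        return {id(s): s[key] / top for s in stations}
    price, time_, dist = normalised("price"), normalised("time"), normalised("distance")
    def cost(s):
        return w_price * price[id(s)] + w_time * time_[id(s)] + w_dist * dist[id(s)]
    return sorted(stations, key=cost)
```

The first element of the returned list would play the role of the "first charging station option" displayed to the user.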
  • the present application provides a communication device, including one or more processors and one or more memories.
  • the one or more memories are coupled with the one or more processors and are used to store computer program code, the computer program code includes computer instructions, and when the one or more processors execute the computer instructions, the communication device is caused to perform the detection method in any possible implementation of any of the above aspects.
  • an embodiment of the present application provides a computer storage medium, including computer instructions.
  • when the computer instructions run on the communication device, the communication device is caused to execute the detection method in any possible implementation manner of any one of the above aspects.
  • the embodiment of the present application provides a computer program product, which, when the computer program product is run on a computer, causes the computer to execute the detection method in any possible implementation manner of any one of the above aspects.
  • FIG. 1 is a schematic structural diagram of an electronic device 100 provided in an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a communication system provided in an embodiment of the present application.
  • FIG. 3 is a schematic diagram of the modules of an electronic device 100 provided in an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a blood alcohol concentration-time curve provided in an embodiment of the present application.
  • FIGS. 5A-5H are a set of schematic diagrams of interfaces provided in an embodiment of the present application.
  • FIGS. 6A-6B are another set of schematic diagrams of interfaces provided in an embodiment of the present application.
  • FIG. 7 is a schematic flowchart of a detection method provided in an embodiment of the present application.
  • FIG. 8 is a schematic diagram of another communication system provided in an embodiment of the present application.
  • FIG. 9 is a schematic block diagram of another electronic device 100 provided in an embodiment of the present application.
  • FIG. 10 is a schematic flowchart of another detection method provided in an embodiment of the present application.
  • FIG. 11 is a schematic diagram of an application scenario provided in an embodiment of the present application.
  • FIG. 12 is a schematic diagram of another application scenario provided in an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of another electronic device 100 provided in an embodiment of the present application.
  • FIG. 14 is a schematic structural diagram of an in-vehicle device 900 provided in an embodiment of the present application.
  • FIGS. 15A-15E are another set of schematic diagrams of interfaces provided in an embodiment of the present application.
  • FIG. 16 is a schematic flowchart of another detection method provided in an embodiment of the present application.
  • FIG. 17A is a schematic diagram of an in-car image before boarding provided in an embodiment of the present application.
  • FIG. 17B is a schematic diagram of an in-car image after alighting provided in an embodiment of the present application.
  • FIGS. 18A-18E are another set of schematic diagrams of interfaces provided in an embodiment of the present application.
  • FIG. 19 is a schematic flowchart of another detection method provided in an embodiment of the present application.
  • FIG. 20 is a schematic flowchart of another detection method provided in an embodiment of the present application.
  • FIG. 21 is a schematic flowchart of another detection method provided in an embodiment of the present application.
  • FIG. 22 is a schematic flowchart of another detection method provided in an embodiment of the present application.
  • FIG. 23 is a schematic flowchart of another detection method provided in an embodiment of the present application.
  • the terms “first” and “second” are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of indicated technical features. Therefore, a feature defined as “first” or “second” may explicitly or implicitly include one or more of these features. In the description of the embodiments of the present application, unless otherwise specified, “multiple” means two or more.
  • the electronic device provided by the embodiment of the present application is introduced below.
  • the electronic device 100 may be a cell phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, and/or a smart home device.
  • the embodiment of the present application does not specifically limit the specific type of the electronic equipment.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural network processor (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. This avoids repeated access and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • processor 110 may include one or more interfaces.
  • the interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the interface connection relationship between the modules shown in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 is charging the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves and radiate them through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing. After the low-frequency baseband signal is processed by the baseband processor, it is passed to the application processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device. In some other embodiments, the modem processor may be independent of the processor 110, and be disposed in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • light is transmitted through the lens to the photosensitive element of the camera, where the optical signal is converted into an electrical signal; the photosensitive element transmits the electrical signal to the ISP for processing, and the ISP converts it into an image visible to the naked eye.
  • the ISP can also perform algorithm optimization on the noise and brightness of the image.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP for conversion into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
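  • As one concrete instance of the RGB/YUV conversion mentioned above, the sketch below applies the full-range BT.601 matrix. The patent does not specify which color standard the DSP uses, so the coefficients here are an assumption for illustration.

```python
def rgb_to_yuv(r, g, b):
    """Full-range BT.601 RGB -> YUV conversion. The standard is an
    assumption for illustration; the description only says the DSP
    outputs standard RGB or YUV image signals."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.14713 * r - 0.28886 * g + 0.436 * b
    v = 0.615 * r - 0.51499 * g - 0.10001 * b
    return y, u, v

# pure white maps to maximum luma and (near) zero chroma
y, u, v = rgb_to_yuv(255, 255, 255)
```

In a real ISP/DSP pipeline this matrix would be applied per pixel in fixed-point hardware; the floating-point form above is only meant to show the relationship between the two formats.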
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
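  • The Fourier transform on the energy of a frequency point can be sketched as a single-bin discrete Fourier transform; the function name and interface below are illustrative, not taken from the patent.

```python
import math

def energy_at_frequency(signal, sample_rate_hz, target_hz):
    """Energy of `signal` at a single frequency point via a one-bin DFT --
    the kind of Fourier computation described for the DSP. Function name
    and interface are illustrative, not from the patent."""
    re = sum(x * math.cos(2 * math.pi * target_hz * i / sample_rate_hz)
             for i, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * target_hz * i / sample_rate_hz)
             for i, x in enumerate(signal))
    return (re * re + im * im) / len(signal)

# a pure 50 Hz tone concentrates its energy at the 50 Hz point
tone = [math.sin(2 * math.pi * 50 * i / 1000) for i in range(1000)]
e50 = energy_at_frequency(tone, 1000, 50)
e120 = energy_at_frequency(tone, 1000, 120)
```

Evaluating only the frequency point of interest (rather than a full FFT) matches the "selects a frequency point" wording above and is the same idea behind the Goertzel algorithm used in embedded tone detection.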
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external non-volatile memory, so as to expand the storage capacity of the electronic device 100 .
  • the external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external non-volatile memory.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include a program storage area and a data storage area.
  • the program storage area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.), and the like.
  • the data storage area can store data created during the use of the electronic device 100 (such as audio data, a phonebook, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • Speaker 170A also referred to as a "horn" is used to convert audio electrical signals into sound signals.
  • Receiver 170B also called “earpiece” is used to convert audio electrical signals into sound signals.
  • the microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
  • the pressure sensor 180A is used to sense a pressure signal, and can convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor, and the magnetic sensor 180D can be used to detect the opening and closing of the flip leather case.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes).
  • the distance sensor 180F is used to measure the distance.
  • the proximity light sensor 180G can also be used in leather-case mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the temperature sensor 180J is used to detect temperature.
  • the touch sensor 180K is also known as a "touch panel".
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the position of the display screen 194 .
  • the bone conduction sensor 180M can acquire vibration signals.
  • Keys 190 include a power key, a volume key and the like.
  • the motor 191 can generate a vibrating reminder.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the electronic device 100 can acquire alcohol intake parameters, physiological information parameters, blood alcohol concentration parameters and acquisition time parameters.
  • the alcohol intake parameters include an alcohol degree parameter and an alcohol volume parameter.
  • Physiological information parameters are physical data that affect the user's alcohol absorption rate and alcohol metabolism rate, for example, data such as sleep time, sleep quality, and weight.
  • the blood alcohol concentration parameter may be used to indicate the user's blood alcohol concentration.
  • the acquisition time parameter is used to indicate the time point when the electronic device 100 acquires the blood alcohol concentration parameter.
  • the electronic device 100 can input the alcohol intake parameters and the physiological information parameters into the alcohol prediction model to obtain the predicted metabolic rate and the predicted absorption rate.
  • the predicted metabolic rate and predicted absorption rate are parameters that affect the user's blood alcohol concentration.
  • the electronic device 100 may obtain the user's blood alcohol concentration-time (C-T) curve based on the physiological information parameters, the alcohol intake parameters, the predicted metabolic rate and the predicted absorption rate.
  • the electronic device 100 can also obtain the predicted sobering time based on the blood alcohol concentration parameter, the acquisition time parameter and the C-T curve. In this way, the electronic device 100 can obtain the user's sobering time from the detected parameters and prompt the user when he or she will sober up, so as to prevent the user from driving or engaging in other activities while drunk and to protect the life and property of the user and others.
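  • The C-T curve and the predicted sobering time can be sketched with a simple pharmacokinetic model: first-order absorption toward a Widmark-style peak and zero-order (linear) elimination. This is a stand-in for the patent's trained alcohol prediction model; the rates, the distribution factor r, and all function names are illustrative assumptions.

```python
import math

SOBER_MG_PER_100ML = 20.0  # sober threshold used elsewhere in the description

def bac_curve(dose_g, weight_kg, absorb_rate_per_h, metab_mg_per_h, t_h, r=0.68):
    """Blood alcohol concentration (mg/100 ml) at t_h hours after drinking
    starts: first-order absorption toward a Widmark-style peak, with
    zero-order (linear) elimination. Illustrative only."""
    peak = dose_g / (r * weight_kg) * 100.0  # mg/100 ml if fully absorbed
    absorbed = peak * (1.0 - math.exp(-absorb_rate_per_h * t_h))
    return max(absorbed - metab_mg_per_h * t_h, 0.0)

def predicted_sober_time(dose_g, weight_kg, absorb_rate_per_h, metab_mg_per_h,
                         step_h=0.05, horizon_h=48.0):
    """Scan the C-T curve forward and return the first time (hours) at which
    the concentration, having once reached the threshold, falls below it."""
    was_drunk = False
    t = 0.0
    while t <= horizon_h:
        c = bac_curve(dose_g, weight_kg, absorb_rate_per_h, metab_mg_per_h, t)
        if c >= SOBER_MG_PER_100ML:
            was_drunk = True
        elif was_drunk:
            return t
        t += step_h
    return None

# e.g. 40 g of ethanol for a 70 kg user, with assumed example rates
sober_after = predicted_sober_time(40.0, 70.0, 3.0, 15.0)
```

With these example figures the scan returns roughly 4.3 hours; in the patent's flow the absorption and metabolic rates would come from the alcohol prediction model rather than being fixed constants.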
  • the electronic device 100 may receive the drinking start time input by the user, and obtain the predicted sobering time based on the physiological information parameter, the alcohol intake parameter and the drinking start time. For example, the electronic device 100 may obtain the correspondence between blood alcohol concentration and time based on the physiological information parameter and the alcohol intake parameter through an alcohol prediction model. Then, according to the corresponding relationship between blood alcohol concentration and time, and the time of starting drinking, the predicted sobering time is obtained. For another example, the electronic device 100 may use physiological information parameters, alcohol intake parameters, and drinking start time as inputs to an alcohol prediction model to obtain a predicted sobering time. In this way, without the alcohol sensor, the electronic device 100 can also obtain the predicted sobering time.
  • the electronic device 100 may also obtain the volume of alcohol that the user can ingest based on the expected sobering time input by the user. Specifically, after receiving the expected sobering time input by the user, the electronic device 100 can obtain the alcohol degree parameter and the physiological information parameters, and input them into the alcohol prediction model to obtain the predicted metabolic rate and the predicted absorption rate. The electronic device 100 then obtains the ingestible alcohol volume based on the predicted metabolic rate, the predicted absorption rate, the expected sobering time, and the alcohol degree parameter. In this way, when the user drinks no more than the ingestible volume, the electronic device 100 can prompt the user that he or she will sober up by the expected sobering time, without affecting the user's subsequent itinerary.
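  • Working backwards from an expected sobering time to an ingestible volume can be sketched by inverting a Widmark-style model, assuming zero-order elimination and full absorption. This is an illustrative inversion, not the patent's trained predictor; every constant below is an assumption.

```python
def ingestible_volume_ml(hours_until_sober, weight_kg, abv_percent,
                         metab_mg_per_h=15.0, r=0.68,
                         sober_mg_per_100ml=20.0,
                         ethanol_density_g_per_ml=0.789):
    """Rough upper bound on the drink volume (ml) the user can ingest and
    still be at or below the sober threshold after `hours_until_sober`
    hours. Zero-order elimination and full absorption are assumed; all
    constants and names are illustrative only."""
    # highest tolerable peak concentration (mg/100 ml)
    max_peak = sober_mg_per_100ml + metab_mg_per_h * hours_until_sober
    # corresponding ethanol dose in grams, via the Widmark relation
    dose_g = max_peak / 100.0 * r * weight_kg
    ethanol_ml = dose_g / ethanol_density_g_per_ml
    return ethanol_ml / (abv_percent / 100.0)

# e.g. a 70 kg user who wants to be sober in 6 hours, drinking 12% wine
volume = ingestible_volume_ml(6.0, 70.0, 12.0)
```

The alcohol degree parameter enters only at the last step, converting a pure-ethanol budget into a drink volume, which mirrors how the description combines the degree parameter with the predicted rates.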
  • the electronic device 100 may predict the expected sobering time based on one or more of the above alcohol intake parameters, physiological information parameters, blood alcohol concentration parameters and acquisition time parameters. For example, the electronic device 100 may determine the predicted sobering time based on the user's physiological information parameters, the blood alcohol concentration parameter and the acquisition time parameter at which the blood alcohol concentration parameter was collected. The electronic device 100 may also obtain the user's ingestible alcohol volume based on one or more of the alcohol degree parameter and the physiological information parameters, as well as the expected sobering time. In this way, the electronic device 100 can still obtain the expected sobering time or the ingestible alcohol volume when only one or more of the above parameters are acquired.
  • the communication system 10 may include an electronic device 100 and an electronic device 200 .
  • the electronic device 100 may establish a wireless connection with the electronic device 200 through a wireless communication method (for example, wireless fidelity (wireless fidelity, Wi-Fi), Bluetooth, etc.).
  • the electronic device 100 can receive the data transmitted by the electronic device 200, or the electronic device 100 can send an operation instruction input by the user to the electronic device 200, and the electronic device 200 can perform the operation indicated by the operation instruction after receiving it.
  • the electronic device 100 can be used to store the data required for training the alcohol prediction model (for example, physiological information parameters, alcohol volume parameters, alcohol degree parameters, blood alcohol concentration parameters, acquisition time parameters, the predicted metabolic rate, the predicted absorption rate, the C-T curve, the predicted sobering time, etc.).
  • the electronic device 100 may perform alcohol prediction model training based on these data, and obtain an alcohol prediction model with an accuracy greater than a first threshold (for example, 90%).
  • the electronic device 100 may also obtain a prediction result based on the alcohol prediction model and data related to the user's current drinking (i.e., the physiological information parameters and the alcohol intake parameters). The prediction result may include the predicted metabolic rate, the predicted absorption rate, and the blood alcohol concentration-time (C-T) curve. The electronic device 100 may also obtain a correction result based on the prediction result, the blood alcohol concentration parameter, and the acquisition time parameter. The electronic device 100 may acquire the blood alcohol concentration parameter and the acquisition time parameter through the electronic device 200. The blood alcohol concentration parameter may be used to indicate the concentration of ethanol in the user's blood, in units of mg/100 ml. The acquisition time parameter may be used to indicate the time point at which the electronic device 200 collects the blood alcohol concentration parameter.
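  • One minimal way to turn a measured blood alcohol sample into a correction of the prediction is to refit the elimination rate so the post-peak curve passes through the measurement. The patent does not say how its correction works, so the linear post-peak model and all names below are assumptions.

```python
def corrected_metabolic_rate(peak_mg_per_100ml, predicted_metab_mg_per_h,
                             sample_time_h, measured_mg_per_100ml):
    """Replace the predicted elimination rate with one that makes the
    post-peak line BAC(t) = peak - k*t pass through a measured
    (time, concentration) sample. Assumes the sample was collected after
    absorption finished; the model and names are illustrative only."""
    if sample_time_h <= 0:
        return predicted_metab_mg_per_h  # no usable sample: keep prediction
    k = (peak_mg_per_100ml - measured_mg_per_100ml) / sample_time_h
    return k if k > 0 else predicted_metab_mg_per_h

# predicted 15 mg/100 ml per hour, but a sensor reading taken 2 h after
# the modelled peak of 84 shows 44 mg/100 ml -> corrected rate of 20
corrected = corrected_metabolic_rate(84.0, 15.0, 2.0, 44.0)
```

A corrected rate feeds straight back into the C-T curve, which is how a single alcohol-sensor reading from the electronic device 200 can tighten the predicted sobering time.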
  • the electronic device 200 may be any electronic device including an alcohol sensor.
  • the electronic device 200 may be a wearable electronic device (such as smart glasses, a smart watch, a Bluetooth headset, etc.), or an electronic device integrated with an alcohol sensor (for example, the electronic device 100 carrying an alcohol sensor, a seat belt carrying an alcohol sensor, etc.).
  • the alcohol sensor can be used to detect the gas exhaled by the user to obtain the blood alcohol concentration of the user.
  • the electronic device 200 may send the blood alcohol concentration to the electronic device 100 after detecting the blood alcohol concentration of the user.
  • a blood alcohol concentration lower than 20 mg/100 ml may be taken as the standard for the user having sobered up. It can be understood that this standard is just an example, and the sobering-up standard can also be any blood alcohol concentration threshold less than or equal to 20 mg/100 ml, which is not limited in this application. It can also be understood that, in some possible application scenarios, the electronic device 100 includes an alcohol sensor, and the electronic device 100 can directly obtain the user's blood alcohol concentration.
  • the electronic device 200 also includes an actigraph.
  • the electronic device 200 may detect the user's short-term memory physiological data through the actigraph, for example, the user's sleep quality, the user's sleep duration, and the like.
  • the electronic device 200 also includes an acceleration sensor, which can be used to detect the user's short-term memory physiological data, for example, the user's exercise situation and so on.
  • the electronic device 200 may send the user's short-term memory physiological data to the electronic device 100.
  • the electronic device 100 may be used to obtain a predicted sobering time based on parameters of alcohol intake, physiological information parameters, blood alcohol concentration parameters and acquisition time parameters.
  • the electronic device 100 can also be used to obtain the ingestible alcohol volume based on the alcohol degree parameter, the physiological information parameter, and the expected sobering time.
  • the communication system 10 may further include a server 300, which is not shown in the figure.
  • the server 300 may be a cloud server.
  • a communication connection is established between the server 300 and the electronic device 100, and the server 300 can be used to store the above parameters (eg, physiological information parameters).
  • the server 300 can perform model training based on the above parameters to obtain a preset alcohol model with an accuracy rate greater than the first threshold (for example, 90%), and obtain a prediction result based on the preset alcohol model and the relevant parameters of the user's current drinking (for example, the ingestible alcohol volume, the predicted sobering time, etc.).
  • the server 300 may store multiple users' alcohol intake parameters, physiological information parameters, blood alcohol concentration parameters, acquisition time parameters, predicted absorption rates, predicted metabolic rates, C-T curves, predicted sobering times, etc.
  • the server 300 can train the alcohol prediction model based on these data.
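  • The form of the alcohol prediction model is not specified in the description, so as a minimal stand-in, training can be sketched as an ordinary least-squares fit of a per-user elimination line over historical post-peak samples; all names are illustrative.

```python
def fit_metabolic_rate(samples):
    """Ordinary least-squares fit of the post-peak line BAC(t) = c0 - k*t
    over historical (time_h, bac_mg_per_100ml) samples; returns (c0, k).
    A minimal stand-in for the alcohol prediction model training, whose
    actual form the description does not specify."""
    n = len(samples)
    st = sum(t for t, _ in samples)
    sc = sum(c for _, c in samples)
    stt = sum(t * t for t, _ in samples)
    stc = sum(t * c for t, c in samples)
    slope = (n * stc - st * sc) / (n * stt - st * st)
    intercept = (sc - slope * st) / n
    return intercept, -slope  # elimination rate k is the negated slope

# three historical post-peak readings lying on BAC(t) = 80 - 15*t
c0, k = fit_metabolic_rate([(1.0, 65.0), (2.0, 50.0), (3.0, 35.0)])
```

A server-side version of this would pool many users' samples and validate against held-out readings until the stated accuracy threshold (for example, 90%) is reached.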
  • the electronic device 100 may send the predicted sobering time to the electronic device 200, and the electronic device 200 may display the predicted sobering time. Further, the steps performed by the electronic device 100 may instead be performed by the electronic device 200, which is not limited in this application.
  • the module schematic diagram provided by the embodiment of the present application includes but is not limited to a perception module 310 , a storage module 320 , a training module 330 , a prediction module 340 , a correction module 360 and a display module 350 .
  • the operations performed by each module can be divided into a model training process and a model prediction process.
  • the model training process is shown by the dotted arrows in FIG. 3: the historical parameters are used to train the alcohol prediction model, and an alcohol prediction model with an accuracy rate reaching the first threshold is obtained.
  • the model prediction process is shown by the solid arrow in FIG. 3 , and the electronic device 100 can obtain a predicted sobering time or an ingestible alcohol volume based on the trained alcohol prediction model.
  • the perception module 310 may be used to obtain parameters required for model training/model prediction.
  • the sensing module 310 may obtain parameters through the camera and related sensors of the electronic device 100; it may obtain parameters through another electronic device (such as the electronic device 200) that has established a communication connection with the electronic device 100; or it may obtain relevant parameters from user input.
  • the sensing module 310 can be used to acquire parameters of alcohol intake.
  • the perception module 310 may obtain the degree of the alcohol consumed by the user (i.e., the alcohol degree parameter) and the volume of the drink (i.e., the alcohol volume parameter) through the camera.
  • the sensing module 310 may obtain an image of the drink's container through the camera, and may obtain the alcohol intake parameters from the container image through an image recognition algorithm.
  • the perception module 310 may also obtain the alcohol intake parameters input by the user.
  • the perception module 310 can also be used to acquire physiological information parameters (including long-term memory physiological parameters and short-term memory physiological parameters). For example, the perception module 310 may obtain the user's short-term memory physiological parameters (for example, sleep quality, sleep time, etc.) through an actigraph. For another example, the perception module 310 may detect the user's short-term memory physiological parameters (e.g., exercise conditions, etc.) through an acceleration sensor, an inertial measurement unit, and the like. For another example, the perception module 310 may also acquire some physiological information parameters input by the user; these include some long-term memory parameters (for example, gender) and some short-term memory parameters (for example, weight, height, age), etc.
  • the sensing module 310 can also be used to acquire the user's blood alcohol concentration parameter and the time for acquiring the blood alcohol concentration parameter (also referred to as acquisition time parameter).
  • the perception module 310 may obtain the user's blood alcohol concentration parameters through an alcohol sensor.
  • the perception module 310 can send the blood alcohol concentration parameter and the acquisition time parameter to the correction module 360 for correction of the prediction result.
  • the perception module 310 may obtain some short-term memory physiological parameters (eg, body weight, body mass index, etc.) of the user through a body fat scale connected to the electronic device 100 .
  • the above data collected by the sensor may also be manually input by the user.
  • the perception module 310 may also send all acquired parameters to the storage module 320 for model training/model prediction.
  • the perception module 310 can directly send the obtained parameters to the prediction module 340, and the prediction module 340 can predict the time of sobering up or the volume of ingestible alcohol based on the parameters sent by the perception module 310.
  • the storage module 320 may be used to store parameters used for model training/model prediction.
  • the storage module 320 may be configured to receive the parameters obtained by the perception module 310 and store them in a memory (for example, the internal memory 121 ).
  • the storage module 320 may also receive and store the prediction result sent by the prediction module 340 .
  • the storage module 320 may also receive and store the corrected prediction result (also referred to as the corrected result) sent by the correcting module 360 .
  • the storage module 320 may send all stored parameters (which may be referred to as historical parameters) to the training module 330 .
  • Historical parameters may include, but are not limited to, the stored physiological information parameters, drink parameters, blood alcohol concentration parameters, acquisition time parameters, prediction results, correction results, and so on.
  • the storage module 320 can send the parameters obtained from the perception module 310 that are used for predicting the user's sobering time (for example, the most recently acquired physiological information parameters, alcohol degree parameters, alcohol volume parameters, etc.) to the prediction module 340.
  • the training module 330 can use neural network algorithms (for example, convolutional neural network algorithms, recurrent neural network algorithms, etc.) to train the model, taking some of the historical parameters sent by the storage module 320 (for example, physiological information parameters, drink parameters, prediction results) as the input values of the model and another part of the historical parameters (for example, blood alcohol concentration parameters, collection time parameters, correction results, etc.) as the output values of the model, so as to obtain a trained alcohol prediction model. That is to say, the training module 330 can train an alcohol prediction model with an accuracy greater than a first threshold based on the user's historical parameters.
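The patent does not disclose concrete model details. As a purely hypothetical sketch of what the training step in module 330 could look like, the following fits a linear least-squares model (standing in for the neural network named above) that maps historical parameters to predicted metabolic/absorption rate targets; all data, feature names, and coefficient values are synthetic:

```python
import numpy as np

# Hypothetical training sketch: features are (weight kg, height cm, sleep h,
# drink volume ml, alcohol degree); targets are (v_m, k_a). A linear
# least-squares fit stands in for the neural network named in the text.
rng = np.random.default_rng(0)
X = rng.uniform([60, 160, 5, 100, 0.05], [95, 190, 9, 500, 0.55], size=(64, 5))
true_W = np.array([[1e-4, 5e-4], [5e-5, 1e-4], [-1e-3, 2e-3],
                   [1e-5, -1e-5], [2e-2, 3e-2]])
Y = X @ true_W  # synthetic "historical" (v_m, k_a) targets

W, *_ = np.linalg.lstsq(X, Y, rcond=None)  # the "training" step

def predict_rates(params):
    """Return (predicted metabolic rate, predicted absorption rate)."""
    vm, ka = np.asarray(params) @ W
    return vm, ka
```

Because the synthetic targets are exactly linear in the features, the fitted weights recover the generating coefficients; a real implementation would use the neural network and real historical data described above.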
  • the training module 330 can run in the processor 110 of the electronic device 100 .
  • the processor 110 may also be an artificial intelligence (AI) chip.
  • the initial alcohol prediction model may be an alcohol prediction model trained in advance through data of other similar users.
  • one or more of the physiological information parameters and alcohol intake parameters of the similar user and the current user are the same or similar.
  • For example, suppose the user is male, 178cm tall, weighs 83kg, ingests 340ml of a drink whose alcohol degree is 20% (that is, the volume of pure alcohol intake is 68ml), and slept 7 hours the previous day.
  • the similar user of this user may be a male user with a height of 175cm-185cm, a weight of 80kg-85kg, a volume of alcohol intake of 50ml-80ml, and a sleep time of 6 hours-8 hours the previous day.
  • the value ranges of the physiological information parameters and the alcohol intake parameters are only examples, and the ranges of these parameters may be larger or smaller, which is not limited in this application.
  • the electronic device 100 may determine similar users according to more or fewer parameters. For example, the electronic device 100 may also determine similar users based on sleep quality parameters. Specifically, the electronic device 100 may grade sleep quality as excellent, good, poor, or extremely poor based on the lengths of the user's deep sleep time, light sleep time, and rapid eye movement time; when two users' sleep quality falls into the same grade, the electronic device 100 may determine that they are similar users.
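The similar-user screening above can be sketched as a simple range check. The ranges below are the literal ones from the example (male, 175-185cm, 80-85kg, 50-80ml of pure alcohol, 6-8 hours of sleep); the dictionary keys are illustrative names, not identifiers from the patent:

```python
# Literal ranges from the example in the text; keys are illustrative.
RANGES = {"height_cm": (175, 185), "weight_kg": (80, 85),
          "alcohol_ml": (50, 80), "sleep_h": (6, 8)}

def is_similar(candidate, gender="male", ranges=RANGES):
    # A candidate counts as similar when the gender matches and every
    # numeric parameter falls inside its range.
    if candidate["gender"] != gender:
        return False
    return all(lo <= candidate[key] <= hi for key, (lo, hi) in ranges.items())

peer = {"gender": "male", "height_cm": 180, "weight_kg": 82,
        "alcohol_ml": 70, "sleep_h": 6.5}
```

`is_similar(peer)` returns `True` for this candidate; changing the gender or moving any parameter out of range makes it `False`.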
  • the training module 330 can send the trained alcohol prediction model to the prediction module 340 .
  • the alcohol prediction model can be used to obtain the user's alcohol metabolism rate (ie, predicted metabolism rate) and alcohol absorption rate (ie, predicted absorption rate).
  • the prediction module 340 can be used to calculate a prediction result. Specifically, the prediction module 340 may input the physiological information parameters and alcohol degree parameters sent by the storage module 320 into the alcohol prediction model to obtain the predicted metabolic rate and predicted absorption rate. The prediction module 340 can also obtain the blood alcohol concentration-time curve based on the predicted metabolic rate, the predicted absorption rate, the alcohol volume parameter, the alcohol degree parameter, and the user's weight parameter among the physiological information parameters. The prediction module 340 can run in the processor 110 of the electronic device 100.
  • the blood alcohol concentration-time curve can be obtained based on the predicted metabolic rate, predicted absorption rate, alcohol volume parameter, alcohol degree parameter, and user weight parameter.
  • the specific formula is as follows:
  • c is the blood alcohol concentration of the user, and t is the time corresponding to that blood alcohol concentration.
  • k_a is the predicted absorption rate, v_m is the predicted metabolic rate, and k_m is the Michaelis constant, which is a known fixed value.
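The body of Formula 1 itself does not survive in this text of the publication. A standard pharmacokinetic form consistent with the variable definitions above (first-order absorption with Michaelis-Menten elimination) would be the following; note this reconstruction is an assumption, not necessarily the patent's literal Formula 1:

```latex
\frac{\mathrm{d}c}{\mathrm{d}t} \;=\; k_a\, c_0\, e^{-k_a t} \;-\; \frac{v_m\, c}{k_m + c},
\qquad c(0) = 0 .
```

Integrating such an equation from t = 0 yields a blood alcohol concentration-time (C-T) curve that first rises and then falls, consistent with the curve described in the embodiments.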
  • c_0 is the maximum blood alcohol concentration, which can be obtained by the following formula:
  • B_a is the degree of alcohol intake, V_a is the volume of alcohol intake, m is the weight of the user, and r is a fixed coefficient, which can be 0.75.
  • in the above example, the maximum blood alcohol concentration of the user is about 87.3mg/100ml.
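Formula 2 is likewise not reproduced in this text. A Widmark-style expression consistent with the variable definitions, with an assumed ethanol density ρ ≈ 0.8 g/ml (not defined in this excerpt), reproduces the stated value from the example parameters given earlier (B_a = 20%, V_a = 340 ml, m = 83 kg, r = 0.75):

```latex
c_0 \;=\; \frac{B_a\, V_a\, \rho}{m\, r}
    \;=\; \frac{0.20 \times 340\,\mathrm{ml} \times 0.8\,\mathrm{g/ml}}{83\,\mathrm{kg} \times 0.75}
    \;\approx\; 0.874\ \mathrm{g/kg} \;\approx\; 87.4\ \mathrm{mg/100\,ml}
```

which agrees with the "about 87.3mg/100ml" stated above to within rounding, so this hedged reconstruction is at least numerically consistent with the example.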
  • the prediction module 340 can substitute the maximum blood alcohol concentration, predicted metabolic rate and predicted absorption rate into formula 1 to obtain a C-T curve.
  • the C-T curve obtained by the prediction module 340 may be as shown in FIG. 4 .
  • the time corresponding to the maximum blood alcohol concentration C0 is T0.
  • the time corresponding to the threshold blood alcohol concentration is T1, and T1 can be used to indicate the predicted sobering time.
  • here, the threshold blood alcohol concentration is taken as 20mg/100ml.
  • the threshold blood alcohol concentration can take other values lower than 20 mg/100ml, which is not limited in this application.
  • the prediction module 340 can also obtain the predicted sobering time based on the blood alcohol concentration-time curve, the blood alcohol concentration parameter and the acquisition time parameter. Specifically, the prediction module 340 can determine the position of the blood alcohol concentration indicated by the blood alcohol concentration parameter on the C-T curve based on the blood alcohol concentration parameter and the C-T curve, that is, can determine the corresponding time point of the blood alcohol concentration parameter on the C-T curve. Afterwards, the prediction module 340 may obtain the time difference between the time point and the time point corresponding to the threshold blood alcohol concentration based on the time point corresponding to the blood alcohol concentration parameter. The prediction module 340 may add a time difference to the time point indicated by the acquisition time parameter to obtain the predicted sobering time.
  • the prediction module 340 can obtain two time points corresponding to the blood alcohol concentration parameters in the C-T curve. That is, the prediction module 340 can obtain two predicted sobering times.
  • the prediction module 340 may send the two predicted sober times to the display module 350 .
  • the display module 350 may display the two predicted sobering times in the form of time ranges. For example, the display module 350 receives the predicted sobering time A and the predicted sobering time B, wherein the predicted sobering time A is earlier than the predicted sobering time B, and the display module 350 can display the predicted sobering time as the predicted sobering time A to the predicted sobering time B.
  • the display module 350 may only display the latest predicted sober up time, for example, the display module 350 may only display the predicted sober up time B.
  • the predicting module 340 can determine the unique position of the blood alcohol concentration parameter on the C-T curve based on the two most recently obtained groups of blood alcohol concentration parameters and collection time parameters, and thereby obtain the predicted sobering time.
  • for example, when the sensing module 310 detects that the user's blood alcohol concentration is C0, it is determined that the user can sober up after (T1-T0) hours, that is, 6.6 hours. If the time corresponding to T0 is 19:42 local time, the predicted sobering time is 02:18 the next day. For another example, when the sensing module 310 detects that the user's blood alcohol concentration is 78mg/100ml, since that blood alcohol concentration corresponds to two time points on the curve, the sensing module 310 may detect the user's blood alcohol concentration again after a preset time (for example, 7 minutes).
  • the prediction module 340 can determine that the current time point is after T0, and determine that the user can sober up after 5.1 hours. If the time corresponding to the current time point is 19:42 local time, the predicted time to sober up is 00:45 the next day.
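The lookup described in the preceding bullets can be illustrated numerically. The sketch below integrates an assumed first-order-absorption / Michaelis-Menten model (the published text does not reproduce Formula 1 itself), then locates a measured concentration on the descending branch of the C-T curve and returns the remaining hours until the threshold is reached; all rate constants are illustrative:

```python
import numpy as np

def ct_curve(c0, ka, vm, km, hours=24.0, dt=0.001):
    # Forward-Euler integration of the assumed dynamics: first-order
    # absorption toward c0, Michaelis-Menten elimination.
    t = np.arange(0.0, hours, dt)
    c = np.zeros_like(t)
    for i in range(1, len(t)):
        dc = ka * c0 * np.exp(-ka * t[i - 1]) - vm * c[i - 1] / (km + c[i - 1])
        c[i] = max(c[i - 1] + dc * dt, 0.0)
    return t, c

def sobering_hours(t, c, measured, threshold=20.0):
    # Locate `measured` (mg/100ml) on the descending branch (past the peak)
    # and return the remaining hours until c falls below `threshold`.
    peak = int(np.argmax(c))
    desc_t, desc_c = t[peak:], c[peak:]
    t_meas = desc_t[int(np.argmin(np.abs(desc_c - measured)))]
    t_sober = desc_t[int(np.argmax(desc_c <= threshold))]
    return t_sober - t_meas

t, c = ct_curve(c0=87.3, ka=1.2, vm=15.0, km=5.0)  # illustrative constants
```

Restricting the search to the descending branch is what makes the measured concentration map to a single time point, mirroring the two-measurement disambiguation described above.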
  • the prediction module 340 may obtain a blood alcohol concentration-time curve based on the blood alcohol concentration parameter and the acquisition time parameter.
  • the training module 330 can use a neural network algorithm (for example, a convolutional neural network algorithm, a recurrent neural network algorithm, etc.), taking the blood alcohol concentration parameters and acquisition time parameters stored in the storage module 320 as the input values of the model and the blood alcohol concentration-time curve as the output of the model, and train the model to obtain a trained alcohol prediction model.
  • the prediction module 340 then takes the blood alcohol concentration parameters and collection time parameters recently acquired by the sensing module 310 as the input of the alcohol prediction model to obtain a blood alcohol concentration-time curve.
  • the predicting module 340 may fit the blood alcohol concentration-time curve based on the blood alcohol concentration parameter and the collection time parameter.
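A minimal sketch of such a fit follows, with synthetic samples; a straight line stands in for whatever curve family the implementation would actually use, which is a reasonable stand-in on the descending branch where elimination is approximately zero-order:

```python
import numpy as np

# Synthetic (collection time, blood alcohol concentration) samples on the
# descending branch; values are illustrative, not from the patent.
samples_t = np.array([2.0, 2.5, 3.0, 3.5])      # hours after drinking
samples_c = np.array([78.0, 70.5, 63.0, 55.5])  # mg/100ml

# Fit c(t) = slope*t + intercept and extrapolate the threshold crossing.
slope, intercept = np.polyfit(samples_t, samples_c, 1)
t_sober = (20.0 - intercept) / slope            # time at which c reaches 20
```

With these samples the fit gives a slope of -15 mg/100ml per hour and a predicted threshold crossing at roughly 5.87 hours after drinking.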
  • the predicting module 340 may obtain a blood alcohol concentration-time curve based on one or more of physiological information parameters, alcohol intake parameters, blood alcohol concentration parameters, and collection time parameters.
  • the predicting module 340 may directly obtain the predicted sobering time based on one or more of physiological information parameters, alcohol intake parameters, blood alcohol concentration parameters and acquisition time parameters.
  • the training module 330 can use one or more of the physiological information parameters, alcohol intake parameters, blood alcohol concentration parameters and collection time parameters as the input of the alcohol prediction model, use the predicted sobering time as the output of the alcohol prediction model, and train to obtain the alcohol prediction model.
  • the prediction module 340 can obtain the predicted sobering time through the alcohol prediction model, based on the model input parameters most recently acquired by the perception module 310.
  • the prediction module 340 may send the predicted sober up time to the display module 350 after obtaining the predicted sober up time.
  • the display module 350 can display the predicted sobering time.
  • the prediction module 340 may also send the prediction results to the correction module 360 .
  • Prediction results may include, but are not limited to, the predicted metabolic rate, the predicted absorption rate, and the blood alcohol concentration-time curve.
  • the prediction module 340 can also send the prediction result to the storage module 320 . It can be understood that when the prediction module 340 is only used to predict the blood alcohol concentration-time curve, the prediction result obtained by the prediction module 340 only includes the blood alcohol concentration-time curve.
  • the correction module 360 can be used to adjust the prediction result based on the blood alcohol concentration parameter and the acquisition time parameter acquired by the sensing module 310. After the correction module 360 acquires a collection time parameter after the user drinks alcohol and the corresponding blood alcohol concentration parameter, it can adjust the prediction result based on that blood alcohol concentration parameter and its collection time parameter to obtain the adjusted predicted metabolic rate, the adjusted predicted absorption rate, and the adjusted C-T curve.
  • the correction module 360 may obtain the user's actual blood alcohol concentration-time curve based on the multiple sets of acquired blood alcohol concentration parameters and their corresponding acquisition time parameters.
  • the correction module 360 adjusts the prediction result based on the difference between the actual C-T curve and the C-T curve obtained by the prediction module 340, and obtains the adjusted predicted metabolic rate, the adjusted predicted absorption rate and the adjusted C-T curve.
  • the correction module 360 may obtain an error value between the blood alcohol concentration on the predicted blood alcohol concentration-time curve and the actual blood alcohol concentration based on multiple sets of blood alcohol concentration parameters and corresponding collection time parameters. The correction module 360 may add the error value to all blood alcohol concentration values on the blood alcohol concentration-time curve to obtain a corrected blood alcohol concentration-time curve.
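The offset correction described in this bullet can be sketched as follows, with a toy predicted curve and two synthetic measurements; the averaging of per-sample errors is one plausible reading of "an error value" and is labeled as such:

```python
import numpy as np

def correct_curve(t_curve, c_curve, sample_t, sample_c):
    # Average the error between measured concentrations and the predicted
    # curve at the same collection times, then shift the whole curve.
    predicted_at_samples = np.interp(sample_t, t_curve, c_curve)
    error = np.mean(np.asarray(sample_c) - predicted_at_samples)
    return c_curve + error

t_curve = np.linspace(0, 12, 121)
c_curve = np.maximum(90 - 15 * t_curve, 0.0)   # toy predicted C-T curve
corrected = correct_curve(t_curve, c_curve, [2.0, 3.0], [65.0, 50.0])
```

Here both measurements sit 5 mg/100ml above the toy prediction, so the whole curve is shifted up by 5, and the threshold crossing (and hence the predicted sobering time) moves correspondingly later.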
  • the correction module 360 can obtain the predicted sobering time based on the adjusted blood alcohol concentration-time curve.
  • the correction module 360 can send the correction result to the display module 350 .
  • Correction results may include, but are not limited to, the adjusted predicted metabolic rate, the adjusted predicted absorption rate, the predicted sobering time, and the adjusted blood alcohol concentration-time curve.
  • the correction module 360 can also send the correction result to the storage module 320 .
  • the correction module 360 may directly obtain the predicted sobering time based on the prediction result, the blood alcohol concentration parameter and the collection time parameter; in this case, the correction result only includes the predicted sobering time.
  • the electronic device 100 does not include the correction module 360 .
  • the prediction module 340 of the electronic device 100 can directly obtain the predicted sobering time based on the prediction result, the blood alcohol concentration parameter and the acquisition time parameter.
  • the display module 350 can be used to display the predicted time of sobering up.
  • the display module 350 can display the predicted sobering time on the display screen 194 of the electronic device 100 .
  • the display module 350 can also display the predicted metabolic rate and predicted absorption rate, as well as the user's historical metabolic rate and historical absorption rate.
  • the display module 350 can also display prompt information, which is used to remind the user of the difference between the current metabolic rate and the historical metabolic rate, and the difference between the current absorption rate and the historical absorption rate.
  • the perception module 310 can be used to acquire the expected sobering time, physiological information parameters and alcohol degree parameters input by the user.
  • the description of the acquisition of the physiological information parameters and alcohol degree parameters by the sensing module 310 can refer to the above description of the embodiment of the sensing module 310 , which will not be repeated here.
  • the perception module 310 may send the obtained expected sobering time, physiological information parameters and alcohol degree parameters to the prediction module 340 .
  • the prediction module 340 can use the parameters of the degree of alcohol and the parameters of physiological information as the input of the alcohol prediction model to obtain the predicted metabolic rate and predicted absorption rate.
  • the prediction module 340 can obtain the maximum blood alcohol concentration based on the predicted metabolic rate, the predicted absorption rate, the expected sobering time, the threshold blood alcohol concentration and Formula 1. It can be understood that once the prediction module 340 obtains the maximum blood alcohol concentration, it can obtain the C-T curve based on the maximum blood alcohol concentration, the predicted metabolic rate, the predicted absorption rate, the expected sobering time, and the threshold blood alcohol concentration. The prediction module 340 then obtains the ingestible alcohol volume based on the maximum blood alcohol concentration, the user's body weight among the physiological information parameters, and Formula 2. The prediction module 340 can send the ingestible alcohol volume to the display module 350, and the display module 350 can be used to display the ingestible alcohol volume.
  • the predicting module 340 may directly obtain the ingestible alcohol volume based on one or more of the physiological information parameters, alcohol degree parameters, and expected sobering time.
  • the training module 330 can use one or more of the physiological information parameters and alcohol degree parameters as the input of the ingestible-alcohol-volume prediction model, use the ingestible alcohol volume as the output of that model, and train to obtain a prediction model for the ingestible alcohol volume.
  • the prediction module 340 can obtain the ingestible alcohol volume through the ingestible-alcohol-volume prediction model, based on the required input parameters most recently acquired by the perception module 310.
  • the electronic device 100 may display a desktop 501 .
  • the desktop 501 may include multiple application icons, for example, an alcohol detection application icon 502 and so on.
  • the alcohol detection application icon 502 may be used to trigger the display of an alcohol detection application interface (for example, the alcohol detection interface 510 shown in FIG. 5B ).
  • Alcohol detection applications can be used to predict the time of sobering up or the volume of drinkable alcohol.
  • a status bar may also be displayed on the top of the desktop 501, and a Bluetooth icon may be displayed in the status bar. The Bluetooth icon is used to indicate that the electronic device 100 establishes a communication connection with the electronic device 200 .
  • the electronic device 100 receives the user's input (for example, click) on the alcohol detection application icon 502 , and in response to the input, the electronic device 100 may display an alcohol detection interface 510 as shown in FIG. 5B .
  • the alcohol detection interface 510 may include a user parameter column 511, and the user parameter column 511 includes information such as the user's gender, height, weight, and sleep duration. Wherein, the sleep duration parameter in the user parameter column 511 may be acquired by the electronic device 100 from the electronic device 200. Parameters such as gender, height, and weight may be pre-stored in the electronic device 100 or input by the user. The electronic device 100 may receive user input and modify the parameters in the user parameter column 511. The alcohol detection interface 510 may also include a time prediction control 512 and a volume prediction control 513. Wherein, the time prediction control 512 can be used to predict the time for the user to sober up, and the volume prediction control 513 can be used to predict the volume of alcohol that the user can ingest.
  • the electronic device 100 may, upon receiving the user's input on the time prediction control 512, display a detection prompt interface 530 as shown in FIG. 5C in response to the input.
  • the detection prompt interface 530 includes a prompt box 531 .
  • the prompt box 531 displays prompt information, which can be used to remind the user to breathe on the alcohol sensor. In this way, the electronic device 100 can obtain the user's blood alcohol concentration through the alcohol sensor.
  • the prompt information may include but not limited to text prompt information, animation prompt information, picture prompt information, voice prompt information and the like.
  • the prompt information may include picture-type prompt information as shown in FIG. 5C , and the picture-type prompt information is used to prompt the location of the alcohol sensor.
  • the prompt information may also include text prompt information as shown in FIG. 5C : "Blood alcohol concentration is being detected, please breathe towards the alcohol sensor pointed by the arrow".
  • the prompt information 531 shown in FIG. 5C will be displayed only when the electronic device 100 establishes a communication connection with the electronic device 200 including the alcohol sensor.
  • the electronic device 100 carries an alcohol sensor, and the electronic device 100 may prompt the user to breathe on the alcohol sensor of the electronic device 100 to obtain the blood alcohol concentration of the user.
  • the electronic device 100 does not establish a communication connection with the electronic device including the alcohol sensor, and the electronic device 100 may prompt the user to detect the blood alcohol concentration by himself and input the blood alcohol concentration into the electronic device 100 .
  • after the electronic device 200 detects the user's blood alcohol concentration, it may send the blood alcohol concentration and the collection time at which it was obtained to the electronic device 100. After receiving the blood alcohol concentration and collection time from the electronic device 200, the electronic device 100 may display a time prediction interface 540 as shown in FIG. 5D.
  • a wine parameter column 541 may be displayed in the time prediction interface 540 .
  • the drink parameter column 541 can be used to display the volume and degree of the drink that the user drinks.
  • the drink parameter column 541 may include a drink parameter entry 542, and the drink parameter entry 542 includes a photo identification icon 542A.
  • the photo identification icon 542A can be used to trigger the electronic device 100 to activate the camera and identify the images captured by the camera to obtain the volume and degree of the drink consumed by the user. It should be noted that the electronic device 100 may also receive the user's input and display the drink volume and drink degree input by the user in the drink parameter entry 542.
  • the drink parameter column 541 may also include an add button, which can be used to trigger the electronic device 100 to display another drink parameter item above or below the drink parameter item 542 . In this way, the electronic device 100 can collect parameters of various types of drinks.
  • the time prediction interface 540 may also include a current detection record column 544 , a re-enter button 545 and a start prediction button 546 .
  • the detection record column 544 can be used to display the user's blood alcohol concentration.
  • the blood alcohol concentration may be sent from the electronic device 200 to the electronic device 100, or manually input by the user.
  • the detection record column 544 may display one or more detection records, and the one or more detection records include a detection record 544A, and the detection record 544A includes the blood alcohol concentration and the collection time of the blood alcohol concentration.
  • the electronic device 100 may receive a user's input and change the value in the detection record.
  • the re-input button 545 can be used to trigger the electronic device 100 to notify the electronic device 200 to detect the user's blood alcohol concentration again.
  • optionally, the re-input button 545 can be used to add a new detection record in the detection record column 544, and the user can input the blood alcohol concentration and the corresponding collection time in the newly added record.
  • the start prediction button 546 may be used to trigger the electronic device 100 to obtain a predicted sobering time based on the parameters obtained above.
  • the electronic device 100 may receive a user's input on the photo identification icon 542A shown in FIG. 5D , and in response to the input, display a photo identification interface 550 as shown in FIG. 5E .
  • the photographing recognition interface 550 displays images captured by the camera of the electronic device 100 .
  • the photo identification interface 550 may also include information on the identified alcohol degree. For example, FIG. 5E shows the alcohol degree of 20% displayed in text next to the wine bottle. It should be noted that, when the alcohol degree is not marked on the packaging of the wine bottle, the electronic device 100 can identify the packaging information (for example, brand, name, etc.) of the drink to obtain its alcohol degree.
  • the photo identification interface 550 may also include volume information of the identified wine container. For example, the electronic device 100 recognizes that the volume of the wine bottle is 220ml.
  • the electronic device 100 may also display the number of containers near the volume information, and the electronic device 100 may receive user input to modify the number of containers.
  • similarly, when the volume is not marked on the packaging, the electronic device 100 can identify the packaging information (for example, brand, name, etc.) of the drink to obtain the bottle volume information.
  • the photo identification interface 550 may also include a re-identification button 551 and a confirmation button 552 .
  • the re-recognition button 551 can be used to trigger the electronic device 100 to re-recognize relevant information in the image currently displayed on the photo recognition interface 550 .
  • Confirm button 552 can be used to confirm the recognition result.
  • the electronic device 100 may receive the user's input on the confirmation button 552 shown in FIG. 5E , and in response to the input, display the time prediction interface 540 as shown in FIG. 5F .
  • the drink parameter item 542 of the time prediction interface 540 also displays the value of the volume of drink and the degree of drink.
  • the electronic device 100 may also receive the user's input on the re-input button 545 shown in FIG. 5F , and in response to the input, the electronic device 100 may notify the electronic device 200 to collect the user's blood alcohol concentration again. It can be understood that the electronic device 100 may also display prompt information, and the function and content of the prompt information may refer to the above prompt information shown in FIG. 5C , which will not be repeated here.
  • the electronic device 100 After the electronic device 100 receives the user's blood alcohol concentration collected again by the electronic device 200 and the corresponding collection time, it may display the detection record 544B as shown in FIG. 5G .
  • the electronic device 100 may, after receiving the user's input on the start prediction button 546 shown in FIG. 5G , calculate the predicted sobering time in response to the input. Wherein, the electronic device 100 can obtain the C-T curve based on the physiological information parameters, the degree of alcohol intake and the stored alcohol prediction model. Afterwards, the electronic device 100 can obtain the predicted sobering time based on the C-T curve, the user's blood alcohol concentration, and the collection time. Specifically, reference may be made to the above-mentioned embodiment shown in FIG. 3 , which will not be repeated here. Here, the predicted sobering time obtained by the electronic device 100 is 00:45 of the next day. After the electronic device 100 obtains the predicted sobering time, it may display a prediction result interface 570 as shown in FIG. 5H.
  • predicted results interface 570 may include result information 572 .
  • the result information 572 includes predicted sober time information.
  • the result information 572 may be one or more of text-type information, picture-type information, voice-type information, and the like.
  • the result information 572 may be text information: "The blood alcohol concentration is expected to drop to 20mg/100ml after 5.1 hours, and the time for sobering up is 00:45 tomorrow morning".
  • the prediction result interface 570 may also include a blood alcohol concentration-time graph 571, which may be used to show the user's current blood alcohol concentration, current time information, and predicted sobering time. In this way, the electronic device 100 can more intuitively display the change of the user's blood alcohol concentration through the blood alcohol concentration-time graph 571 , reflecting the time for the user to sober up.
  • the prediction result interface 570 may also include the alcohol absorption rate, alcohol metabolism rate and their change curves within a user-preset time period (for example, within one month). In this way, the user can check the changes of his alcohol absorption rate and alcohol metabolism rate, adjust his life routine, drinking habits, and so on.
  • the electronic device 100 may also obtain a prediction result based on only one detection record. It can be understood that, since every blood alcohol concentration on the blood alcohol concentration-time curve except the maximum corresponds to two collection times, a prediction result based on a single detection record may carry a time error.
  • the electronic device 100 may obtain the predicted metabolic rate and predicted absorption rate through the alcohol prediction model, based on the physiological information parameters and the alcohol degree parameter.
  • the electronic device 100 may also obtain the ingestible alcohol volume based on the expected sobering time, the predicted metabolic rate and the predicted absorption rate.
  • the electronic device 100 may display the time prediction interface 601 as shown in FIG. 6A after receiving the user's input on the volume prediction control 513 shown in FIG. 5B , in response to the input.
  • the time prediction interface 601 may include a drink degree column 602, and the drink degree column 602 may be used to display the degree of the drink to be ingested.
  • the drink degree column 602 includes a photo identification icon 602A, which can be used to trigger the electronic device 100 to activate the camera, and identify the images captured by the camera to obtain the user's drink degree.
  • the detailed description of identifying the degree of wine can refer to the above embodiment described in FIG. 5E , which will not be repeated here.
  • the electronic device 100 may also receive the user's input and display the alcohol degree input by the user in the alcohol degree column 602.
  • for example, the alcohol degree column 602 displays an alcohol degree whose value is 20%.
  • the time prediction interface 601 may also include an expected time column 603, and the expected time column 603 may be used to display an expected sobering time.
  • the expected time column 603 may include a time wheel, and the time wheel may be used to receive user input that adjusts the numbers on the wheel to obtain the expected sobering time.
  • the expected time column 603 may also display a specific numerical value of the expected sobering time.
  • the expected time column is not limited to the above-mentioned expected time column 603 and can take other forms. For example, the expected time column can be an input box into which the user enters the expected sobering time. This embodiment of the present application does not limit it.
  • the expected time to sober up is displayed in the expected time column 603, which is "19:35 Beijing time".
  • the electronic device 100 may acquire the travel or work time of the user by querying the user's schedule or memo, and use this time as the expected sobering time.
  • the time prediction interface 601 can also include a start prediction button 604, which can be used to trigger the electronic device 100 to predict the volume of ingestible alcohol.
  • the electronic device 100 may obtain the maximum blood alcohol concentration based on the expected drinking time, the predicted metabolic rate, the predicted absorption rate and the threshold blood alcohol concentration after receiving the user's input on the start prediction button 604 . Then, based on the parameters of the degree of alcohol, the user's body weight, and the maximum blood alcohol concentration, the ingestible alcohol volume can be obtained through the above-mentioned formula 2 shown in FIG. 3 .
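  • since formula 2 itself is not reproduced in this excerpt, the relation between the maximum blood alcohol concentration, the user's weight, the alcohol degree and the ingestible volume can be sketched with a Widmark-style relation; the distribution factor `r`, the ethanol density and the function name below are illustrative assumptions, not values from the source:

```python
ETHANOL_DENSITY_G_PER_ML = 0.789  # approximate density of pure ethanol

def ingestible_volume_ml(max_bac_g_per_l, weight_kg, abv_fraction, r=0.68):
    """Drink volume (ml) expected to produce at most max_bac_g_per_l.

    abv_fraction: alcohol degree as a fraction, e.g. 0.20 for a 20% drink.
    r: assumed Widmark distribution factor (~0.68 for men, ~0.55 for women).
    """
    # Widmark-style relation: peak BAC ~= ethanol mass / (r * body weight)
    ethanol_mass_g = max_bac_g_per_l * r * weight_kg
    ethanol_volume_ml = ethanol_mass_g / ETHANOL_DENSITY_G_PER_ML
    return ethanol_volume_ml / abv_fraction  # scale up by drink strength
```

For a 70 kg user, a 0.5 g/L ceiling and a 20% drink, this sketch yields roughly 150 ml.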
  • the electronic device 100 may display a prediction result interface 610 as shown in FIG. 6B after obtaining the ingestible alcohol volume.
  • the prediction result interface 610 may include result information 611 .
  • the result information 611 includes information on the volume of alcohol that can be ingested.
  • the result information 611 may be one or more of text-type information, picture-type information, voice-type information, and the like.
  • the result information 611 may be text information: "It is expected to sober up after 3 hours; the volume of drinkable alcohol is about 82 ml".
  • the prediction result interface 610 may also include a blood alcohol concentration-time graph, and the blood alcohol concentration-time graph may be used to show a predicted blood alcohol concentration-time change curve.
  • the electronic device 100 can obtain the predicted sobering time based on the physiological information parameters, alcohol degree parameters, alcohol volume parameters, blood alcohol concentration parameters, and the collection time parameters at which the blood alcohol concentration parameters were collected.
  • the electronic device 100 may also obtain the ingestible alcohol volume based on the expected sobering time, physiological information parameters, and alcohol degree parameters. In this way, since the electronic device 100 uses the user's own physical parameters, the predicted sobering time and ingestible alcohol volume are more accurate. The user can predict the sobering time and then work or travel after sobering up, and can also use the ingestible alcohol volume to drink in moderation without affecting the itinerary.
  • the method includes:
  • the electronic device 100 acquires physiological information parameters, alcohol intake parameters, blood alcohol concentration parameters, and acquisition time parameters.
  • the physiological information parameters may include long-term memory parameters (for example, gender) and short-term memory parameters (for example, height, weight, sleep time).
  • the physiological information parameter can be obtained by user input.
  • the electronic device 100 may also establish a connection with an electronic device (for example, the electronic device 200 ) carrying an actigraph, and obtain the sleep time of the user through the actigraph of the electronic device.
  • the electronic device 100 may also establish a connection with a body fat scale, and obtain the user's weight through the body fat scale.
  • the alcohol intake parameter may include an alcohol volume parameter and an alcohol degree parameter.
  • the alcohol intake parameter can be obtained by user input.
  • the electronic device 100 can identify the degree and volume of the alcohol ingested by taking a photo.
  • the electronic device 100 can obtain the container image of the ingested alcohol through the camera, and obtain the ingested alcohol parameters through an image recognition algorithm based on the container image.
  • the specific steps for the electronic device 100 to acquire the parameters of the drink can refer to the above-mentioned embodiment shown in FIG. 5E , which will not be repeated here.
  • the electronic device 100 may also establish a connection with an electronic device carrying an alcohol sensor (for example, the electronic device 200), and acquire the blood alcohol concentration parameters and acquisition time parameters through the alcohol sensor of that device. Reference may be made to the above-mentioned embodiment shown in FIG. 5C, which will not be repeated here.
  • the electronic device 100 executes step S701 in response to the first input.
  • the first input may include but not limited to single click, double click and long press, etc.
  • the first input may be an input directed to the alcohol detection application icon 502 shown in FIG. 5A above.
  • the electronic device 100 may execute step S702 and step S703 in response to the second input.
  • the second input may include but not limited to single click, double click and long press, etc.
  • the second input may be an input to the start prediction control 546 shown above in FIG. 5G.
  • the electronic device 100 may obtain a blood alcohol concentration-time curve based on the physiological information parameter, the alcohol intake parameter and the alcohol prediction model.
  • the electronic device 100 may use the physiological information parameter and the alcohol degree parameter as input of the alcohol prediction model to obtain the predicted absorption parameter and the predicted metabolic parameter.
  • the electronic device 100 can also obtain the maximum blood alcohol concentration through formula 2 shown in FIG. 3. Based on the maximum blood alcohol concentration, the predicted absorption parameter and the predicted metabolic parameter, the electronic device 100 obtains the blood alcohol concentration-time curve through the above formula 1 shown in FIG. 3.
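  • formula 1 is referenced but not reproduced in this excerpt; the shape of such a blood alcohol concentration-time curve can be sketched with a common one-compartment assumption (first-order absorption toward the maximum concentration plus zero-order metabolism). All coefficients below are illustrative assumptions:

```python
import math

def bac_curve(c_max, k_absorb, metab_rate, t_end_h=12.0, step_h=0.25):
    """Sample an assumed BAC-time model every step_h hours.

    c_max: maximum blood alcohol concentration (g/L)
    k_absorb: first-order absorption constant (1/h)
    metab_rate: zero-order metabolism rate (g/L per hour)
    """
    curve = []
    t = 0.0
    while t <= t_end_h:
        bac = c_max * (1.0 - math.exp(-k_absorb * t)) - metab_rate * t
        curve.append((t, max(bac, 0.0)))  # concentration cannot go negative
        t += step_h
    return curve
```

The curve rises while absorption dominates, peaks, then falls linearly under metabolism until it returns to zero, matching the two-branch shape discussed above.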
  • the electronic device 100 may obtain a predicted sobering time based on the blood alcohol concentration-time curve, the blood alcohol concentration parameter, and the acquisition time parameter.
  • the electronic device 100 can determine the position of the blood alcohol concentration indicated by the blood alcohol concentration parameter on the C-T curve based on the blood alcohol concentration parameter and the C-T curve, that is, can determine the corresponding time point of the blood alcohol concentration parameter on the C-T curve. Afterwards, the prediction module 340 may obtain the time difference between the time point and the time point corresponding to the threshold blood alcohol concentration based on the time point corresponding to the blood alcohol concentration parameter. The prediction module 340 may add a time difference to the time point indicated by the acquisition time parameter to obtain the predicted sobering time.
  • the electronic device 100 can obtain the error value between the blood alcohol concentration on the predicted blood alcohol concentration-time curve and the actual blood alcohol concentration, based on multiple sets of blood alcohol concentration parameters and their corresponding acquisition time parameters.
  • the electronic device 100 may add the error value to all blood alcohol concentration values on the blood alcohol concentration-time curve to obtain a corrected blood alcohol concentration-time curve.
  • the electronic device 100 then obtains the predicted sobering time based on the corrected blood alcohol concentration-time curve. In this way, the electronic device 100 can obtain a more accurate prediction of the time of sobering up.
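  • the lookup and correction described above can be sketched as follows: locate the measured concentration on the descending branch of the curve, take the time difference to the threshold crossing, add it to the collection time, and optionally shift the whole curve by the mean measurement error first. The 0.2 g/L threshold, the curve representation and the function names are illustrative assumptions:

```python
def corrected_curve(curve, mean_error):
    """Shift each predicted BAC by the mean measured-vs-predicted error."""
    return [(t, max(bac + mean_error, 0.0)) for t, bac in curve]

def predicted_sober_time(curve, measured_bac, collect_time_h, threshold=0.2):
    """curve: [(t_hours, bac)] sorted by t.

    Returns the sobering time on the collection clock (hours).
    """
    # Restrict to the descending branch, where each BAC maps to one time
    peak_i = max(range(len(curve)), key=lambda i: curve[i][1])
    descent = curve[peak_i:]
    # Curve time whose BAC best matches the measurement
    t_meas = min(descent, key=lambda p: abs(p[1] - measured_bac))[0]
    # First curve time at which BAC is at or below the threshold
    t_sober = next(t for t, bac in descent if bac <= threshold)
    return collect_time_h + (t_sober - t_meas)
```

For example, on a curve that falls from its peak by 0.2 g/L per hour, a 0.6 g/L reading taken at 20:00 would predict sobering two hours later.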
  • the electronic device 100 may also obtain a corrected predicted metabolic rate and a corrected predicted absorption rate based on the corrected blood alcohol concentration-time curve.
  • the electronic device 100 can store the physiological information parameters, alcohol intake parameters, predicted absorption rate, predicted metabolic rate, blood alcohol concentration-time curve, corrected predicted absorption rate, corrected predicted metabolic rate, and corrected blood alcohol concentration-time curve, and train the alcohol prediction model based on the stored data. That is, the electronic device 100 may adjust the model parameters of the alcohol prediction model based on the error between the corrected predicted metabolic rate and the predicted metabolic rate, and the error between the corrected predicted absorption rate and the predicted absorption rate. The electronic device 100 may also calculate the accuracy of the alcohol prediction model with the adjusted model parameters, and store the alcohol prediction model after determining that its accuracy reaches a preset threshold.
  • the electronic device 100 may re-train the model at a preset interval (for example, 1 month), or may re-train the model each time after obtaining the predicted sobering time or the ingestible alcohol volume.
  • the electronic device 100 displays the predicted sobering time.
  • the electronic device 100 may display a predicted sobering time.
  • the electronic device 100 may also display predicted absorption rate and predicted metabolic rate. Specifically, reference may be made to the above-mentioned embodiments shown in FIG. 5A-FIG. 5H , which will not be repeated here.
  • when the electronic device 100 detects the user's driving operation, it may display prompt information, which may be used to remind the user that he or she is in a drunk state and should not drive.
  • the electronic device 100 may determine the driving time of the user by querying the user's schedule or memo.
  • after obtaining the predicted metabolic rate and the predicted absorption rate, the electronic device 100 can obtain the maximum blood alcohol concentration directly based on the expected sobering time, the predicted metabolic rate and the predicted absorption rate, and then obtain the ingestible alcohol volume based on the maximum blood alcohol concentration, the alcohol degree parameter and the user's weight parameter.
  • the expected sobriety time may be input by the user.
  • the electronic device 100 may acquire the travel or work time of the user by querying the user's schedule or memo, and use this time as the expected sobering time.
  • the electronic device 100 can also display the ingestible alcohol volume. For details, refer to the above-mentioned embodiments shown in FIGS. 6A-6B, which will not be repeated here.
  • the electronic device 100 may send the physiological information parameters and alcohol intake parameters to the server 300, and the server 300 performs the calculation of the predicted sobering time/ingestible alcohol volume, and the training of the alcohol prediction model.
  • the server 300 can also be used to store the above parameters. In this way, computing and storage resources of the electronic device 100 can be saved.
  • the electronic device 100 may predict and obtain the expected sobering time based on one or more of the above alcohol intake parameters, physiological information parameters, blood alcohol concentration parameters and collection time parameters.
  • the electronic device 100 may also obtain the drinkable volume of the user based on one or more of the drink degree parameter and the physiological information parameter, as well as the expected sobering time. In this way, the electronic device 100 can also obtain the expected sobering time or the volume of ingestible alcohol when the acquired parameters are one or more of the above parameters.
  • fatigue driving has become an important cause of traffic safety accidents.
  • Drivers drive vehicles on the road in a fatigued state, causing unnecessary casualties and economic losses.
  • detecting whether a driver is driving with fatigue has become an urgent problem to be solved.
  • the embodiment of the present application provides a detection method.
  • the electronic device 100 can acquire the user's behavior data when the user is driving.
  • the electronic device 100 can obtain the fatigue level of the user before driving based on the behavior data.
  • the electronic device 100 can also acquire the on-vehicle driving data and the physical condition data, and the electronic device 100 can obtain the user's fatigue level during driving based on the physical condition data and the on-vehicle driving data.
  • the electronic device 100 may obtain the user's current fatigue level (also referred to as the final fatigue level) based on the fatigue level before driving and the fatigue level during driving.
  • the electronic device 100 can also obtain and display driving advice based on the final fatigue level.
  • the driving suggestion may include but not limited to the recommended driving duration.
  • the recommended driving time is used to indicate the total time the user can drive before reaching a preset level of fatigue.
  • in this way, the electronic device 100 can combine the user's pre-driving and in-driving data to obtain the user's fatigue level and, based on that fatigue level, give corresponding driving suggestions, reducing the time the user drives while fatigued, lowering the probability of driving accidents, and alleviating the problem of fatigue driving.
  • the driving suggestion may include a recommended driving duration that the electronic device 100 obtains by combining the user's fatigue level before driving with the on-vehicle driving data; the recommended driving duration is the driving time remaining before the user reaches severe fatigue.
  • the driving suggestion may include recommended driving time and sobriety reminder information, and the sobriety reminder information may be used to remind the user to lower the temperature in the car or drink refreshing drinks, play refreshing music, etc.;
  • the driving suggestion may include parking prompt information, and the parking prompt information may be used to prompt the user to stop and rest as soon as possible.
  • the driving suggestion may also include a recommended driving duration, and in this case, the value of the recommended driving duration is zero.
  • the electronic device 100 may also plan the nearest parking location and display navigation information to the parking location.
  • when the electronic device 100 has not acquired the on-vehicle driving data, the electronic device 100 can obtain the fatigue level before driving based on the behavior data.
  • the electronic device 100 combines the stored historical data of the user's previous driving (for example, final fatigue level, driving duration, etc.) to obtain driving suggestions.
  • the driving suggestion may include a recommended driving duration.
  • the recommended driving duration is used to indicate how long the user can drive before reaching severe fatigue. In this way, the electronic device 100 can recommend a driving duration before the user starts driving, thereby reducing the probability of traffic accidents.
  • the communication system 20 may include but not limited to an electronic device 100 , an electronic device 500 , an electronic device 600 and an electronic device 700 .
  • the electronic device 100 may establish a communication connection with the electronic device 500 (for example, a Bluetooth connection, etc.).
  • the electronic device 100 may also establish a communication connection with the electronic device 600 .
  • the electronic device 600 may establish a communication connection with the electronic device 700 .
  • the electronic device 700 is an electronic device including a camera (for example, a vehicle camera, a driving recorder, etc.), and the electronic device 700 can be used to acquire user's facial image data.
  • Electronic device 700 may also send facial image data to electronic device 600 .
  • the electronic device 600 can be used to obtain driving data.
  • the electronic device 600 may be a vehicle-machine device, a vehicle-mounted tablet, and the like.
  • the driving data can be used to reflect the environmental conditions in the vehicle, the driving road conditions, the driving state of the user, etc. during the driving process of the user.
  • Driving data may include but is not limited to light, noise, in-car temperature, vehicle speed, acceleration, variance of speed, variance of acceleration, frequency of deviation between the vehicle and the lane, following distance, road conditions, the user's facial image data, the moment at which the user drives the vehicle, the duration for which the user has been driving the vehicle, and so on.
  • the electronic device 600 may transmit driving data to the electronic device 100 .
  • the electronic device 600 may also be used to receive facial image data sent by the electronic device 700 .
  • the electronic device 600 can also be used to obtain facial data of the user through image recognition based on the facial image data of the user.
  • the user's facial data may include but is not limited to the focus of the user's eyes, head movement (for example, head-bowing frequency), blinking frequency, yawning frequency, and so on.
  • the electronic device 600 may send the user's facial data to the electronic device 100 .
  • the electronic device 700 may obtain the facial data of the user based on the facial image data, and send the facial data of the user to the electronic device 600 .
  • the electronic device 500 can be used to detect the user's physical condition in real time and obtain user data.
  • the user data can be used to characterize the user's physical condition and user behavior.
  • the electronic device 500 may be a wearable device (for example, a smart watch, a smart bracelet) and the like.
  • the above user data may include stable user data and fluctuating user data.
  • the stable user data can be used to indicate the user's physical feature data (for example, height, gender, age, weight, etc.) that will not fluctuate in a short period of time.
  • the fluctuating user data can be used to indicate the user's physical condition data fluctuating in a short period of time. That is to say, the electronic device 500 can be used to acquire fluctuating user data.
  • the fluctuating user data acquired by the electronic device 500 may include but is not limited to the user's heart rate, body temperature, blood sugar, sleep quality (which can be identified, for example, by sleep duration), exercise conditions (including exercise duration, exercise intensity, etc.), blood oxygen saturation, and so on.
  • the electronic device 500 may send the acquired fluctuating user data to the electronic device 100 .
  • the electronic device 500 may also be used to acquire user data related to user behavior, which may include but is not limited to sleeping, sitting still, walking, running, and so on. In the following embodiments, only the four user behaviors of sleeping, sitting still, walking and running are used for illustration. It can be understood that, in practical application, other user behaviors (for example, lying down) can also be included, or the above user behaviors can be subdivided (for example, walking can be divided into strolling, brisk walking, etc.); this is not limited here. It can be understood that the electronic device 500 may infer user behavior by detecting user data such as the user's heart rate, body temperature, and exercise conditions. The electronic device 500 can send these user data to the electronic device 100.
  • the electronic device 100 can obtain the behavior data and one part of the body state data from the user data acquired by the electronic device 500, and the electronic device 100 can obtain the remaining part of the user's body state data in other ways.
  • the electronic device 100 can also acquire the driving data through the electronic device 600 to obtain the on-vehicle driving data. It should be noted that the user data and driving data acquired by the electronic device 100 may also be obtained through user input.
  • for example, the behavior sequence <sit still, run, sleep> and the behavior sequence <sit still, run> result in different fatigue levels before driving.
  • the first fatigue model may be a recurrent neural network (recurrent neural network, RNN) model used for processing data with a time series relationship.
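  • to illustrate why a recurrent model fits here (the output depends on the order of the behaviors, not merely on which behaviors occurred), a minimal Elman-style cell over one-hot behavior encodings can be sketched in pure Python; the weights are seeded random placeholders, not a trained first fatigue model:

```python
import math
import random

BEHAVIORS = ["sleep", "sit", "walk", "run"]

def fatigue_score(sequence, hidden_size=4, seed=0):
    """Order-sensitive score in (0, 1); placeholder weights, not trained."""
    rng = random.Random(seed)
    w_in = [[rng.uniform(-1, 1) for _ in BEHAVIORS] for _ in range(hidden_size)]
    w_h = [[rng.uniform(-1, 1) for _ in range(hidden_size)] for _ in range(hidden_size)]
    w_out = [rng.uniform(-1, 1) for _ in range(hidden_size)]
    h = [0.0] * hidden_size
    for behavior in sequence:
        x = [1.0 if behavior == name else 0.0 for name in BEHAVIORS]  # one-hot
        # Elman update: new hidden state mixes current input and old state
        h = [math.tanh(sum(w_in[i][j] * x[j] for j in range(len(BEHAVIORS)))
                       + sum(w_h[i][k] * h[k] for k in range(hidden_size)))
             for i in range(hidden_size)]
    z = sum(w_out[i] * h[i] for i in range(hidden_size))
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid: higher = more fatigued
```

With the seed fixed, the sequence <sit, run, sleep> and its prefix <sit, run> produce different scores, which is exactly the order sensitivity an RNN is chosen for.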
  • the electronic device 100 can also be used to obtain the final fatigue level.
  • the electronic device 100 may perform weighted sum calculation based on the fatigue degree before driving and the fatigue degree during driving to obtain the final fatigue degree.
  • the electronic device 100 can obtain driving suggestions based on the final fatigue level. That is to say, the electronic device 100 can obtain whether the user is mildly fatigued, moderately fatigued, or severely fatigued based on the final fatigue level, and give corresponding driving suggestions according to the fatigue state of the user.
  • the driving suggestion may include a recommended driving duration, and the recommended driving duration is the driving duration for the user to reach severe fatigue.
  • the electronic device 100 may send the driving suggestion to the electronic device 600, and the electronic device 600 may display the driving suggestion.
  • the electronic device 100 can be used to store the behavior data, body state data and on-vehicle driving data, and to use these data as parameters for model training. Specifically, the electronic device 100 can obtain the user's fatigue level while driving, and the time intervals between different fatigue levels, from the stored on-vehicle driving data and body state data. For example, the electronic device 100 may label 1-2 yawns by the user within a preset time as mild fatigue, 3-5 yawns as moderate fatigue, and more than 5 yawns as severe fatigue. Afterwards, during training the electronic device 100 can input the behavior data, body state data and on-vehicle driving data into the corresponding models to obtain the final fatigue level.
  • the electronic device 100 may take the fatigue degree of the user as the real result, and obtain an error value between the final fatigue degree and the real result based on the real result.
  • the electronic device 100 may adjust the parameters of the model based on the error value until the error value is smaller than a preset threshold, and the model training is completed.
  • the electronic device 100 may use a model whose error value is smaller than a preset threshold to detect user fatigue. It should be noted that the electronic device 100 marking the user's fatigue level by the number of yawns is only an example, and the electronic device 100 may also mark the user's fatigue level by other data (for example, the number of times the user bows his head), which is not limited in this application.
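  • the yawn-count labeling rule mentioned above can be written down directly; the "none" label for zero yawns is an added assumption, since the source only defines the 1-2, 3-5 and >5 bands:

```python
# Labeling rule from the text; "none" for zero yawns is an added assumption.
def label_fatigue(yawns_in_window):
    """Map the yawn count within the preset window to a fatigue label."""
    if yawns_in_window > 5:
        return "severe"
    if yawns_in_window >= 3:
        return "moderate"
    if yawns_in_window >= 1:
        return "mild"
    return "none"
```

As noted above, the same banding could instead be applied to another signal, such as the number of times the user bows his or her head.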
  • the electronic device 100 may also store the fatigue level before driving, the fatigue level during driving and the final fatigue level obtained based on the behavior data, body state data and on-vehicle driving data. It should be noted that, since the electronic device 100 collects on-vehicle driving data and body state data in real time while the user is driving, the electronic device 100 may recalculate the fatigue level during driving and the final fatigue level based on the on-vehicle driving data and body state data collected within a preset time interval (for example, 15 minutes). The electronic device 100 may store the body state data, on-vehicle driving data, fatigue level during driving, final fatigue level and the user's driving duration in association with one another. In this way, it is convenient for the electronic device 100 to obtain the relationship between the user's driving duration and fatigue level.
  • when the electronic device 100 has not acquired on-vehicle driving data (that is, when the user has not yet started a driving trip), the electronic device 100 can obtain the fatigue level before driving based on the user's current behavior data through the first fatigue model.
  • the electronic device 100 may, based on the fatigue level before driving and the stored relationship between the fatigue level before driving and the driving duration, determine the recommended driving duration before the user reaches severe fatigue.
  • the electronic device 100 when the electronic device 100 can determine the total driving time of the user, it can also determine whether the user will experience fatigue driving according to the total driving time and the recommended driving time.
  • the electronic device 100 may include, but not limited to, a user data collection module 910 , an on-board data collection module 930 , a data preprocessing module 920 , a model calculation module 940 and a driving advice judgment module 950 .
  • the user data related to the user's physical condition and behavior may include but not limited to the user's age, gender, height, weight, body fat, heart rate, body temperature, blood sugar concentration, blood oxygen saturation, sleep quality, sleep duration, exercise duration , exercise intensity, etc.
  • the user data collection module 910 may be configured to receive data input by the user, and obtain user data therefrom.
  • the user data collecting module 910 may obtain user data through corresponding sensors.
  • an acceleration sensor may be used to acquire the user's motion.
  • the user's heart rate and the like may be acquired through an optical sensor.
  • the user data collection module 910 may also send the user data to the data preprocessing module 920 .
  • the on-board data acquisition module 930 can be used to acquire driving data of the user during driving.
  • the on-vehicle data acquisition module 930 may acquire driving data through an electronic device (for example, the electronic device 600 ) that establishes a communication connection with the electronic device 100 .
  • the driving data can be used to reflect the environmental conditions in the vehicle, the driving road conditions, the driving state of the user, etc. during the driving process of the user.
  • Driving data may include but is not limited to light, noise, in-car temperature, vehicle speed, acceleration, variance of speed, variance of acceleration, frequency of deviation between the vehicle and the lane, following distance, road conditions, weather conditions, the user's facial image data, the moment at which the user drives the vehicle, the duration for which the user has been driving the vehicle, and so on.
  • the on-vehicle data acquisition module 930 can acquire driving data through corresponding software or hardware.
  • a camera of the electronic device 700 may be used to acquire the user's eye movement and the like.
  • the road conditions during driving (for example, tidal lanes, rockfall-prone roads) and the weather conditions during driving (for example, sunny, rainy) may be acquired through corresponding software.
  • the acceleration of the vehicle and the like may be acquired through an acceleration sensor.
  • the on-board data collection module 930 can also send the driving data to the data preprocessing module 920 .
  • the data preprocessing module 920 can obtain behavior data based on user data.
  • the behavior data is used to indicate the user's behaviors that occurred in chronological order within a preset time period (for example, within one hour) before the driving trip. For example, the user successively performs activities of running, walking and sleeping within a preset time period. Then, the data preprocessing module 920 can obtain the user's behavior data based on the heart rate, body temperature, location and other data in the user data, and the behavior data can be expressed as ⁇ running, walking, sleeping>.
  • the data preprocessing module 920 can also obtain body state data based on user data.
  • the physical state data is used to characterize the physical state of the user.
  • the physical state data can be divided into stable data and fluctuating data.
  • the stable data can be used to represent data of the user that will not change greatly within a period of time, such as age, gender, height, weight, body fat, etc.
  • fluctuating data can be used to characterize the data that fluctuates with the user's behavior and environment changes, such as heart rate, body temperature, blood sugar, blood oxygen saturation, sleep quality, exercise duration, exercise intensity, etc.
  • the data preprocessing module 920 can also obtain on-board driving data based on the driving data.
  • the driving data on the vehicle can be used to characterize the surrounding environment when the user is driving the vehicle and the real-time driving situation of the user.
  • the driving data on the vehicle may include surrounding environment data and user facial data.
  • the surrounding environment data are used to characterize the environment inside the vehicle (such as temperature, light intensity, etc.) and the driving conditions of the vehicle (such as vehicle speed, acceleration, following distance, driving time, etc.).
  • the user's face data can be used to characterize the user's driving state, for example, the user's yawn frequency, nodding frequency, etc.
  • the time when the user drives the vehicle will also affect the user's driving state (for example, it is easier to feel tired when driving the vehicle at noon or early in the morning).
  • the data preprocessing module 920 can also record the time when the feature data is obtained.
  • the model calculation module 940 can be used to calculate the user's fatigue level.
  • the model calculation module 940 may run on a processor of the electronic device 100, for example, the processor of the electronic device 100 may be the aforementioned processor 110 or an AI chip or the like.
  • the model calculation module 940 can also be used to send the result of the fatigue level to the driving suggestion judgment module 950 .
  • the model calculation module 940 may use the behavior data as an input of the first fatigue model, and calculate the degree of fatigue before driving.
  • the model calculation module 940 can determine the second fatigue model based on the stable data in the body state data, and use the fluctuating data in the body state data and the on-vehicle driving data as the input of the second fatigue model to obtain the fatigue level during driving.
  • the model calculation module 940 can perform weighted summation of the fatigue degree before driving and the fatigue degree during driving to obtain the final fatigue degree.
  • the model calculation module 940 may send the final fatigue level to the driving suggestion judgment module 950 .
  • the model calculation module 940 may determine the weights of the fatigue level before driving and the fatigue level during driving when calculating the final fatigue level based on the user's driving time.
  • the weight of the fatigue degree during driving increases as the driving time increases, and the weight of the fatigue degree before driving decreases synchronously.
  • for every 30 minutes of additional driving time, the model calculation module 940 can increase the weight of the fatigue degree during driving by 0.05 and decrease the weight of the fatigue degree before driving by 0.05.
  • the model calculation module 940 can also adjust the weights in other ways.
  • when the driving time reaches 2 hours, the model calculation module 940 can adjust the weight of the fatigue degree before driving to 0.4 and the weight of the fatigue degree during driving to 0.6.
  • the model calculation module 940 can also adjust the weight of the fatigue degree before driving to 0.2, and the weight of the fatigue degree during driving to 0.8 when the driving time reaches 5 hours, which is not limited in this embodiment of the present application.
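The weighted summation and weight-adjustment scheme described in the bullets above can be sketched in Python. This is an illustrative sketch, not the patent's implementation: the initial 0.5/0.5 split and the 0.95 cap are assumptions (the description only requires that both weights stay greater than zero and sum to 1), and all names are hypothetical.

```python
def final_fatigue(pre_drive, in_drive, driving_minutes,
                  initial_in_weight=0.5, step=0.05, max_in_weight=0.95):
    """Weighted sum of the fatigue level before driving and during driving.

    The in-driving weight grows by `step` for every 30 minutes driven and
    the pre-driving weight shrinks by the same amount, so the two weights
    always sum to 1. The cap keeps both weights strictly positive.
    """
    w_in = initial_in_weight + step * (driving_minutes // 30)
    w_in = min(w_in, max_in_weight)   # keep the pre-driving weight > 0
    w_pre = 1.0 - w_in
    return w_pre * pre_drive + w_in * in_drive
```

For example, with the assumed 0.5/0.5 starting split, after one hour of driving the in-driving weight has risen to 0.6 and the pre-driving weight has fallen to 0.4.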
  • the model calculation module 940 can only obtain the degree of fatigue before driving.
  • the model calculation module 940 may only send the fatigue level before driving to the driving suggestion judgment module 950 .
  • the driving suggestion judging module 950 can be used to acquire the travel information of the user.
  • the driving suggestion judging module 950 may notify the user data collection module 910 to send the user data to the data preprocessing module 920 at the trigger moment when the travel information of the user is acquired.
  • the driving suggestion judging module 950 can obtain the user's destination point and arrival time through the user's schedule, ticket purchase information (also called ticket information, for example, a train ticket, plane ticket, performance ticket, movie ticket, etc.), and the like.
  • the destination point is the place recorded in the schedule or the place where the ticket is used.
  • the arrival time is the time recorded in the schedule or the departure time or performance start time indicated by the ticket.
  • the driving suggestion judging module 950 may obtain based on the airline ticket that the user's destination point is the departure airport, the arrival time is the check-in time of the plane, and so on.
  • the arrival time can be preset time (for example, 30 minutes) earlier than the time recorded in the ticket or schedule, so that the user can avoid missing the itinerary.
  • the driving suggestion judging module 950 can start, M hours before the arrival time, to determine whether the distance between the user's real-time location and the destination point exceeds a distance threshold (for example, 1 kilometer), where M is greater than or equal to 0; for example, the value of M may be 5.
  • if the driving suggestion judgment module 950 determines that the distance between the user's current location and the destination point exceeds the distance threshold, it can determine the user's departure time based on the arrival time and the driving time from the current location to the destination point.
  • the driving suggestion judging module 950 may take the departure time as the trigger time, and notify the user data collection module 910 to send the user data to the data preprocessing module 920 at the trigger time.
  • the driving suggestion judging module 950 can directly obtain the departure time of the user from the schedule or the set alarm clock.
  • the driving suggestion judging module 950 may acquire the navigation information of the user, and determine the departure time of the user based on the navigation information.
  • the driving suggestion judging module 950 may use N hours before the departure time as the trigger time, wherein the trigger time is later than M hours before the arrival time, and the trigger time is later than the current time.
  • N is greater than or equal to 0, for example, the value of N may be 1.
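The trigger-time determination above can be sketched as follows. This is an illustrative sketch under the stated assumptions: the 30-minute safety buffer, M = 5, and N = 1 follow the examples in this description, and the function and parameter names are hypothetical.

```python
from datetime import datetime, timedelta

def trigger_time(arrival, est_drive, now, m_hours=5, n_hours=1,
                 buffer_minutes=30):
    """Pick the moment at which to start collecting behavior data.

    arrival   : the time recorded on the ticket or schedule
    est_drive : estimated driving time to the destination point

    The effective arrival time is moved `buffer_minutes` earlier so the
    user does not miss the itinerary; the departure time is the effective
    arrival time minus the estimated driving time; the trigger time is
    `n_hours` before departure, but never earlier than `m_hours` before
    the effective arrival time nor earlier than the current time.
    """
    effective_arrival = arrival - timedelta(minutes=buffer_minutes)
    departure = effective_arrival - est_drive
    trigger = departure - timedelta(hours=n_hours)
    earliest = effective_arrival - timedelta(hours=m_hours)
    return max(trigger, earliest, now)
```

With a 13:30 ticket time and a one-hour drive, this yields the 13:00 arrival time, 12:00 departure time, and 11:00 trigger time used in the scenario of FIG. 11.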
  • the driving suggestion judging module 950 may obtain driving suggestions based on the received fatigue level before driving.
  • the driving suggestion may include a recommended driving duration, and the recommended driving duration is used to indicate the total driving duration when the user reaches severe fatigue.
  • the driving suggestion judging module 950 can determine the historical pre-driving fatigue level that is closest to the currently obtained pre-driving fatigue level.
  • the driving suggestion judging module 950 can determine the historical final fatigue level that reaches severe fatigue the earliest among the multiple historical final fatigue levels corresponding to the closest historical pre-driving fatigue level.
  • the driving duration corresponding to the final fatigue level of the history is determined as the recommended driving duration.
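As an illustration of the history lookup just described, the following Python sketch finds the historical record whose pre-driving fatigue level is closest to the current one, then returns the earliest driving duration at which severe fatigue was reached. The numeric fatigue-level encoding and the data layout are assumptions, not taken from the patent.

```python
SEVERE = 3  # assumed encoding: 0 none, 1 mild, 2 moderate, 3 severe

def recommended_duration(current_pre, history):
    """history: list of (pre_drive_level, trips), where trips is a list
    of (driving_duration_minutes, final_fatigue_level) samples.

    Returns the driving duration at which severe fatigue was reached the
    earliest for the closest historical pre-driving fatigue level, or
    None if severe fatigue never occurred in that record.
    """
    closest = min(history, key=lambda rec: abs(rec[0] - current_pre))
    severe_durations = [d for d, level in closest[1] if level >= SEVERE]
    return min(severe_durations) if severe_durations else None
```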
  • if the driving suggestion judging module 950 determines that the user may experience mild fatigue or moderate fatigue during driving, and determines that the time difference between the current time and the departure time is less than or equal to the time threshold (for example, 30 minutes), the travel prompt can be used to prompt the user to prepare a refreshing drink. If the driving suggestion judging module 950 determines that the user may experience severe fatigue during driving, the travel prompt may be used to prompt the user to travel by other travel modes (for example, public transportation or a designated driver service).
  • the driving suggestion judging module 950 may gradually reduce the preset judging time based on the increase of the driving time. It should be noted that the preset judgment time cannot be reduced to 0.
  • the driving suggestion judging module 950 can obtain driving suggestions based on the final fatigue level sent by the model calculating module 940 .
  • the driving suggestion may be used to prompt the user whether he is tired.
  • the driving suggestion may also include a recommended driving duration.
  • the driving suggestion may also include wake-up reminder information, which may be used to remind the user to lower the temperature in the car or drink refreshing drinks, play refreshing music, and so on.
  • the driving suggestion may include parking prompt information, and the parking prompt information may be used to prompt the user to stop and rest as soon as possible.
  • the software modules shown in the embodiment of the present invention do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or less software modules than the above, or combine some modules, or split some modules, and so on.
  • the electronic device 100 can acquire the user's behavior data before the user drives. And based on the behavior data, the fatigue level before driving and driving suggestions are obtained.
  • the electronic device 100 can also acquire the user's body state data and on-board driving data during the user's driving, and obtain the final fatigue level and driving suggestions based on the behavior data, body state data, and on-board driving data.
  • the electronic device 100 can prompt the user whether it is possible to drive and how long driving may cause fatigue.
  • the electronic device 100 can also detect the fatigue degree of the user in real time while the user is driving, prompt the user to reduce the feeling of fatigue (for example, by lowering the temperature inside the car) when the user reaches mild or moderate fatigue, and prompt the user to stop and rest as soon as possible when the user reaches severe fatigue. This greatly reduces the probability of users having car accidents due to fatigue driving.
  • the travel information may include but not limited to departure time, arrival time and trigger time.
  • the departure time is the time when the user starts driving.
  • the arrival time is the time when the user stops driving.
  • the trigger time is the time when the electronic device 100 acquires behavior data.
  • the electronic device 100 may obtain the user's destination point and arrival time through the user's schedule, ticket purchase information (eg, train tickets, air tickets, show tickets, movie tickets, etc.) and the like.
  • the arrival time can be preset time (for example, 30 minutes) earlier than the time recorded in the ticket or schedule, so that the user can avoid missing the itinerary.
  • the electronic device 100 may directly obtain the departure time of the user from the schedule or the set alarm clock.
  • the driving suggestion judging module may obtain the user's navigation information, and determine the user's departure time based on the navigation information.
  • starting to drive may include but not limited to establishing a communication connection between the electronic device 100 and the electronic device 600, wearing a seat belt, closing the driver's door, releasing the handbrake, starting the car, and stepping on the gas pedal.
  • in some embodiments, the electronic device 100 determines, based on the user's location, that the user is already driving the vehicle before any behavior data has been acquired. In this case, the electronic device 100 may use the time when the user's driving behavior is detected as both the trigger time and the departure time. It can be understood that the electronic device 100 may directly perform step S1006 and subsequent steps after performing steps S1002 to S1004.
  • after the electronic device 100 determines that the current moment is the trigger moment, it may acquire the behavior data of the user within a preset time (for example, within 6 hours) before the trigger moment.
  • the electronic device 100 may directly acquire behavior data within a preset time before the trigger moment through the electronic device 500 .
  • the electronic device 100 may acquire user data within a preset time before the trigger moment through the electronic device 500, and then obtain behavior data based on the user data.
  • the electronic device 100 uses the behavior data as an input of the first fatigue model to obtain the fatigue level before driving.
  • the electronic device 100 may use the behavior data in the form of a behavior sequence as an input of the first fatigue model to obtain the degree of fatigue before driving.
  • the output of the first fatigue model is determined by the number of user behaviors in the input behavior sequence and their order.
  • the sequence of user behaviors in the behavior sequence is different, and the degree of fatigue before driving obtained by the first fatigue model is different.
  • the electronic device 100 may take the user's most frequent behavior between the trigger time and the departure time as the last behavior in the behavior sequence. For example, at the trigger moment the electronic device 100 acquires the user's behavior sequence as <exercise, sit still>. If the electronic device 100 detects that, within a previous period of time (for example, within the previous month), sleeping was the user's most frequent behavior between the trigger time and the departure time, it can obtain the behavior sequence <exercise, sit still, sleep>. If the electronic device 100 detects that sitting still was the most frequent behavior in that interval, it can obtain the behavior sequence <exercise, sit still>. Alternatively, the electronic device 100 may use the acquired behavior sequence directly.
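A minimal sketch of this behavior-sequence construction is shown below; the patent does not prescribe a data structure, so the list-of-strings representation and all names are hypothetical.

```python
from collections import Counter

def extend_sequence(current_seq, past_behaviors):
    """current_seq   : behaviors observed up to the trigger moment,
                       e.g. ["exercise", "sit_still"]
    past_behaviors : behaviors performed between the trigger time and the
                     departure time over a previous period (for example,
                     the previous month), one entry per occurrence.

    Appends the historically most frequent behavior as the last element,
    unless it already ends the sequence.
    """
    if not past_behaviors:
        return list(current_seq)
    most_common = Counter(past_behaviors).most_common(1)[0][0]
    seq = list(current_seq)
    if not seq or seq[-1] != most_common:
        seq.append(most_common)
    return seq
```

This reproduces the example above: appending "sleep" when sleeping dominated the interval, and leaving the sequence unchanged when sitting still did.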
  • the electronic device 100 obtains and displays driving suggestions based on the fatigue level before driving.
  • the electronic device 100 may also determine the closest historical pre-driving fatigue level based on the departure time or the trigger time. Among the one or more historical pre-driving fatigue levels, the closest one is the level whose departure time or trigger time is closest to that of the currently obtained pre-driving fatigue level.
  • the electronic device 100 may obtain one or more historical final fatigue levels corresponding to the closest historical pre-driving fatigue level, and determine among them the final fatigue level that reaches severe fatigue the earliest. The electronic device 100 may use the driving duration corresponding to that final fatigue level as the recommended driving duration.
  • the electronic device 100 may display a driving suggestion including a recommended driving duration. In this way, the electronic device 100 can remind the user of the maximum driving time that can be driven continuously when the user has not started driving, or the driving time does not exceed the preset initial time (for example, 10 minutes), so as to improve the user's fatigue driving problem.
  • the electronic device 100 may obtain driving suggestions based on the expected driving time, departure time and trigger time.
  • the driving suggestion may include a travel reminder, and the travel reminder may be used to prompt the user to take a break for a period of time. If the trigger time is earlier than the departure time, and the time difference between the trigger time and the departure time is less than or equal to the time threshold (for example, 30 minutes), the driving suggestion may include a travel reminder, which may be used to remind the user to prepare a refreshing drink.
  • the electronic device 100 may only execute steps S1001 to S1004. In this way, the electronic device 100 can obtain the recommended driving time before the user drives the vehicle to avoid fatigue driving.
  • the electronic device 100 determines whether the user is in a driving state.
  • the electronic device 100 may determine whether the user is in a driving state at the departure time. It can be understood that, since the departure time is a single narrow moment, the judgment made by the electronic device 100 at that moment may be in error, so the electronic device 100 can judge, once every preset judgment time within a period of time that includes the departure time, whether the user is driving.
  • if the electronic device 100 determines that the user is not in the driving state, it re-determines whether the user is in the driving state after a preset determination time interval.
  • the electronic device 100 acquires driving data on the vehicle and body state data.
  • Physical state data includes stable data and fluctuating data.
  • the electronic device 100 determines a second fatigue model based on the stable data in the physical condition data.
  • the server 300 stores the physical state data, vehicle driving data, and driving fatigue levels of multiple users.
  • the server 300 can classify users into different types based on the stable type data in the body state data.
  • the server 300 may classify users into groups based on age, weight, gender, height, and the like, that is, by age range, weight range, height range, and gender. For example, the server 300 may classify users whose age is between 20 and 35, whose weight is between 60 kg and 70 kg, whose height is between 170 cm and 180 cm, and whose gender is male into one category.
  • the electronic device 100 can train the second fatigue model based on the stored user's physical state data, vehicle driving data, and fatigue degree during driving, and save the trained second fatigue model.
  • the electronic device 100 may use the trained second fatigue model to calculate the fatigue level of the user during driving. That is to say, the electronic device 100 may obtain the second fatigue model through training based on historical user physical state data, historical vehicle driving data, and historical fatigue levels during driving.
  • the electronic device 100 may directly train the second fatigue model based on historical user body state data, historical vehicle driving data, and historical fatigue levels during driving.
  • the electronic device 100 takes the driving data and the fluctuation data on the vehicle as input of the second fatigue model, and obtains the degree of fatigue during driving.
  • the model calculation module can use the fluctuating data in the body state data and the driving data on the vehicle as the data of the second fatigue model to obtain the degree of fatigue during driving.
  • the second fatigue model can be used to process input data without a time series relationship to obtain an output result.
  • the electronic device 100 obtains the final fatigue level based on the fatigue level before driving and the fatigue level during driving.
  • the electronic device 100 may perform weighted summation of the fatigue degree before driving and the fatigue degree during driving to obtain the final fatigue degree.
  • the weights of the degree of fatigue before driving and the degree of fatigue during driving are both greater than zero, and the sum of the weights of the degree of fatigue before driving and the weight of the degree of fatigue during driving is equal to 1.
  • the electronic device 100 may determine the weights of the fatigue level before driving and the fatigue level during driving when calculating the final fatigue level based on the user's driving time.
  • the electronic device 100 may increase the weight of the fatigue degree during driving and decrease the weight of the fatigue degree before driving as the driving time increases.
  • the electronic device 100 obtains and displays driving suggestions based on the final fatigue level.
  • the electronic device 100 can obtain and display driving suggestions based on the final fatigue level.
  • the driving suggestion may be used to prompt the user whether he is tired.
  • the driving suggestion may also include a recommended driving duration.
  • the driving suggestion may also include wake-up reminder information, which may be used to remind the user to lower the temperature in the car or drink refreshing drinks, play refreshing music, and so on.
  • the electronic device 100 may directly notify the car air conditioner to lower the temperature inside the car, and/or notify the car stereo to play refreshing music.
  • the driving suggestion may include parking prompt information, and the parking prompt information may be used to prompt the user to stop and rest as soon as possible.
  • the electronic device 100 may send the driving suggestion to the electronic device 600, and the electronic device 600 may display the driving suggestion. Further optionally, the electronic device 100 may also send the navigation information to the electronic device 600, and the electronic device 600 may display the navigation information.
  • the electronic device 100 determines whether the user is in a driving state.
  • in step S1010, it may be determined whether the user is still in the driving state after a preset determination time interval. For example, the electronic device 100 may determine whether the user is in a driving state according to the speed and acceleration of the vehicle. When the electronic device 100 determines that the user is still driving, the electronic device 100 may perform step S1006 to step S1011. When the electronic device 100 determines that the user is not in the driving state, the electronic device 100 may stop executing the fatigue detection process (i.e., step S1006 to step S1011).
  • the electronic device 100 may adjust the preset determination time based on the driving time. The longer the driving time is, the shorter the preset determination time is, wherein the value of the preset determination time is greater than zero.
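One way to realize a judgment interval that shrinks as driving time grows but never reaches zero is sketched below. The halving rule, the base interval, and the floor are illustrative assumptions; the description only requires the interval to stay positive and decrease with driving time.

```python
def judgment_interval(driving_minutes, base_minutes=30.0, floor_minutes=5.0):
    """Preset determination interval that shrinks with driving time.

    Halves the base interval for every 2 hours driven, clamped to a
    positive floor so the interval never reaches zero.
    """
    interval = base_minutes / (2 ** (driving_minutes // 120))
    return max(interval, floor_minutes)
```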
  • steps S1002 - S1004 , and S1007 - S1009 may be executed by the server 300 .
  • the electronic device 100 may determine whether the user is in a driving state every preset determination time. And after it is determined that the user is in the driving state, the behavior data of the user within a preset time period before the time when the user is determined to be in the driving state is acquired. The electronic device 100 can obtain the degree of fatigue before driving based on the behavior data. Afterwards, the electronic device 100 may directly execute step S1006 to step S1011. In this way, the electronic device 100 can only judge the fatigue degree of the user during the driving behavior, so as to avoid the fatigue driving behavior of the user.
  • the electronic device 100 may directly determine the driving fatigue level of the user based on the physical condition data of the user. In some embodiments, the electronic device 100 may determine the degree of fatigue during driving through the second fatigue model based on the physical condition data of the user.
  • the second fatigue model can be trained based on the user's historical physical condition data, or can be downloaded from the server 300 based on the user's physical condition data. Specifically, for the step of the electronic device 100 acquiring the second fatigue model from the server 300, reference may be made to the embodiment shown in the above step S1007, which will not be repeated here.
  • Fig. 11 and Fig. 12 exemplarily show two application scenarios of the detection method.
  • the electronic device 100 may obtain the behavior data of the user, and obtain the degree of fatigue before driving based on the behavior data.
  • the electronic device 100 can obtain driving suggestions based on the travel information and the degree of fatigue before driving.
  • FIG. 11 exemplarily shows an indoor environment where a user is located, where the user is using the electronic device 100 .
  • the electronic device 100 may acquire the travel information of the user based on the ticket information of the user. For example, the electronic device 100 detects that the user's departure time is "13:30", and the user's boarding location is "Shenzhen Bao'an Airport T3".
  • the user's travel information acquired by the electronic device 100 includes the expected driving time from the current location to the user's boarding location, the departure time and the arrival time.
  • the electronic device 100 can determine the arrival time as "13:00", and determine the departure time as "12:00".
  • the electronic device 100 may set the trigger time as "11:00”.
  • the electronic device 100 may obtain the user's behavior data during "9:00-11:00", for example, the electronic device 100 may obtain the user's behavior data through the electronic device 500 .
  • the electronic device 100 may obtain the fatigue level before driving based on the user behavior data and the first fatigue model.
  • the electronic device 100 can also obtain driving suggestions based on the degree of fatigue before driving.
  • the electronic device 100 determines that the user will experience mild fatigue or moderate fatigue during driving.
  • the electronic device 100 may display driving advice including travel prompt information.
  • the travel prompt information may be one or more of text prompt information, picture prompt information, and voice prompt information.
  • the travel prompt information can be a text-type prompt message: "Hi user, according to your flight information, you may need to drive to the airport next. You may feel tired during the driving process. It is recommended that you take a half-hour lunch break, and then drive out." In this way, before driving, the user can reduce his fatigue level according to the driving suggestion and improve the fatigue driving problem.
  • the electronic device 100 may acquire the user's physical state data and vehicle driving data, and obtain the driving fatigue level based on the physical state data and the vehicle driving data.
  • the electronic device 100 may obtain the final fatigue level based on the fatigue level before driving and the fatigue level during driving shown in FIG. 11 .
  • the electronic device 100 can obtain driving advice based on the final fatigue level.
  • FIG. 12 exemplarily shows an in-vehicle environment.
  • the electronic device 100 may establish a communication connection with the electronic device 600 .
  • the electronic device 100 can also acquire the driving data on the vehicle through the electronic device 600 .
  • the electronic device 100 can obtain the final fatigue level and driving advice based on the vehicle driving data and the like. For details, reference may be made to the embodiment shown in FIG. 10 , and details are not repeated here.
  • the electronic device 100 may obtain a driving suggestion including wakefulness prompt information.
  • the electronic device 100 may send the driving suggestion to the electronic device 600 .
  • the electronic device 600 may display driving advice including sobriety reminder information.
  • the sobriety prompt information may be one or more of text prompt information, picture prompt information, and voice prompt information.
  • the sobriety prompt information may be a text type prompt message: "Hello, driver, you are currently tired. It is recommended that you lower the temperature in the car or play refreshing music to avoid fatigue driving". In this way, during the driving process, the user can reduce his fatigue level according to the driving suggestion and improve the fatigue driving problem.
  • hailing a taxi through a mobile phone has become a way for many users to travel. For example, when a user has drunk alcohol, is tired, or the vehicle is charging, the user can hail a taxi through a taxi app. However, passengers may leave their belongings in the car during the ride. If a passenger leaves an item in the car, the passenger needs to find the driver to retrieve it, which delays the journey of both the passenger and the driver, and the probability that the passenger retrieves the item left in the car is not high. Therefore, the embodiment of the present application provides a detection method.
  • the electronic device 100 can establish a Bluetooth connection with the in-vehicle device 900 .
  • the in-vehicle device 900 may acquire the in-vehicle image before the passenger gets on the car (also referred to as the in-vehicle image before getting on the car) after detecting the passenger's door opening operation.
  • the in-vehicle device 900 may also obtain an in-vehicle image of the passenger after getting off the vehicle (also referred to as an in-vehicle image after getting off the vehicle) after detecting that the passenger has alighted from the vehicle.
  • the in-vehicle device 900 may determine whether the passenger's belongings are still included in the vehicle after the passenger gets off the vehicle based on the in-vehicle image before getting on the vehicle and the in-vehicle image after getting off the vehicle.
  • when the in-vehicle device 900 determines that there are items belonging to passengers in the vehicle, it can broadcast an item missing prompt message, which can be used to remind the driver and passengers that items are left in the vehicle. At the same time, the in-vehicle device 900 may also send the item missing prompt information to the electronic device 100, and the electronic device 100 may display the item missing prompt information after receiving it.
  • the item missing prompt message is used to remind passengers that there are items left on the vehicle. In this way, passenger items can be prevented from being left in the vehicle.
  • the electronic device 100 may be a mobile phone, a tablet computer, a wearable device, and the like.
  • the in-vehicle device 900 may be used to acquire vehicle data, for example, the in-vehicle device 900 may be used to detect the opening and closing of a door, acquire images inside the vehicle, and detect the speed and acceleration of the vehicle, and so on.
  • the electronic device 100 may include but not limited to a Bluetooth module 1302 , an acceleration sensor 1301 and a processor 1303 .
  • the Bluetooth module 1302 can be used to establish a guest Bluetooth connection with the in-vehicle device 900.
  • the guest Bluetooth connection can be used to establish a Bluetooth connection between the electronic device 100 and the in-vehicle device 900 without user input, realizing pairing and key verification automatically.
  • the electronic device 100 can set the Bluetooth function by calling related functions, so as to realize the visitor's Bluetooth connection.
  • the electronic device 100 may directly create a pairing request through the createBond() function, and send the pairing request to the in-vehicle device 900 .
  • the electronic device 100 can also set the key to a specified value by calling the setPin() function.
  • the electronic device 100 can also cancel the key input through the cancelPairingUserInput() function.
  • the electronic device 100 can establish a Bluetooth connection with the in-vehicle device 900 that does not require manual pairing or key entry (i.e., a guest Bluetooth connection).
  • the processor 1303 may be used to determine whether to disconnect the guest Bluetooth connection with the in-vehicle device 900.
  • the processor 1303 may be the processor 110 shown in FIG. 1. That is to say, the processor 1303 may determine whether to disconnect the guest Bluetooth connection based on the acceleration of the electronic device 100 and the acceleration of the in-vehicle device 900.
  • the processor 1303 may send a confirmation success signaling to the in-vehicle device 900 through the Bluetooth module 1302, and the confirmation success signaling may be used to instruct the in-vehicle device 900 not to disconnect the guest Bluetooth connection.
  • the processor 1303 may disconnect the Bluetooth connection with the in-vehicle device 900 .
  • the processor 1303 may also send a confirmation failure signaling to the in-vehicle device 900 through the Bluetooth module 1302, and the confirmation failure signaling may be used to instruct the in-vehicle device 900 to disconnect the guest Bluetooth connection.
  • the processor 1303 may determine that the acceleration of the electronic device 100 and the acceleration of the in-vehicle device 900 are the same when the absolute value of the difference between the two accelerations does not exceed the acceleration deviation threshold.
  • the acceleration deviation threshold may be a fixed value (for example, 0.001 m/s²).
  • the acceleration deviation threshold may be obtained based on the maximum error value of the sensor. Wherein, the maximum error value of the sensor can be provided by the manufacturer of the sensor.
  • the electronic device 100 and the in-vehicle device 900 store the maximum error values of their respective sensors.
  • the electronic device 100 and the in-vehicle device 900 may transmit the maximum error value of each sensor before transmitting the acceleration.
  • the acceleration deviation threshold may be the sum of the maximum error value of the sensor of the electronic device 100 and the maximum error value of the sensor of the in-vehicle device 900 . In this way, an applicable acceleration deviation threshold can be obtained based on different electronic devices.
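The acceleration comparison described above can be sketched as follows; function and parameter names are hypothetical, and the threshold is taken as the sum of the two sensors' maximum error values, as in the preceding bullet.

```python
def same_motion(acc_phone, acc_car, err_phone, err_car):
    """Decide whether the phone and the in-vehicle device are moving
    together: the two accelerations are treated as the same when the
    absolute difference does not exceed the deviation threshold, taken
    here as the sum of the two sensors' maximum error values.
    """
    threshold = err_phone + err_car
    return abs(acc_phone - acc_car) <= threshold
```

When this returns True, the guest Bluetooth connection is kept; otherwise it is disconnected, as described for the processor 1303.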
  • the in-vehicle device 900 includes but not limited to an acceleration sensor 1401 , a Bluetooth module 1402 , a camera 1403 and a processor 1404 .
  • the acceleration sensor 1401 may be used to acquire the acceleration of the vehicle-machine device 900 .
  • the acceleration sensor 1401 can also be used to send the acceleration to the processor 1404 .
  • the acceleration sensor 1401 can also send the acceleration to the Bluetooth module 1402 .
  • the Bluetooth module 1402 can be used to establish a guest Bluetooth connection with the electronic device 100 .
  • the bluetooth module 1402 can also be used to receive the data sent by the electronic device 100 (for example, the acceleration of the electronic device 100, confirmation success signaling, confirmation failure signaling, item missing prompt information, etc.).
  • the Bluetooth module 1402 can also be used to send the data of the in-vehicle device 900 (for example, the acceleration of the in-vehicle device 900 ) to the electronic device 100 .
  • the processor 1404 may be used to determine whether the passenger's items are left in the vehicle based on images of the vehicle interior. That is to say, after detecting the passenger's boarding operation, the processor 1404 may use the camera 1403 to acquire an in-vehicle image before boarding; after detecting the passenger's alighting operation, it may use the camera 1403 to acquire an in-vehicle image after alighting. The processor 1404 may detect the passenger's boarding and alighting operations from the images acquired by the camera through an image recognition algorithm (for example, a convolutional neural network algorithm), and may determine the item information in the vehicle before the passenger boards from the in-vehicle image acquired before boarding.
  • the processor 1404 may instruct the vehicle central control display to display the first missing prompt information.
  • the processor 1404 may also send item missing indication information to the electronic device 100 through the Bluetooth module 1402 .
  • the missing item indication information may be used to instruct the electronic device 100 to display second missing prompt information, and the second missing prompt information may be used to remind passengers that there are items left in the vehicle.
  • the in-vehicle device 900 further includes a vehicle door sensor and a pressure sensor.
  • the door sensor can be used to detect the passenger's operation to open the door.
  • a pressure sensor can be used to detect whether a passenger is in the seat. In this way, the vehicle-machine device 900 can detect the passenger's operation of getting on and off the vehicle through the door sensor and the pressure sensor.
  • the electronic device 100 may display a desktop 1501 .
  • the desktop 1501 may include multiple application icons, for example, a taxi application icon 1502 and so on.
  • the taxi-hailing application icon 1502 can be used to trigger the display of the taxi-hailing application interface (for example, the taxi-hailing application interface 1510 shown in FIG. 15B ).
  • the ride-hailing application can be used to send the passenger's departure point and destination to the driver.
  • the taxi app can also be used to send the driver's information (location information, license plate number, vehicle color, etc.) to the passenger.
  • a status bar may also be displayed on the top of the desktop 1501, and a Bluetooth icon may be displayed in the status bar.
  • the bluetooth icon is used to indicate that the electronic device 100 has turned on the bluetooth function.
  • the taxi application interface 1510 may include a text box 1511 , a text box 1512 and a call vehicle control 1513 .
  • the text box 1511 can be used to acquire and display the user's departure location.
  • Text box 1512 may be used to obtain and display the user's destination point.
  • the call vehicle control 1513 may be used to send the departure and destination points to the driver's electronic device (eg, the vehicle device 900 ).
  • text box 1511 may display a departure location "AA Street" and text box 1512 may display a destination location "BB Building".
  • the electronic device 100 may send the departure point and the destination point to the vehicle-machine device 900 .
  • the vehicle-machine device 900 may send vehicle information (eg, vehicle location information, license plate number, driver name, vehicle color, etc.) to the electronic device 100 .
  • the electronic device 100 may display a taxi application interface 1530 as shown in FIG. 15D .
  • when the electronic device 100 receives the vehicle information, it can turn on the guest Bluetooth function and broadcast a guest Bluetooth connection request.
  • the visitor Bluetooth function can be used for establishing a visitor Bluetooth connection between the electronic device 100 and the vehicle-machine device 900 .
  • the visitor Bluetooth connection can be used for transmitting acceleration information between the electronic device 100 and the vehicle-machine device 900 , and can also be used for the vehicle-machine device 900 to send a prompt message to the electronic device 100 that there are items left on the vehicle.
  • through the settings of the Bluetooth function, the electronic device 100 can cancel the key information input setting and the pairing request creation setting.
  • the electronic device 100 can establish a Bluetooth connection with the in-vehicle device 900 that requires neither pairing nor a key (that is, a guest Bluetooth connection). It should also be noted that the user of the electronic device 100 may be referred to as a passenger in the subsequent description.
  • the in-vehicle device 900 may acquire an image of the interior of the vehicle before getting on the vehicle through the camera when the operation of opening the vehicle door by the passenger is detected.
  • the vehicle-machine device 900 may obtain the operation of opening the vehicle door through the vehicle door sensor, or recognize the operation of the passenger to open the vehicle door through the picture collected by the camera.
  • the in-vehicle device 900 can also enable the visitor's Bluetooth function after detecting that the passenger gets on the vehicle.
  • the in-vehicle device 900 can identify the passengers through the images collected by the camera, and determine that the passengers get on the vehicle.
  • the in-vehicle device 900 may use a pressure sensor to determine whether a passenger gets on the vehicle.
  • the in-vehicle device 900 may receive a guest Bluetooth connection request from the electronic device 100 after turning on the guest Bluetooth function. After receiving the visitor's Bluetooth connection request from the electronic device 100 , the in-vehicle device 900 may send a visitor's Bluetooth connection response to the electronic device 100 . After the electronic device 100 receives the visitor's Bluetooth connection response, the electronic device 100 and the in-vehicle device 900 establish a visitor's Bluetooth connection.
  • the electronic device 100 and the in-vehicle device 900 can exchange their respective accelerations through the visitor's Bluetooth connection.
  • the acceleration can be used to determine whether the electronic device 100 and the in-vehicle device 900 are in the same vehicle. If the electronic device 100 and the car-machine device 900 determine that the accelerations of the electronic device 100 and the car-machine device 900 are different, that is, they are not in the same car, the electronic device 100 and the car-machine device 900 may disconnect the visitor's Bluetooth connection.
  • the electronic device 100 can record the identification of the electronic device that is not in the same vehicle, prevent the wrong electronic device from being re-connected, and increase the possibility of establishing a guest Bluetooth connection between the electronic device 100 and the electronic device in the same vehicle. It can be understood that the electronic device 100 may delete the identification information of the in-vehicle device 900 after a preset access prohibition time.
  • the electronic device 100 and the car-machine device 900 may not disconnect the visitor's Bluetooth connection.
  • the embodiments of this application are described with the electronic device 100 and the in-vehicle device 900 in the same vehicle.
  • the image of the vehicle interior acquired after the passenger gets off may be used as the in-vehicle image after getting off the vehicle.
  • the in-vehicle device 900 may determine that the passenger got off the car after recognizing that there is no passenger in the seat area in the image.
  • the in-vehicle device 900 may determine whether the passenger leaves the seat through the pressure sensor at the seat, and when the in-vehicle device 900 determines that the passenger has left the seat, the in-vehicle camera acquires the in-vehicle image after getting off the vehicle.
  • the in-vehicle device 900 can jointly determine whether the passenger gets off the vehicle through the camera and the pressure sensor. For example, the in-vehicle device 900 may determine whether the passenger has left the seat area based on the image collected by the camera after the pressure sensor determines that the passenger has left the seat. In this way, the in-vehicle image acquired by the in-vehicle device 900 does not include passengers, which makes it easier to identify items in the vehicle.
  • the in-vehicle device 900 may broadcast the first missing prompt information through the in-vehicle audio system after determining that the passenger's items are left in the vehicle.
  • the first omission prompt message may be: "The passenger's things are left in the car, please remind the passenger to get them back".
  • the in-vehicle device 900 may also send the missing item indication information to the electronic device 100 . After receiving the item missing indication information, the electronic device 100 may display the second missing prompt information.
  • the electronic device 100 is not limited to displaying the second missing prompt information in the form of text, and the electronic device 100 may also display the second missing prompt information in the form of voice broadcast. Further, the electronic device 100 may also prompt the user to view the second missing prompt information by vibrating the body.
  • the in-vehicle device 900 may determine whether the passenger gets off the car again through the method described in the above embodiments (for example, a pressure sensor) when detecting the passenger's operation to close the car door.
  • when the in-vehicle device 900 determines that the passenger has gotten off the vehicle, it broadcasts the first missing prompt information and sends the item missing indication information to the electronic device 100.
  • in a scenario where the passenger gets out of the vehicle temporarily (for example, getting out to let another passenger pass), this can avoid wrongly reminding the passenger that there are items left in the vehicle.
  • the method includes:
  • the electronic device 100 receives an input from a passenger for a first application.
  • the first application may be a taxi-hailing application (for example, the taxi-hailing application shown in FIG. 15A above).
  • the first application can be used to receive the passenger's input, and obtain the passenger's taxi information from the passenger's input.
  • the taxi-hailing information may include a departure point and a destination.
  • the first application can also be used to send the passenger's taxi information to the driver.
  • the input for the first application may be an input for the icon of the first application (for example, the above-mentioned input for the taxi-hailing application icon 1502 shown in FIG. 15A), or an input for the taxi-hailing control on the taxi-hailing page provided by the first application.
  • the electronic device 100 may broadcast the visitor's Bluetooth connection request to nearby electronic devices after receiving the passenger's input for the first application.
  • the electronic device 100 may re-broadcast the guest Bluetooth connection request at a preset time interval (for example, 2 minutes) after receiving the input for the icon of the first application.
  • the in-vehicle device 900 may acquire an in-vehicle image before the passenger boards the vehicle (also referred to as an in-vehicle image before boarding) through an in-vehicle camera when detecting a passenger's boarding operation.
  • the in-vehicle device 900 can also use an image recognition algorithm to identify the item information in the image in the vehicle before getting on the vehicle.
  • FIG. 17A shows an in-vehicle image acquired by the in-vehicle device 900 before getting on the vehicle.
  • the in-vehicle device 900 can obtain the item list in the vehicle as {<bottle, 1>} from the in-vehicle image before getting on the vehicle, where bottle is the identifier of the item, and 1 is the quantity of the item.
  • the in-vehicle image before boarding and the obtained item list shown in FIG. 17A are only examples, and will not specifically limit the in-vehicle image before boarding acquired by the in-vehicle device 900 in actual applications.
  • the identifier of the item in the item list may be marked as item A.
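The item-list comparison implied by the embodiment (items present after the passenger gets off but absent before boarding are presumed left behind) can be sketched as follows; the `Counter`-based representation of the item list is an assumption for illustration:

```python
from collections import Counter

def left_items(before: Counter, after: Counter) -> Counter:
    """Items in the post-alighting list that exceed the pre-boarding
    counts are presumed to have been left behind by the passenger."""
    return after - before  # Counter subtraction keeps only positive counts

before_boarding = Counter({"bottle": 1})                 # e.g. {<bottle, 1>}
after_alighting = Counter({"bottle": 1, "phone": 1})
```

Here `left_items(before_boarding, after_alighting)` yields `Counter({'phone': 1})`, which would trigger the first missing prompt information.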
  • the in-vehicle device 900 detects the passenger's sitting operation, and turns on the visitor's Bluetooth function.
  • when the in-vehicle device 900 detects the passenger's sitting operation (that is, detects that the passenger is seated in the vehicle), it can turn on the guest Bluetooth function and receive the guest Bluetooth connection request sent by the electronic device 100.
  • the in-vehicle device 900 may detect the passenger's sitting operation through a pressure sensor, an in-vehicle camera, and the like. In this way, the in-vehicle device 900 can avoid mistaking the scenario in which the driver temporarily gets out of the vehicle for the scenario in which a passenger gets in and sits down.
  • the door closing operation may be regarded as the passenger's sitting down operation.
  • the in-vehicle device 900 may directly enable the visitor's Bluetooth function after detecting the passenger's boarding operation.
  • the electronic device 100 sends a guest Bluetooth connection request to the in-vehicle device 900 .
  • the electronic device 100 may broadcast a visitor's Bluetooth connection request after receiving the passenger's input for the first application.
  • the in-vehicle device 900 may receive the visitor's Bluetooth connection request broadcast by the electronic device 100 after the visitor's Bluetooth function is turned on.
  • the in-vehicle device 900 sends a visitor Bluetooth connection response to the electronic device 100 .
  • after receiving the guest Bluetooth connection request sent by the electronic device 100, the in-vehicle device 900 sends a guest Bluetooth connection response to the electronic device 100 to establish a guest Bluetooth connection with the electronic device 100. It can be understood that the electronic device 100 receives the guest Bluetooth connection response and establishes a guest Bluetooth connection with the in-vehicle device 900.
  • the target in-vehicle device is the in-vehicle device in the same vehicle as the electronic device 100 after the passenger boards the vehicle.
  • the electronic device 100 may determine the in-vehicle device with the strongest Bluetooth signal from one or more received visitor Bluetooth connection responses, and establish a guest Bluetooth connection with the in-vehicle device. It can be understood that the stronger the bluetooth signal, the closer the distance between the in-vehicle device and the electronic device 100 .
  • the guest Bluetooth connection can only be used to transmit motion information requests, motion information, item missing indication information, and calibration information (for example, the maximum error value of the sensor, the specified acquisition time point, and the time at which the acceleration was acquired).
  • the motion information may include but not limited to acceleration, speed and so on. That is to say, when the motion information is acceleration, the motion information request is an acceleration request.
  • the electronic device 100 may send a guest Bluetooth connection request including a specified header to the in-vehicle device 900 .
  • the in-vehicle device 900 may also send a visitor Bluetooth connection response including a specified header to the electronic device 100 .
  • the electronic device 100 and the in-vehicle device 900 can continue to transmit the acceleration through the data packet including the specified header.
  • the data in the data packet is the encrypted acceleration.
  • the encryption and decryption methods of the electronic device 100 and the in-vehicle device 900 are the same.
  • the guest Bluetooth connection request sent by the electronic device 100 is: 1001 0000.
  • 1001 is the specified header.
  • 0000 is the data in the data packet, and it can be understood that the data in the data packet can be any value.
  • here, the description is written with a data value of 0000 as an example.
  • after the in-vehicle device 900 receives the guest Bluetooth connection request, it determines that the packet header is 1001 and returns a guest Bluetooth connection response to the electronic device 100.
  • the guest Bluetooth connection response can be: 1001 0000.
  • 1001 is the specified packet header
  • 0000 is the data in the data packet.
  • the electronic device 100 may send the acceleration to the in-vehicle device 900 .
  • the acceleration sent by the electronic device 100 is: 1001 5001.
  • 1001 is the specified header
  • 5001 is the encrypted acceleration.
  • the encryption method of the electronic device 100 and the in-vehicle device 900 is to arrange the original data in reverse order
  • the in-vehicle device 900 can obtain an acceleration of 1.005 m/s² based on 5001. It should be noted that the above data packet structure and the data encryption and decryption methods over the guest Bluetooth connection are only examples, and do not limit the embodiments of the present application.
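A sketch of the example packet format above, assuming the "reverse order" encryption reverses the digits of the acceleration with the decimal point dropped (the decimal placement restored on decryption is an assumption of this illustration, matching the 1.005 → 5001 example):

```python
HEADER = "1001"  # the specified packet header from the example

def encrypt_acc(value: str) -> str:
    """'1.005' -> drop the point, reverse the digits -> '5001'."""
    return value.replace(".", "")[::-1]

def parse_packet(packet: str) -> float:
    """Check the specified header, then decrypt the 4-digit payload."""
    header, payload = packet.split()
    if header != HEADER:
        raise ValueError("unexpected packet header")
    digits = payload[::-1]                      # '5001' -> '1005'
    return float(digits[0] + "." + digits[1:])  # -> 1.005
```

With this sketch, `parse_packet("1001 5001")` recovers the acceleration 1.005 m/s² sent by the electronic device 100.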
  • the electronic device 100 may send a guest Bluetooth connection request including a specified header and a specified data segment to the vehicle-machine device 900 .
  • the specified header is the same fixed data segment obtained by the electronic device 100 and the in-vehicle device 900 from the server.
  • the specified data segment may be a data segment of a specified length randomly generated by the electronic device 100 .
  • after the in-vehicle device 900 receives the guest Bluetooth connection request, it may encrypt the specified data segment based on an encryption algorithm, and use the encrypted specified data segment as the packet header of the guest Bluetooth connection response.
  • the electronic device 100 may establish a guest Bluetooth connection with the in-vehicle device 900 after determining that the packet header of the guest Bluetooth connection response is an encrypted data segment.
  • both the electronic device 100 and the in-vehicle device 900 can use the encrypted data segment as the header of the data packet for transmitting the acceleration. It can be understood that the data in the data packet used to transmit the acceleration is the encrypted acceleration. It should be noted that the encryption and decryption algorithms in the electronic device 100 and the in-vehicle device 900 are the same.
  • the guest Bluetooth connection request sent by the electronic device 100 is: 1001 0000.
  • 1001 is the specified header.
  • 0000 is the data in the packet.
  • after the in-vehicle device 900 receives the guest Bluetooth connection request, it determines that the packet header is 1001 and returns a guest Bluetooth connection response to the electronic device 100.
  • if the electronic device 100 and the in-vehicle device 900 encrypt data by adding 1 to the value of the original data, the in-vehicle device 900 can obtain 0001 as the packet header of the guest Bluetooth connection response.
  • the guest Bluetooth connection response can be: 0001 0000. Among them, 0001 is the specified packet header, and 0000 is the data in the data packet.
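The add-1 handshake in this example can be sketched as follows; the fixed 4-digit width and the wrap-around at 9999 are assumptions of the illustration:

```python
def encrypt_segment(seg: str) -> str:
    """Add 1 to the numeric value of the data segment, keeping its width."""
    width = len(seg)
    return str((int(seg) + 1) % 10**width).zfill(width)

# The electronic device 100 sends header 1001 with data segment 0000;
# the in-vehicle device answers with the encrypted segment as its header.
request = "1001 0000"
_, data_segment = request.split()
response = f"{encrypt_segment(data_segment)} 0000"
```

Here `response` is "0001 0000", matching the example; the electronic device 100 can verify the response header by encrypting its own data segment and comparing the results.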
  • after the electronic device 100 and the in-vehicle device 900 establish a guest Bluetooth connection, it can be verified whether the electronic device 100 and the in-vehicle device 900 are in the same vehicle. When they are in the same vehicle, the in-vehicle device 900 can send the item missing indication information to the electronic device 100 through the guest Bluetooth connection.
  • the electronic device 100 and the vehicle-machine device 900 are in the same vehicle.
  • the electronic device 100 may determine whether the motion state of the electronic device 100 is the same as that of the vehicle-machine device 900 through the motion information of the electronic device 100 and the motion information of the vehicle-machine device 900 .
  • when the difference between the motion information of the electronic device 100 and the motion information of the in-vehicle device 900 is smaller than the motion deviation threshold, it can be determined that the motion state of the electronic device 100 is the same as that of the in-vehicle device 900.
  • the motion deviation threshold may be preset, or may be obtained based on an error value of the sensor used to acquire the motion information.
  • the motion information may include but not limited to acceleration, speed and so on.
  • the motion information can be represented in the form of acceleration, and the motion deviation threshold is the acceleration deviation threshold.
  • the electronic device 100 and the in-vehicle device 900 may determine whether the motion states of the electronic device 100 and the in-vehicle device 900 are the same by performing step S1606-step S1610.
  • the electronic device 100 may send an acceleration request to the in-vehicle device 900 .
  • the acceleration request may be used to instruct the in-vehicle device 900 to send the acquired acceleration to the electronic device 100 .
  • the in-vehicle device 900 acquires the first acceleration of the in-vehicle device 900 based on the acceleration request.
  • the in-vehicle device 900 may acquire the first acceleration of the in-vehicle device 900 after receiving the acceleration request.
  • the in-vehicle device 900 sends the first acceleration to the electronic device 100 .
  • the electronic device 100 acquires the second acceleration of the electronic device 100.
  • the electronic device 100 determines whether the first acceleration is the same as the second acceleration.
  • the electronic device 100 may also record the identification information of the in-vehicle device 900 (for example, the Bluetooth device name of the in-vehicle device 900).
  • the electronic device 100 may not establish a guest Bluetooth connection with the car-machine device 900 when it is determined based on the identification information that the device establishing the visitor Bluetooth connection is the car-machine device 900 .
  • the time at which the acceleration is obtained may be recorded. In this way, the accelerations obtained at the same time point can be compared, avoiding a case where the accelerations of the electronic device 100 and the in-vehicle device 900 differ merely because they were obtained at different time points.
  • the acceleration request may include a specified acquisition time point.
  • the specified acquisition time is after the time point when the electronic device 100 sends the acceleration request.
  • the electronic device 100 and the in-vehicle device 900 can acquire the acceleration at a specified acquisition time point, making the determination result more accurate.
  • if the clocks of the electronic device 100 and the in-vehicle device 900 are not synchronized, time calibration can be performed before the electronic device 100 and the in-vehicle device 900 transmit the acceleration. For example, the electronic device 100 and the in-vehicle device 900 may perform time synchronization through a satellite or cellular network.
  • the electronic device 100 may send a confirmation success signaling to the in-vehicle device 900 .
  • if the electronic device 100 determines that the first acceleration is different from the second acceleration, it directly sends a confirmation failure signaling to the in-vehicle device 900 and disconnects the guest Bluetooth connection.
  • the electronic device 100 may send acceleration requests to the vehicle-machine device 900 for a preset number of times (for example, 3 times).
  • the electronic device 100 may determine that the acceleration of the electronic device 100 is the same as the acceleration of the in-vehicle device 900 when the number of times the first acceleration and the second acceleration are determined to be the same reaches a preset number threshold (for example, 2 times), where the preset number threshold is less than or equal to the preset number of times.
  • the electronic device 100 and/or the in-vehicle device 900 may make M consecutive determinations of whether the acceleration of the electronic device 100 is the same as that of the in-vehicle device 900, and determine that the motion states of the electronic device 100 and the in-vehicle device 900 are the same when the accelerations are determined to be the same N or more times.
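The M-consecutive-determinations rule above can be sketched as follows (the values of M, N, and the per-round results are illustrative):

```python
def same_motion_state(round_results, n: int) -> bool:
    """round_results: M booleans, one per determination round.
    The motion states are deemed the same when at least N rounds matched."""
    return sum(round_results) >= n

# M = 3 rounds, N = 2: two matching rounds out of three is sufficient.
decision = same_motion_state([True, False, True], 2)
```

This tolerates an occasional mismatched sample (e.g. one device sampling during a bump) without immediately disconnecting the guest Bluetooth connection.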
  • the preset time intervals between the acceleration requests of the electronic device 100 may differ. Specifically, after sending the first acceleration request, the electronic device 100 may send the second acceleration request after a preset duration A, and then send the third acceleration request after a preset duration B, where the value of the preset duration B is different from the value of the preset duration A. For example, the value of the preset duration A is 1 minute, and the value of the preset duration B is 2 minutes.
  • the in-vehicle device 900 may send the first acceleration list to the electronic device 100 after receiving the acceleration request.
  • the electronic device 100 may also acquire the second acceleration list.
  • the electronic device 100 may determine whether the acceleration of the electronic device 100 is the same as the acceleration of the in-vehicle device 900 based on the first acceleration list and the second acceleration list.
  • the first acceleration list includes multiple accelerations.
  • the second acceleration list includes multiple accelerations.
  • the electronic device 100 may sequentially compare multiple accelerations in the first acceleration list with multiple accelerations in the second acceleration list, and record the same number of comparisons.
  • the electronic device 100 may divide the same number of comparisons by the total number of comparisons to obtain the pass rate.
  • when the pass rate is greater than or equal to a preset passing threshold (for example, 0.8), the electronic device 100 determines that the first acceleration list and the second acceleration list are the same. It can be understood that the electronic device 100 may send multiple acceleration requests to the in-vehicle device 900.
  • the first acceleration list and the second acceleration list further include an acquisition time corresponding to each acceleration.
  • the first acceleration list may be {<193532, 1.005>, <193537, 1.343>, <193542, 1.532>, ..., <193603, 1.935>}.
  • 193532 in <193532, 1.005> is used to indicate that the acquisition time is "19:35:32".
  • 1.005 is used to indicate that the acceleration acquired by the in-vehicle device 900 is 1.005 m/s².
  • the electronic device 100 may only compare accelerations with the same time in the acceleration list, and calculate the passing rate.
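A sketch of the timestamp-matched pass-rate comparison described above; the deviation value, the passing threshold of 0.8, and the sample lists are illustrative:

```python
def pass_rate(list_a, list_b, deviation=0.0013):
    """Compare only accelerations whose acquisition times match, and
    return (number of matching comparisons) / (total comparisons)."""
    b = dict(list_b)
    compared = [(t, a) for t, a in list_a if t in b]
    if not compared:
        return 0.0
    same = sum(1 for t, a in compared if abs(a - b[t]) <= deviation)
    return same / len(compared)

vehicle = [(193532, 1.005), (193537, 1.343), (193542, 1.532),
           (193547, 1.701), (193552, 1.880)]
phone   = [(193532, 1.005), (193537, 1.600), (193542, 1.532),
           (193547, 1.701), (193552, 1.880)]
```

Here `pass_rate(vehicle, phone)` is 0.8 (4 of 5 timestamps match), so with a passing threshold of 0.8 the two lists would be judged the same.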
  • the acceleration request may include a specified acquisition time point.
  • the acceleration request may include multiple specified acquisition time points, and the electronic device 100 and the in-vehicle device 900 may acquire acceleration at multiple specified acquisition time points.
  • the acceleration request may include the acquisition start time point, the acquisition end time point and the acquisition time interval.
  • the time difference between the acquisition start time point and the acquisition end time point is an integer multiple of the acquisition time interval.
  • the electronic device 100 and the in-vehicle device 900 may acquire acceleration at intervals between the acquisition start time point and the acquisition end time point to obtain an acceleration list.
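The acquisition schedule above (start time point, end time point, acquisition time interval) can be sketched as follows; representing time points as plain seconds is an assumption of the illustration:

```python
def acquisition_times(start_s: int, end_s: int, interval_s: int) -> list:
    """Per the embodiment, the difference between the start and end time
    points must be an integer multiple of the acquisition time interval."""
    if (end_s - start_s) % interval_s != 0:
        raise ValueError("end - start must be an integer multiple of the interval")
    return list(range(start_s, end_s + 1, interval_s))
```

For example, `acquisition_times(0, 20, 5)` yields [0, 5, 10, 15, 20]; both devices would sample their acceleration at each of these points to build their acceleration lists.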
  • the above operations of sending an acceleration request and judging whether the acceleration of the electronic device 100 is the same as that of the vehicle-machine device 900 may be performed by the vehicle-machine device 900 .
  • when the electronic device 100 and the in-vehicle device 900 determine that the first acceleration and the second acceleration are the same for a preset number of consecutive times, or when, among the determination results obtained over the preset number of consecutive determinations, the number of times the first acceleration and the second acceleration are the same reaches a preset number threshold, it is determined that the accelerations of the electronic device 100 and the in-vehicle device 900 are the same. Both the electronic device 100 and the in-vehicle device 900 may disconnect the guest Bluetooth connection when it is determined that the accelerations of the electronic device 100 and the in-vehicle device 900 are different.
  • the electronic device 100 sends a confirmation success signaling to the in-vehicle device 900 .
  • the electronic device 100 may determine the accelerations of the electronic device 100 and the vehicle-machine device 900 , that is, the electronic device 100 and the vehicle-machine device 900 are in the same vehicle.
  • the electronic device 100 may send a confirmation success signaling to the in-vehicle device 900 .
  • the successful confirmation signaling can be used to instruct the in-vehicle device 900 not to disconnect the visitor's Bluetooth connection.
  • the in-vehicle device 900 may send a confirmation success signaling to the electronic device 100, and the confirmation success signaling may be used to instruct the electronic device 100 to maintain a communication connection with the in-vehicle device 900.
  • otherwise, the communication connection with the electronic device 100 is disconnected.
  • the in-vehicle device 900 may send a confirmation failure signaling to the electronic device 100, and the confirmation failure signaling may be used to instruct the electronic device 100 to disconnect the communication connection.
  • the electronic device 100 may obtain the Bluetooth identifier of the target vehicle-machine device (for example, the vehicle-machine device 900 ) through the server of the first application, and carry the Bluetooth identifier in the broadcast visitor Bluetooth connection request.
  • the target in-vehicle device may send a visitor's Bluetooth connection response to the electronic device 100 when it is determined that the Bluetooth identifier carried in the visitor's Bluetooth connection request is the same as the Bluetooth identifier of the target in-vehicle device.
  • after the electronic device 100 receives the visitor Bluetooth connection response of the target vehicle-machine device, it can establish a visitor Bluetooth connection with the target vehicle-machine device, and receive item missing indication information through the visitor Bluetooth connection.
  • the electronic device 100 may send the Bluetooth identification of the electronic device 100 to the target in-vehicle device (eg, in-vehicle device 900 ) through the server of the first application.
  • the target vehicle-machine device can send a visitor's Bluetooth connection response carrying the Bluetooth identification of the electronic device 100 to the electronic device 100.
  • when the electronic device 100 determines that the Bluetooth identifier carried in the visitor's Bluetooth connection response is the Bluetooth identifier of the electronic device 100, it may establish a visitor's Bluetooth connection with the target in-vehicle device that sent the visitor's Bluetooth connection response.
  • the electronic device 100 may receive item missing indication information through the visitor's Bluetooth connection.
  • the in-vehicle device 900 detects the passenger's alighting operation, and acquires an in-vehicle image of the passenger after getting out of the vehicle.
  • the in-vehicle device 900 may acquire an in-vehicle image of the passenger after getting off the vehicle (also referred to as an in-vehicle image after getting off the vehicle) after detecting that the passenger has alighted from the vehicle.
  • after detecting the passenger's door opening operation through the door sensor, the in-vehicle device 900 may be triggered to detect, through the pressure sensor and/or the in-car camera, whether the passenger has gotten off the car.
  • the in-vehicle device 900 may perform step S1613 after detecting that the passenger got off the vehicle.
  • the in-vehicle device 900 may acquire the in-vehicle image at a preset time interval (for example, 1 ms) after receiving the confirmation success signaling, and determine, based on the in-vehicle image, whether the passenger has gotten off. That is to say, the in-vehicle device 900 may determine that the passenger got off the vehicle when it recognizes that the in-vehicle image does not include the passenger's image. When the in-vehicle device 900 determines that the passenger got off the vehicle, step S1613 may be performed.
  • the in-vehicle device 900 determines whether there is an item missing based on the in-vehicle image before getting on the vehicle and the in-vehicle image after getting off the vehicle.
  • the in-vehicle device 900 can use an image recognition algorithm to identify the item information in the in-vehicle image after getting off the vehicle.
  • the in-vehicle device 900 can compare whether the items in the in-vehicle image before getting on the vehicle are the same as those in the in-vehicle image after getting off the vehicle. When the in-vehicle device 900 determines that the items in the in-vehicle image before getting on the vehicle are the same as the items in the in-vehicle image after getting off the vehicle, it is determined that no item is missing.
  • when the in-vehicle device 900 determines that the items in the in-vehicle image before getting on the vehicle are different from those in the in-vehicle image after getting off the vehicle, it is determined that an item is missing (that is, the passenger's item is left in the vehicle).
  • FIG. 17B shows the in-vehicle image acquired by the in-vehicle device 900 after getting off the vehicle.
  • the in-vehicle device 900 can obtain the item list in the vehicle as {<bottle, 1>, <bag, 1>} through the in-vehicle image after getting off the vehicle.
  • the in-vehicle device 900 can determine that the item list of the in-vehicle image before getting on the vehicle (see the embodiment shown in FIG. 17A above) is different from the item list in the in-vehicle image after getting off the vehicle, and determine that an item is missing.
  • the in-vehicle device 900 may directly use an image comparison method (for example, pixel comparison) to compare whether the in-vehicle image before getting on the car is the same as the in-vehicle image after getting off the car.
  • when the in-vehicle device 900 determines that the in-vehicle image before getting on the vehicle is the same as the in-vehicle image after getting off the vehicle, it is determined that no item is missing.
  • when the in-vehicle device 900 determines that the in-vehicle image before getting on the vehicle is different from the in-vehicle image after getting off the vehicle, it is determined that an item is missing.
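The item-list comparison described above can be sketched with a multiset difference; a minimal illustration, assuming the image-recognition step has already produced item lists like the {<bottle, 1>, <bag, 1>} example:

```python
from collections import Counter

def detect_missing_items(before_boarding, after_alighting):
    """Return items present after the passenger alights that were not in the
    car before boarding, i.e. items the passenger may have left behind."""
    left_behind = Counter(after_alighting) - Counter(before_boarding)
    return dict(left_behind)

before = ["bottle"]         # items recognized before the passenger got on
after = ["bottle", "bag"]   # items recognized after the passenger got off
print(detect_missing_items(before, after))  # {'bag': 1} -> a bag was left
```

An empty result would correspond to the "no item is missing" branch, in which the visitor's Bluetooth connection can be disconnected.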
  • step S1614 and step S1616 may be performed.
  • when the in-vehicle device 900 determines that no item is missing, it can disconnect the visitor's Bluetooth connection with the electronic device 100.
  • the in-vehicle device 900 sends item missing indication information to the electronic device 100 .
  • the item missing indication information is used to instruct the electronic device 100 to execute step S1615.
  • the in-vehicle device 900 may execute step S1614 after determining that the item is missing and after detecting the passenger's door closing operation.
  • the in-vehicle device 900 may detect the passenger's door closing operation after determining that the item is missing. And after the passenger's door closing operation is detected, the in-vehicle image is acquired, and when it is determined that the in-vehicle image does not include the passenger's image, step S1614 is executed.
  • for a detailed description of the vehicle-machine device 900 detecting the passenger's door-closing operation, refer to the foregoing embodiments.
  • the in-vehicle device 900 can send the in-vehicle image before getting on the car and the in-vehicle image after getting off the car to the electronic device 100 through the visitor's Bluetooth connection, and the electronic device 100 can then judge, based on the in-vehicle image before getting on the car and the in-vehicle image after getting off the car, whether the passenger's items are left in the car.
  • the electronic device 100 may display the second missing prompt information.
  • the second missing prompt information can be used to remind passengers that there are items left in the car.
  • the electronic device 100 may display a prompt box 1541 as shown in FIG. 15E after receiving the missing item indication information.
  • the electronic device 100 may prompt passengers that there are items left in the car in one or more ways, such as displaying text, vibrating, playing an animation, broadcasting voice, or displaying pictures.
  • the in-vehicle device 900 broadcasts the first missing prompt information.
  • the first omission prompt information is used to remind the driver and passengers that the items are left in the car.
  • the in-vehicle device 900 may acquire an image inside the vehicle before getting on the vehicle when detecting a passenger's door opening operation.
  • the in-vehicle device 900 may acquire the in-vehicle image after detecting that the passenger got off the vehicle.
  • the in-vehicle device 900 may broadcast the first missing prompt information when it is determined that the passenger's items are left in the vehicle based on the in-vehicle image before getting on the vehicle and the in-vehicle image after getting off the vehicle. In this way, it is unnecessary to establish a guest Bluetooth connection with the electronic device 100 .
  • the embodiment of the present application provides a detection method.
  • when the electronic device 100 detects a scene to be charged, it can obtain charging station information through the server 1000 and obtain charging car information through the vehicle-machine device 900.
  • the electronic device 100 may obtain charging service information based on the charging station information and the charging car information.
  • the charging service information includes one or more charging station options, where one charging station option corresponds to one charging station; the charging station indicated by a charging station option includes charging equipment usable by the vehicle-machine device 900, and is a charging station that the vehicle-machine device 900 can reach before its battery runs out.
  • Charging station options include information on charging prices, charging times, and more.
  • one or more charging station options include the first charging station option.
  • after the electronic device 100 receives the user's input on the first charging station option, it may display navigation information to the first charging station. The electronic device 100 may also send a charging service reservation request to the server 1000. In this way, the user can quickly select and reach an available charging station.
  • the server 1000 may obtain the parking location information of the in-vehicle device 900 .
  • the parking location information may be used to indicate the parking area where the vehicle-machine equipment 900 is located.
  • the server 1000 may also send a charging confirmation prompt to the electronic device 100, and the electronic device 100 may display a charging start control after receiving the charging confirmation prompt.
  • the electronic device 100 may send a charging start request to the server 1000 after receiving the user's input on the control to start charging.
  • the server 1000 may send the parking location information to the charging device 1100 after receiving the charging start request.
  • the charging device 1100 can arrive at the location of the in-vehicle device 900 based on the parking location information, and charge the in-vehicle device 900 . After the charging device 1100 starts charging the vehicle-machine device 900 , it can send vehicle charging information to the electronic device 100 through the server 1000 .
  • the vehicle charging information may include the electric quantity of the vehicle-machine device 900 .
  • the electronic device 100 may display the vehicle charging information. In this way, the user can check the charging status of the in-vehicle device 900 in real time.
  • the communication system 30 includes an electronic device 100 and an in-vehicle device 900 .
  • a communication connection (for example, a Bluetooth connection) may be established between the electronic device 100 and the in-vehicle device 900.
  • Data can be transmitted between the electronic device 100 and the in-vehicle device 900 through the communication connection.
  • the in-vehicle device 900 is an electric vehicle or a device constituting an electric vehicle.
  • the in-vehicle device 900 may include, but not limited to, an in-vehicle camera and the like.
  • the in-vehicle device 900 can be used to acquire the data of the electric vehicle (for example, the remaining power of the charging vehicle, the image in front of the vehicle, etc.).
  • the electronic device 100 may be a handheld electronic device, a wearable device, etc.
  • the hardware structure of the electronic device 100 may refer to the embodiment shown in FIG. 1, which will not be repeated here. It should be noted that, in the following embodiments, the embodiment of the present application will be described by taking the vehicle-machine device 900 as a charging car as an example.
  • the electronic device 100 may display a desktop 1801, and the desktop 1801 includes a plurality of application icons (eg, car charging application icons).
  • the desktop 1801 may also include one or more card components (for example, charging service card 1802).
  • the card component (also referred to as a card) may display specified function information, and the specified function information can be used to trigger the electronic device 100 to perform the operation indicated by the function information (for example, trigger the electronic device 100 to display the page corresponding to the specified function information in the card component).
  • Cards can be displayed on the desktop or other specified shortcut interfaces (such as negative one screen, service center, etc.).
  • the charging service card 1802 may display function information for providing car charging service.
  • the charging service card 1802 can be used to trigger the electronic device 100 to display the power information of the in-vehicle device 900 , charging service information and so on.
  • when the electronic device 100 detects a scene to be charged, it can obtain charging station information through the server 1000, and obtain charging car information through the vehicle-machine device 900.
  • the electronic device 100 may obtain charging service information based on the charging station information and the charging car information.
  • the charging service information includes one or more charging station options, wherein the first charging station option is included in the one or more charging station options.
  • the electronic device 100 may display a charging information bar 1804 as shown in FIG. 18B .
  • the charging station option may include but not limited to identification information of the charging station, estimated charging time, estimated charging fee and to-be-traveled distance.
  • the identification information of the charging station may be used to indicate the charging station.
  • the estimated charging duration can be used to represent the charging time of the in-vehicle device 900
  • the estimated charging cost can be used to represent the cost required to fully charge the in-vehicle device 900 .
  • the distance to be traveled may be used to indicate the distance from the vehicle-machine device 900 to the charging station.
  • the electronic device 100 may also obtain the priority of each charging station option based on one or more of these parameters, such as estimated charging fee, estimated charging duration, and to-be-traveled distance.
  • the electronic device 100 may display various charging station options in order from the position closest to the status bar to the position farthest from the status bar according to the priority. Wherein, the electronic device 100 may display the charging station option with the highest priority at a position closest to the status bar. For example, the electronic device 100 may set the priority of the charging station option with the shortest expected charging time to be the highest.
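The priority ordering described above could, for instance, be a weighted score over the listed parameters; the weights and field names below are assumptions for illustration, since the embodiment only states that one or more of the parameters (estimated charging fee, estimated charging duration, distance to be traveled) may be used:

```python
def rank_charging_stations(options, weights=(1.0, 1.0, 1.0)):
    """Order charging-station options by a weighted score over estimated
    charging duration (hours), estimated fee (yuan) and distance (km);
    lower score = higher priority, displayed closest to the status bar."""
    w_time, w_fee, w_dist = weights

    def score(opt):
        return (w_time * opt["duration_h"]
                + w_fee * opt["fee_yuan"]
                + w_dist * opt["distance_km"])

    return sorted(options, key=score)

options = [
    {"name": "Station A", "duration_h": 1.0, "fee_yuan": 20, "distance_km": 1.2},
    {"name": "Station B", "duration_h": 0.5, "fee_yuan": 30, "distance_km": 3.0},
]
ranked = rank_charging_stations(options)
print([o["name"] for o in ranked])  # Station A scores lower, so it is listed first
```

Setting the weights to, say, `(1.0, 0.0, 0.0)` reproduces the example in the text where the option with the shortest expected charging time gets the highest priority.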
  • the charging service card 1802 displays remaining power information 1803 and a charging station information column 1804 .
  • the remaining power information 1803 may be used to indicate the remaining power of the in-vehicle device 900 .
  • Charging station information column 1804 may include one or more charging station options.
  • the one or more charging station options include charging station option 1804A.
  • the charging station options may include, but are not limited to, the name of the charging station, estimated charging time, estimated charging cost, and distance to be traveled.
  • the electronic device 100 may receive a user's sliding input (eg, slide up) on the charging station information bar 1804 to display different charging station options.
  • the charging station option 1804A can be used to indicate the charging station A, for example, the name of the charging station A is displayed in the charging station option 1804A, the estimated charging time of the charging station A is 1 hour, and the estimated charging fee of the charging station A is 20 yuan , the distance to be traveled between the vehicle-machine equipment 900 and the charging station A is 1.2 km.
  • the charging service card 1802 may also include charging prompt information, and the charging prompt information may be used to remind the user that the in-vehicle device 900 needs to be charged.
  • the charging prompt information may be one or more of text prompt information, animation prompt information, and voice prompt information.
  • the charging prompt information may be a text prompt: "The current power is low, please charge as soon as possible".
  • the electronic device 100 may only display the charging station option with the highest priority in the charging service card 1802 .
  • the electronic device 100 can also display a More control on the charging service card 1802. The More control can be used to trigger the electronic device 100 to jump to display the charging service interface, and the charging service interface can be used to display charging station options.
  • the electronic device 100 may send a charging service reservation request to the server 1000 in response to the input.
  • the charging service reservation request includes vehicle identification information and charging station identification information.
  • the vehicle identification information is used to indicate the vehicle-machine device 900
  • the charging station identification information is used to indicate the charging station A.
  • the server 1000 may determine the charging device 1100 based on the charging station identification information. The server 1000 can send the car identification information to the charging device 1100, and the charging device 1100 can charge the car-machine device 900 after the car-machine device 900 arrives at the charging station A.
  • after receiving the user's input (for example, a single click) on the charging station option 1804A, the electronic device 100 may also display a navigation image 1813 as shown in FIG. 18C in response to the input.
  • the charging service card 1802 may display reminder information 1811 of a successful reservation, information on the distance to be traveled 1812 and a navigation image 1813 .
  • the reservation success prompt information 1811 may be used to prompt the user to go to the charging station A to charge the in-vehicle device 900 .
  • the reservation success prompt information 1811 may be text type prompt information: "Successful reservation of charging service".
  • the distance to travel information 1812 may be used to prompt the user the distance from the current location to the charging station A (for example, 1 km).
  • the navigation image 1813 can be used to display the driving route from the current location to the charging station A.
  • the electronic device 100 may jump to display the map interface of the map application in response to the input, and display, on the map interface, the navigation map from the current location to the charging station A.
  • when the server 1000 detects that the in-vehicle device 900 arrives at the charging station A, it can acquire the parking location information of the in-vehicle device 900, which can be used to indicate the location of the in-vehicle device 900 in the charging station A.
  • the server 1000 may also send a charging start request to the electronic device 100, and the electronic device 100 may display a charging start control 1822 as shown in FIG. 18D after receiving the charging start request.
  • the electronic device 100 may display a charging start control 1822 on the charging service card 1802 .
  • the charging start control 1822 may be used to trigger the electronic device 100 to send a charging start response to the server 1000 .
  • a charging confirmation prompt 1821 may also be displayed on the charging service card 1802 .
  • the confirmation charging prompt 1821 may be used to prompt the user whether to start charging.
  • the charging confirmation prompt 1821 may be a text type prompt message: "arrived at charging station A, whether to start charging".
  • a later inquiry control may also be displayed on the charging service card 1802 . The later query control can be used to trigger the electronic device 100 to display the charging service card 1802 as shown in FIG.
  • the charging service card 1802 may also display a charging rejection control, which may be used to trigger the electronic device 100 to send a charging rejection response to the server 1000, and the server 1000 may notify the charging device 1100 to cancel charging the vehicle device 900.
  • after the electronic device 100 receives the user's input on the charging start control 1822, in response to the input, it may send a charging start response to the server 1000. After receiving the charging start response, the server 1000 may send the parking location information to the charging device 1100. After receiving the parking location information, the charging device 1100 may go to the location indicated by the parking location information. After the charging device 1100 arrives at the place indicated by the parking location information, it can also confirm, through the vehicle identification information, whether the vehicle parked at the place is the vehicle-machine device 900. After the charging device 1100 determines that the vehicle is the vehicle-machine device 900, it can start charging the in-vehicle device 900.
  • the charging service card 1802 displays vehicle charging prompt information 1831, and the vehicle charging prompt information 1831 may include one or more of text prompt information, picture prompt information, animation prompt information, and voice prompt information.
  • the vehicle charging prompt information 1831 may be used to remind the user that the vehicle-machine device 900 is being charged.
  • the vehicle charging prompt information 1831 may also be used to remind the user of the real-time battery capacity of the vehicle-machine device 900 .
  • the vehicle charging prompt information 1831 may also be used to remind the user of the charging time of the vehicle-machine device 900 .
  • the vehicle charging prompt information 1831 may include text prompt information: "charging, and charging is expected to be completed in 1 hour", and the vehicle charging prompt information 1831 may also include text prompt information: "current power: 20%".
  • a charge cancel control 1832 may also be displayed on the charge service card 1802 , and the charge cancel control 1832 may be used to trigger the electronic device 100 to send charge cancel information to the server 1000 .
  • the server 1000 may notify the charging device 1100 to stop charging the in-vehicle device 900 .
  • the electronic device 100 is not limited to displaying the content displayed in the charging service card 1802 shown in FIGS. 18A-18E in the form of a card.
  • the electronic device 100 may display the content displayed in the charging service card 1802 on the interface of the car charging application, which is not limited in this embodiment of the present application.
  • the charging device 1100 can send the power of the in-vehicle device 900 to the electronic device 100 every preset time (for example, 1 s).
  • the in-vehicle device 900 may send the electric power of the in-vehicle device 900 to the electronic device 100 when the value of the electric power changes, for example, from 20% to 21%.
  • the electronic device 100 may display charging station options corresponding to charging stations available to the user, and display navigation information to the charging station after the user selects a certain charging station option.
  • the electronic device 100 can also display the power of the in-vehicle device 900 in real time, and the user can check the charging status of the in-vehicle device 900 in real time.
  • the operations performed by the above-mentioned electronic device 100 may be performed by the in-vehicle device 900 .
  • the electronic device 100 may obtain the vehicle charging information from the in-vehicle device 900, and display vehicle charging prompt information based on the vehicle charging information.
  • the user can leave the charging station where the vehicle-machine device 900 is located during the charging process of the vehicle-machine device 900 .
  • the user can know the charging status of the in-vehicle device 900 through the electronic device 100 .
  • the method includes:
  • the electronic device 100 detects a scene to be charged.
  • the scene to be charged may include but not limited to a low battery scene, a parking lot scene, a destination scene and the like.
  • the electronic device 100 may acquire the power of the in-vehicle device 900 at intervals of a preset time (for example, 1 second), and when the electronic device 100 determines that the power of the in-vehicle device 900 is lower than a preset power threshold (for example, 20%), it determines that the current scene is a low battery scene.
  • the electronic device 100 can also acquire the image of the road ahead through the on-board camera of the vehicle-machine device 900 (for example, a driving recorder), and use an image recognition algorithm to identify whether the image of the road ahead includes parking lot entrance information (for example, a parking lot sign, etc.).
  • when parking lot entrance information is recognized in the image of the road ahead, the electronic device 100 may determine that the current scene is a parking lot scene.
  • the electronic device 100 may obtain the location information of the vehicle-machine device 900 through a global navigation and positioning system, and may also obtain the location information of a parking lot near the vehicle-machine device 900 through a map server.
  • when the electronic device 100 detects that the distance between the vehicle-machine device 900 and a parking lot is less than a specified distance threshold (for example, 10 meters), it may also determine that the current scene is a parking lot scene.
  • the electronic device 100 may also store the user's historical parking locations (e.g., work locations). When the electronic device 100 detects that the distance between the in-vehicle device 900 and a historical parking place is less than a specified distance threshold, it determines that the current scene is the destination scene. Alternatively, the electronic device 100 may obtain the destination address input by the user, and when the electronic device 100 detects that the distance between the in-vehicle device 900 and the destination is less than a specified distance threshold, it determines that the current scene is the destination scene.
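The distance-threshold checks above can be sketched as follows; the haversine helper, the coordinates, and the 10-meter default are illustrative assumptions rather than the patent's stated implementation:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * asin(sqrt(a))

def is_destination_scene(vehicle_pos, saved_places, threshold_m=10):
    """Destination scene: the vehicle is within the threshold of a stored
    historical parking place or a user-entered destination address."""
    return any(haversine_m(*vehicle_pos, *place) < threshold_m
               for place in saved_places)

work = (39.9042, 116.4074)  # an assumed stored historical parking place
print(is_destination_scene((39.90421, 116.40741), [work]))  # within ~1.5 m -> True
```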
  • a specified distance threshold e.g., work locations
  • the electronic device 100 may determine the power consumed by the in-vehicle device 900 to reach the destination, and compare whether the consumed power is greater than the remaining power of the in-vehicle device 900.
  • when the electronic device 100 determines that the power consumed by the in-vehicle device 900 to reach the destination is greater than the remaining power of the in-vehicle device 900, it may calculate the difference between the consumed power and the remaining power.
  • when the power consumed by the vehicle-machine device 900 to reach the destination is greater than its remaining power, the electronic device 100 may acquire charging station information near the driving route of the vehicle-machine device 900, and obtain and display charging service information based on the charging station information near the driving route and the charging car information.
  • the electronic device 100 may obtain charging station information near the driving route of the vehicle-machine device 900 when the remaining power of the vehicle-machine device 900 is less than the power it would consume over the remaining distance, and obtain and display charging service information based on the charging station information near the driving route and the charging car information.
  • the electronic device 100 may acquire destination information of the user, where the destination information includes a destination address and a route to the destination.
  • the destination route may be obtained by the electronic device 100 from a map server based on the location of the electronic device 100 and the destination address.
  • when the electronic device 100 determines that the power of the car-machine device 900 is lower than the power the car-machine device 900 would consume traveling to the destination along the route to the destination, the electronic device 100 obtains charging information of one or more charging stations.
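The power-sufficiency decision in this step can be sketched as a simple comparison; the per-kilometer consumption figure and the helper name are assumed constants for illustration:

```python
def needs_charging_stop(remaining_kwh, route_km, consumption_kwh_per_km=0.15):
    """Compare the energy needed to finish the route with the remaining
    charge; a positive shortfall means a charging stop is required."""
    needed_kwh = route_km * consumption_kwh_per_km
    shortfall = needed_kwh - remaining_kwh
    return (shortfall > 0, max(shortfall, 0.0))

# 12 kWh left, 100 km still to drive at an assumed 0.15 kWh/km.
stop_needed, deficit_kwh = needs_charging_stop(12.0, 100.0)
print(stop_needed, round(deficit_kwh, 2))  # True 3.0
```

The deficit (the difference between consumed and remaining power mentioned above) could then drive how much charge to request at the selected station.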
  • steps S1902 and S1903 may be performed. It should be noted that the embodiment of the present application does not limit the execution order of step S1902 and step S1903; for example, the electronic device 100 may first execute step S1902, or the electronic device 100 may first execute step S1903, or the electronic device 100 may execute step S1902 and step S1903 synchronously.
  • the electronic device 100 may not execute step S1901, and directly execute steps S1902-S1904.
  • the electronic device 100 acquires charging station information (including information about the first charging station) from the server 1000 .
  • the server 1000 may be any server storing charging station information of multiple charging stations, for example, the server 1000 may be a server corresponding to the above-mentioned car charging application.
  • the multiple charging stations include the first charging station.
  • Charging station information may include, but is not limited to, identification information (for example, the name) of the charging station, the number of idle (non-working) charging devices in the charging station, the charging power of those charging devices, the charging interface types of those charging devices (for example, five-hole three-pin, nine-hole two-pin, etc.), the location of the charging station, the charging cost per unit of electricity, and so on.
  • the server 1000 may send charging station information (ie, charging information of one or more charging stations) to the electronic device 100 .
  • the server 1000 may only send to the electronic device 100 the charging station information corresponding to the charging station including the charging device that is not working.
  • the electronic device 100 may also acquire charging station information through historical transaction records with the charging station, location-based services (location based services, LBS), wireless beacon (Beacon) scanning, and the like.
  • the electronic device 100 acquires the charging car information from the car-machine device 900 .
  • the charging vehicle information may include but not limited to the charging interface model of the vehicle-machine device 900 , the remaining power of the vehicle-machine device 900 , the battery capacity of the vehicle-machine device 900 , the location of the vehicle-machine device 900 , historical charging records and so on.
  • the electronic device 100 obtains and displays the charging service information based on the charging station information and the charging car information; the charging service information includes one or more charging station options, the one or more charging station options include the first charging station option, and the first charging station option corresponds to the first charging station.
  • the charging station option includes the identification information of the charging station, the estimated charging time, the estimated charging fee and the distance to be traveled.
  • the identification information of the charging station may be used to indicate the charging station.
  • the estimated charging duration can be used to represent the time required to fully charge the vehicle-machine device 900.
  • the estimated charging cost can be used to represent the cost required to fully charge the in-vehicle device 900 .
  • the distance to be traveled may be used to indicate the distance from the vehicle-machine device 900 to the charging station.
  • the description of the charging station option obtained by the electronic device 100 is as follows:
  • based on the number of non-working charging devices at each charging station, the charging interface models of those charging devices, and the charging interface model of the vehicle-machine device 900, the electronic device 100 can filter out one or more charging stations whose number of non-working charging devices is greater than zero,
  • and whose charging devices have a charging interface type matching the interface type of the vehicle-machine device 900.
  • for each of the selected one or more charging stations, the electronic device 100 obtains the distance between the vehicle-machine device 900 and that charging station (also referred to as the distance to be traveled) based on the location of the charging station and the location of the vehicle-machine device 900.
  • the electronic device 100 can also calculate the distance that the in-vehicle device 900 can travel before running out of power (also referred to as the travelable distance) based on the remaining power of the in-vehicle device 900 .
  • the electronic device 100 may filter out, from the one or more charging stations, the charging stations whose distance to be traveled is less than the travelable distance.
  • the charging stations whose distance to travel is less than the travelable distance obtained through screening may be referred to as pre-selected charging stations.
  • the electronic device 100 can calculate the time required for charging at each pre-selected charging station (i.e., the estimated charging time) and the cost (i.e., the estimated charging cost) based on the charging power of the non-working charging devices at the pre-selected charging station, the charging fee per unit of electricity, and the remaining power and battery capacity of the vehicle-machine device 900.
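The estimate described above can be sketched as follows. This is a minimal illustration, assuming linear charging at a constant power and a flat per-unit tariff; the function and parameter names are illustrative and not taken from the application:

```python
def charging_estimates(remaining_kwh, capacity_kwh, charger_power_kw, fee_per_kwh):
    """Estimated charging time (hours) and estimated charging cost to fully
    charge the vehicle, assuming constant charging power and a flat tariff."""
    energy_needed_kwh = capacity_kwh - remaining_kwh
    est_time_h = energy_needed_kwh / charger_power_kw
    est_cost = energy_needed_kwh * fee_per_kwh
    return est_time_h, est_cost
```

In practice the estimate would also account for tapering charge curves and tiered tariffs; this sketch only shows the arithmetic the option display relies on.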
  • the electronic device 100 may display the one or more charging station options.
  • the electronic device 100 may display the one or more charging station options through the charging service card 1802 shown in FIG. 18B .
  • the electronic device 100 may obtain the distance to be traveled from the vehicle-machine device 900 to each charging station based on the positions of the vehicle-machine device 900 and each charging station. Based on the distance to be traveled and the driving speed of the vehicle-machine device 900, the arrival time point of the vehicle-machine device 900 at each charging station is obtained, and the charging station information of stations that will have unused charging equipment after the arrival time point is obtained from the server 1000. The electronic device 100 then obtains charging station options based on the charging station information and the charging vehicle information. In this way, when the vehicle-machine device 900 arrives at the charging station, a charging station with unused charging equipment can be provided, thereby improving the utilization rate of the charging equipment.
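The arrival-time step above can be sketched as follows, under the simplifying assumption of a constant driving speed (the names and the station tuple layout are hypothetical):

```python
def reachable_arrivals(now_h, speed_kmh, stations):
    """For each station (name, distance_km), compute the estimated arrival
    time assuming constant driving speed, so the server can then be asked for
    stations with unused charging devices after that time point."""
    return {name: now_h + dist_km / speed_kmh for name, dist_km in stations}
```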
  • the electronic device 100 can set priorities for the one or more charging station options based on one or more of parameters such as the estimated charging cost, the estimated charging time, and the distance to be traveled, and,
  • according to the priorities of the one or more charging station options, set the positions of the one or more charging station options on the display screen of the electronic device 100.
  • the charging station option with higher priority is closer to the status bar on the display screen of the electronic device 100 .
  • electronic device 100 may prioritize the one or more charging station options based on estimated charging costs.
  • the electronic device 100 may set a higher priority for a charging station option with a lower estimated charging fee.
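The fee-based prioritization described above can be sketched as a simple sort; the option dictionary structure is an assumption for illustration:

```python
def prioritize_options(options):
    """Order charging station options so that a lower estimated charging fee
    gets a higher priority (appears first, i.e. closer to the status bar)."""
    return sorted(options, key=lambda opt: opt["estimated_fee"])
```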
  • the electronic device 100 stores historical charging records, or the electronic device 100 may obtain the historical charging records from the vehicle-machine device 900 .
  • the historical charging record includes the charging information of charging stations where the vehicle-machine device 900 was charged before (for example, the name of the charging station, the location of the charging station, the number of times of charging at the charging station, etc.).
  • the electronic device 100 may set the priority of the charging station option corresponding to the charging station with the most charging times in the vicinity of the vehicle-machine device 900 (for example, within a radius of 1 km around the vehicle-machine device 900) to be the highest.
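The selection of the most-frequented nearby charging station can be sketched as follows; the record fields and the straight-line distance function are illustrative assumptions:

```python
def euclid(p, q):
    # straight-line distance between two (x, y) positions, in km
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def preferred_station(history, vehicle_pos, radius_km, distance_fn=euclid):
    """Among charging stations in the historical charging record that lie
    within radius_km of the vehicle, pick the one charged at most often."""
    nearby = [rec for rec in history
              if distance_fn(vehicle_pos, rec["location"]) <= radius_km]
    if not nearby:
        return None
    return max(nearby, key=lambda rec: rec["times_charged"])["name"]
```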
  • the first charging station option includes identification information of the first charging station, estimated charging time, and the like.
  • the first charging station option may be used to trigger the electronic device 100 to select a charging device (for example, the charging device 1100 ) at the first charging station.
  • the electronic device 100 receives the user's input for the first charging station option.
  • the input for the first charging station option may include, but is not limited to, a single click, a double click, a long press, and so on.
  • the input may be the input for charging station option 1804A shown in FIG. 18B described above.
  • the electronic device 100 sends a charging service reservation request to the server 1000 , the charging service reservation request includes vehicle identification information and charging station identification information, wherein the vehicle identification information can be used to indicate the vehicle-machine device 900 .
  • the charging station identification information may be used to indicate the first charging station.
  • the electronic device 100 may send a charging service reservation request to the server 1000 in response to the input.
  • the charging service reservation request includes vehicle identification information and charging station identification information, wherein the vehicle identification information may be used to indicate the vehicle-machine device 900 .
  • the vehicle identification information may include, but is not limited to, the license plate number, model, color, etc. of the vehicle-machine device.
  • the charging station identification information is used to indicate the first charging station corresponding to the first charging station option.
  • the server 1000 sends the vehicle identification information to the charging device 1100.
  • the server 1000 may determine, based on the charging station identification information, that the vehicle-machine device 900 will be charged by an unused charging device of the first charging station.
  • the server 1000 may send the vehicle identification information to an unused charging device of the first charging station, for example, the charging device 1100.
  • After the charging device 1100 receives the vehicle identification information, the charging device 1100 cannot be used by vehicle-machine devices other than the vehicle-machine device 900.
  • the electronic device 100 may display navigation information to the first charging station corresponding to the first charging station option (for example, a navigation route from the location of the electronic device 100 to the first charging station).
  • the first charging station may be charging station A, and the electronic device 100 may display the above-mentioned navigation image 1813 shown in FIG. 18C after receiving an input for the option of the first charging station.
  • the server 1000 detects that the in-vehicle device 900 has driven into the first charging station, and may obtain parking location information of the in-vehicle device 900 .
  • the server 1000 can detect whether the in-vehicle device 900 drives into the first charging station in various ways. In some embodiments, the server 1000 can detect whether the in-vehicle device 900 drives into the first charging station through a camera of the first charging station or a fully automatic electronic toll collection system (electronic toll collection, ETC). Specifically, the server 1000 may acquire an image of a vehicle entering the first charging station through a camera at an entrance of the first charging station. The server 1000 can identify the vehicle identification information in the vehicle image through an image recognition algorithm. The server 1000 may confirm whether the vehicle-machine device in the vehicle image is the vehicle-machine device 900 based on the vehicle identification information.
  • When the server 1000 determines that the vehicle-machine device in the vehicle image is the vehicle-machine device 900, it can determine that the vehicle-machine device 900 has driven into the first charging station. Alternatively, the server 1000 can automatically identify the license plate number of a vehicle driving into the first charging station through ETC, and determine whether the vehicle is the vehicle-machine device 900 based on the license plate number. When the server 1000 determines that the vehicle is the vehicle-machine device 900, it can determine that the vehicle-machine device 900 has driven into the first charging station.
  • the server 1000 may acquire the location of the vehicle-machine device 900 through the electronic device 100 at preset time intervals, and when it determines that the location of the vehicle-machine device 900 overlaps with the location of the first charging station, determine that the vehicle-machine
  • device 900 has driven into the first charging station.
  • the electronic device 100 may send signaling to the server 1000 after the vehicle-machine device 900 drives into the first charging station, to indicate that the vehicle-machine device 900 has driven into the first charging station.
  • after receiving the signaling, the server 1000 may determine that the vehicle-machine device 900 has driven into the first charging station.
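The periodic location check described above can be sketched as a simple geofence test; the overlap radius and the distance function are illustrative assumptions:

```python
def entered_station(vehicle_pos, station_pos, overlap_radius_km=0.05):
    """Geofence check polled at preset intervals: the vehicle is considered
    to have driven into the station when its reported position falls within
    a small radius of the station's position."""
    dx = vehicle_pos[0] - station_pos[0]
    dy = vehicle_pos[1] - station_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= overlap_radius_km
```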
  • the server 1000 may acquire the parking location information of the in-vehicle device 900 .
  • the parking position information may be used to indicate the position of the vehicle-machine equipment 900 in the first charging station.
  • the parking location information may include one or more of a parking area number, a parking space number, an indoor positioning fingerprint, and an indoor GPS signal.
  • the server 1000 can acquire the parking location information of the vehicle-machine equipment 900 in various ways.
  • the server 1000 may obtain the parking area number and the parking space number of the parking position of the vehicle-machine device 900 through the camera of the first charging station.
  • the electronic device 100 may obtain the parking area number, the parking space number, etc. of the parking location of the car-machine device 900 through the camera of the car-machine device 900 .
  • the server 1000 may send the query location information to the electronic device 100. After the electronic device 100 receives the query location information, it may display location prompt information, and the location prompt information may be used to prompt the user to input the parking location information (for example, the parking space number).
  • the electronic device 100 may receive the parking location information input by the user, and send the parking location information to the server 1000 .
  • the server 1000 may send a charging start request to the electronic device 100 .
  • After the server 1000 detects that the vehicle-machine device 900 has driven into the first charging station, it may send a charging start request to the electronic device 100.
  • the charging start request may be used to instruct the electronic device 100 to display a charging start control.
  • the electronic device 100 may display a charging start control.
  • the charging start control may be used to trigger the electronic device 100 to send a charging start response to the server 1000 .
  • the electronic device 100 receives a user's input on the control to start charging.
  • the input for the charging start control may be a single click, a double click, a long press, and so on.
  • the input may be an input to the start charging control 1822 shown in FIG. 18D described above.
  • the electronic device 100 sends a charging start response to the server 1000.
  • the electronic device 100 may send a charging start response to the server 1000 in response to the input.
  • the charging start response may be used to instruct the server 1000 to notify the charging device 1100 to charge the in-vehicle device 900 .
  • the electronic device 100 may display the charging start control while displaying the navigation information.
  • the server 1000 does not need to detect whether the vehicle-machine device 900 has driven into the first charging station; the electronic device 100 may send a charging start request to the server 1000 when receiving the user's input on the charging start control.
  • after receiving the charging start request, the server 1000 may determine that the vehicle-machine device 900 has driven into the first charging station.
  • the server 1000 can obtain the parking location information of the vehicle-machine device 900 after determining that the vehicle-machine device 900 has driven into the first charging station, which will not be repeated here.
  • the server 1000 sends the parking location information to the charging device 1100.
  • the server 1000 may send the parking location information to the charging device 1100 .
  • the charging device 1100 may acquire the location of the in-vehicle device 900 in the first charging station based on the parking location information.
  • the charging device 1100 can move to the position of the in-vehicle device 900 to charge the in-vehicle device 900 .
  • the charging device 1100 can confirm that the vehicle parked at the location is the vehicle-machine device 900 based on the vehicle identification information, and then charge the vehicle-machine device 900 .
  • the charging device 1100 sends the vehicle charging information to the server 1000.
  • the charging device 1100 can acquire the vehicle charging information of the vehicle-machine device 900 after connecting its charging interface with the charging interface of the vehicle-machine device 900, and send the vehicle charging information to the server 1000.
  • the vehicle charging information includes the electric quantity of the vehicle-machine device 900 .
  • the vehicle charging information may be used to indicate that the vehicle-machine device 900 is being charged.
  • the server 1000 sends the vehicle charging information to the electronic device 100 .
  • the server 1000 may send the vehicle charging information to the electronic device 100 .
  • the electronic device 100 displays vehicle charging information.
  • the electronic device 100 may display vehicle charging prompt information.
  • the vehicle charging prompt information may be used to prompt the user that the vehicle-machine device 900 is being charged.
  • the vehicle charging prompt information may also be used to remind the user of the real-time power of the vehicle-machine device 900 .
  • the vehicle charging prompt information may refer to the above-mentioned embodiment shown in FIG. 18E , which will not be repeated here.
  • the charging device 1100 may send vehicle charging information to the electronic device 100 every preset time (for example, 1 second).
  • the in-vehicle device 900 may send vehicle charging information to the electronic device 100 when the value of the electric quantity changes, for example, from 20% to 21%.
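The change-triggered update described above (sending only when the displayed percentage changes, e.g. 20% to 21%) can be sketched as follows; the callback-based design is an illustrative assumption:

```python
class ChargeReporter:
    """Send a charging update only when the displayed battery percentage
    actually changes, instead of on every sample."""

    def __init__(self, send):
        self._send = send          # callback that delivers the update
        self._last_pct = None

    def sample(self, pct):
        pct = int(pct)             # percentage shown to the user
        if pct != self._last_pct:
            self._last_pct = pct
            self._send(pct)
```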
  • the electronic device 100 may directly acquire the power information of the in-vehicle device 900 from the in-vehicle device 900 and display it.
  • a detection method provided in the embodiment of the present application is introduced below.
  • Fig. 20 shows a schematic flowchart of a detection method provided in the embodiment of the present application.
  • the detection method includes the following steps:
  • the step of determining the predicted sobering time may be performed by other electronic devices, for example, a cloud server.
  • the steps above may be performed by the electronic device 100 or the electronic device 200 shown in FIG. 2.
  • a detection method provided in the embodiment of the present application is introduced below.
  • Fig. 21 shows a schematic flowchart of a detection method provided in the embodiment of the present application.
  • the detection method includes the following steps:
  • the step of determining the first recommended driving duration may be performed by other electronic devices, for example, a cloud server.
  • the step of acquiring user behavior data may be performed by other electronic devices, for example, the electronic device 500 shown in FIG. 8 .
  • the step of displaying the first recommended driving duration may be performed by other electronic devices, for example, the electronic device 500 shown in FIG. 8 .
  • acquiring the user's behavior data specifically includes: acquiring the user's travel time, and acquiring the user's behavior data at a first time before the travel time, where there is a preset time difference between the travel time and the first time.
  • the travel time is the departure time shown in the above-mentioned FIGS. 8-12.
  • the first time is the trigger time shown in the above-mentioned FIGS. 8-12.
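The relationship between the travel (departure) time and the first (trigger) time can be sketched as below; the one-hour default offset is an illustrative assumption:

```python
from datetime import datetime, timedelta

def trigger_time(departure, preset_diff_hours=1):
    """The first time (trigger time) at which behavior data is collected,
    a preset interval before the user's travel (departure) time."""
    return departure - timedelta(hours=preset_diff_hours)
```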
  • a detection method provided in the embodiment of the present application is introduced below.
  • Fig. 22 shows a schematic flowchart of a detection method provided in the embodiment of the present application.
  • the detection method includes the following steps:
  • the first electronic device detects a passenger's boarding operation, and acquires an in-vehicle image of the passenger before boarding.
  • the first electronic device establishes a communication connection with the second electronic device.
  • the first electronic device detects the passenger's getting off operation, and acquires an in-vehicle image of the passenger after getting off the car.
  • the first electronic device determines that the passenger's items are left in the vehicle, and broadcasts a first missing prompt message.
  • the first omission prompt information is used to remind passengers that items are left in the vehicle.
  • the first electronic device sends the missing item indication information to the second electronic device through the communication connection.
  • the second electronic device displays the second missing prompt information.
  • the second missing prompt information is used to remind the passenger that the item is left in the vehicle.
  • the first electronic device may be the vehicle-machine device 900 shown in FIGS. 13-17B above.
  • For a detailed description of the execution of the above steps by the vehicle-machine device 900, reference may be made to the aforementioned embodiments shown in FIGS. 13-17B, which will not be repeated here.
  • the second electronic device may be the electronic device 100 shown in FIGS. 13-17B above.
  • For a detailed description of the execution of the above steps by the electronic device 100, reference may be made to the aforementioned embodiments shown in FIGS. 13-17B, which will not be repeated here.
  • the first electronic device and the second electronic device may form the first communication system.
  • a detection method provided in the embodiment of the present application is introduced below.
  • Fig. 23 shows a schematic flowchart of a detection method provided in the embodiment of the present application.
  • the detection method includes the following steps:
  • the first electronic device acquires charging information of one or more charging stations.
  • the first electronic device displays one or more charging station options based on the charging information of the one or more charging stations, where the one or more charging station options include the first charging station option.
  • the first electronic device receives an input for the first charging station option, and displays first navigation information, where the first navigation information is used to indicate a route from the first electronic device to the charging station corresponding to the first charging station option.
  • the server detects that the first electronic device has arrived at the first charging station, and acquires parking location information of the first electronic device in the first charging station.
  • the server sends the parking location information to the charging device.
  • the charging device arrives at a position in the first charging station indicated by the parking position information, and charges the first electronic device.
  • the first electronic device may be the in-vehicle device 900 shown in FIGS. 18A-19 above.
  • For a detailed description of the execution of the above steps by the in-vehicle device 900, reference may be made to the aforementioned embodiments shown in FIGS. 18A-19, which will not be repeated here.
  • the server may be the server 1000 shown in FIGS. 18A-19 above.
  • For a detailed description of the server 1000 performing the above steps, reference may be made to the foregoing embodiments shown in FIGS. 18A-19, which will not be repeated here.
  • the charging device may be the charging device 1100 shown in FIGS. 18A-19 above.
  • For a detailed description of the charging device 1100 performing the above steps, reference may be made to the foregoing embodiments shown in FIGS. 18A-19, and details are not repeated here.
  • the first electronic device, the server and the charging device may form the second communication system.
  • the first electronic device may also be the electronic device 100 shown in FIGS. 18A-19, and the device that
  • is charged is the in-vehicle device 900 shown in FIGS. 18A-19.
  • the above detection methods shown in Fig. 20-Fig. 21 can be used in combination with each other.
  • the electronic device 100 described above in FIGS. 20-21 may be the same electronic device.
  • the electronic device 100 may execute the steps in the above embodiments shown in FIGS. 20-21 , which is not limited in the present application.


Abstract

This application discloses a detection method. When a user needs to drive, an electronic device can acquire physiological information parameters, alcohol intake parameters, blood alcohol concentration parameters, and collection time parameters, and predict the user's sobering-up time based on these parameters, preventing the user from driving under the influence of alcohol. The electronic device can also acquire the user's behavior data, physical condition data, and in-vehicle driving data, and obtain a recommended driving duration for the user based on these data, preventing the user from driving while fatigued. After detecting a to-be-charged scenario, the electronic device can also acquire charging station information and charging vehicle information, and obtain a first charging station option based on this information. The electronic device can display navigation information to the first charging station, making it convenient for the user to charge. When the user travels by taxi, the vehicle-machine device can acquire an in-vehicle image from before the user boarded and an in-vehicle image from after the user got off, and, when it determines based on these images that the user's items have been left in the vehicle, prompt the user that something has been left behind.

Description

A detection method and apparatus
This application claims priority to Chinese patent application No. 202111667026.8, entitled "A detection method and apparatus", filed with the China Patent Office on December 30, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of sensor technology, and in particular to a detection method and apparatus.
Background
At present, with the development of society, traveling by car has become more and more common. Driving or taking a taxi makes it convenient to reach a destination and better meets people's daily travel needs. However, in recent years various driving-related problems have also appeared frequently, such as drunk driving, fatigued driving, inconvenient charging of electric vehicles, and belongings left behind.
At present, for some driving scenarios, electronic devices cannot provide users with effective reminder information accurately or intelligently, resulting in a poor user experience. Therefore, reminders and services based on driving travel urgently need to be improved.
Summary
This application provides a detection method and apparatus, so that when a user faces driving-related problems (such as drunk driving, fatigued driving, inconvenient charging of electric vehicles, or belongings left behind), travel prompts or services targeting these problems are provided, improving the user experience.
In a first aspect, this application provides a detection method, including: acquiring a physiological information parameter, a blood alcohol concentration parameter, and a collection time parameter at which the blood alcohol concentration parameter was collected; determining a predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter, where the predicted sobering-up time is used to indicate the point in time at which the user's blood alcohol concentration falls below a threshold blood alcohol concentration; and displaying the predicted sobering-up time.
In this way, after drinking, the user can use the detection method provided in this application to determine how long it will be before they sober up, which can prevent the user from driving while drunk and causing loss of life or property to themselves or others.
In a possible implementation, the physiological information parameter includes one or more of weight, height, age, gender, sleep duration, and sleep quality.
In this way, a more accurate predicted sobering-up time can be obtained based on the user's own physiological information.
In a possible implementation, determining the predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter specifically includes: determining the predicted sobering-up time through an alcohol prediction model based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter.
In a possible implementation, before acquiring the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter, the method further includes: receiving a first input; and acquiring the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter specifically includes: acquiring the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter in response to the first input.
In a possible implementation, before determining the predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter, the method further includes: receiving a second input; and determining the predicted sobering-up time specifically includes: determining the predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter in response to the second input.
In a possible implementation, determining the predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter specifically includes: acquiring an alcohol intake parameter; and determining the predicted sobering-up time based on the physiological information parameter, the alcohol intake parameter, the blood alcohol concentration parameter, and the collection time parameter.
In this way, a more accurate predicted sobering-up time can be obtained based on the amount of alcohol the user has consumed.
In a possible implementation, acquiring the alcohol intake parameter specifically includes: acquiring an image of the container of the consumed drink through a camera; and determining the alcohol intake parameter based on the container image.
In a possible implementation, the alcohol intake parameter includes an alcohol strength parameter and a drink volume parameter, where the alcohol strength parameter is used to indicate the alcohol content of the drink consumed by the user, and the drink volume parameter is used to indicate the volume of the drink consumed by the user.
In a possible implementation, determining the predicted sobering-up time based on the physiological information parameter, the alcohol intake parameter, the blood alcohol concentration parameter, and the collection time parameter specifically includes: obtaining a predicted alcohol absorption rate and a predicted alcohol metabolism rate through the alcohol prediction model based on the alcohol intake parameter and the physiological information parameter; obtaining a correspondence between blood alcohol concentration and time based on the physiological information parameter, the alcohol intake parameter, the predicted alcohol absorption rate, and the predicted alcohol metabolism rate; and determining the predicted sobering-up time based on the blood alcohol concentration parameter, the collection time parameter, and the correspondence between blood alcohol concentration and time.
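The correspondence between blood alcohol concentration and time is not specified in detail here. As a rough illustration only, a Widmark-style model with zero-order (constant-rate) elimination yields a predicted sobering-up time as below; the elimination rate and threshold constants are illustrative assumptions, not values from this application:

```python
def predicted_sober_time(bac_measured, t_measured_h,
                         elimination_rate=0.15, bac_threshold=0.02):
    """Predicted sobering-up time (hours-of-day) assuming the blood alcohol
    concentration (in per mille) falls linearly at elimination_rate per hour
    from the measured value at the collection time t_measured_h."""
    if bac_measured <= bac_threshold:
        return t_measured_h
    return t_measured_h + (bac_measured - bac_threshold) / elimination_rate
```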
In a second aspect, this application provides another detection method, including: acquiring behavior data of a user; determining the user's pre-driving fatigue level based on the user's behavior data; determining a first recommended driving duration for the user based on the user's pre-driving fatigue level; and displaying the first recommended driving duration.
In this way, the user can obtain a recommended driving duration before setting off by car, and can determine how long they may drive based on that recommended driving duration. It can be understood that when the recommended driving duration is zero, it can be used to indicate that the user is not fit to drive. In this way, fatigued driving, which endangers the life and property of the user or others, can be avoided.
In a possible implementation, acquiring the user's behavior data specifically includes: acquiring the user's travel time; and acquiring the user's behavior data at a first time before the travel time, where there is a preset time difference between the first time and the travel time.
In a possible implementation, acquiring the user's travel time specifically includes: acquiring the user's schedule information, where the schedule information includes one or more of the user's ticket information, meeting information, and agenda information; and acquiring the user's travel time based on the user's schedule information.
In a possible implementation, the method further includes: acquiring physical state data of the user while the vehicle is being driven; determining the user's in-driving fatigue level based on the user's physical state data; determining the user's final fatigue level based on the user's pre-driving fatigue level and in-driving fatigue level; determining a second recommended driving duration based on the user's final fatigue level; and displaying the second recommended driving duration.
In this way, the user obtains a recommended driving duration while driving, avoiding fatigued driving.
In a possible implementation, determining the user's in-driving fatigue level based on the user's physical state data specifically includes: determining the in-driving fatigue level through a second fatigue model based on the user's physical state data, where the second fatigue model is trained on the user's historical physical state data.
In a possible implementation, determining the user's in-driving fatigue level based on the user's physical state data specifically includes: acquiring in-vehicle driving data of the user while the vehicle is being driven; and determining the user's in-driving fatigue level based on the user's physical state data and the user's in-vehicle driving data.
In a possible implementation, determining the user's in-driving fatigue level based on the user's physical state data and in-vehicle driving data specifically includes: determining a second fatigue model based on the user's physical state data; and determining the in-driving fatigue level through the second fatigue model based on the user's in-vehicle driving data and physical state data.
In a possible implementation, acquiring the user's behavior data specifically includes: acquiring the user's user data, where the user data includes one or more of exercise duration, exercise intensity, and sleep duration; and determining the user's behavior data based on the user data.
In a possible implementation, determining the user's pre-driving fatigue level based on the user's behavior data specifically includes: determining the user's pre-driving fatigue level through a first fatigue model based on the user's behavior data, where the first fatigue model is trained on the user's historical behavior data.
In a third aspect, this application provides another detection method, applied to a first communication system, where the first communication system includes a first electronic device and a second electronic device. The method includes: the first electronic device detects a passenger's boarding operation and acquires an in-vehicle image from before the passenger boarded; the first electronic device and the second electronic device establish a communication connection; the first electronic device detects the passenger's alighting operation and acquires an in-vehicle image from after the passenger got off; when the first electronic device determines, based on the in-vehicle image from before boarding and the in-vehicle image from after alighting, that the passenger's items have been left in the vehicle, it broadcasts first omission prompt information, where the first omission prompt information is used to prompt that the passenger's items have been left in the vehicle; the first electronic device sends item omission indication information to the second electronic device through the communication connection; and the second electronic device displays second omission prompt information based on the item omission indication information, where the second omission prompt information is used to prompt that the passenger's items have been left behind.
In this way, when the passenger's items are left in the vehicle, both the driver and the passenger can receive a prompt, preventing the passenger's items from being left in the vehicle. Likewise, this avoids taking up the passenger's and the driver's time when the passenger retrieves the items.
In a possible implementation, the second electronic device is the electronic device with the strongest signal among all electronic devices detected by the first electronic device.
In a possible implementation, after the first electronic device and the second electronic device establish the communication connection, the method further includes: the first electronic device sends motion information of the first electronic device to the second electronic device through the communication connection; when the second electronic device determines, based on the motion information of the first electronic device, that the motion state of the first electronic device is the same as the motion state of the second electronic device, the second electronic device sends a confirmation success signaling to the first electronic device; and the first electronic device receives the confirmation success signaling and maintains the communication connection with the second electronic device.
In this way, it can be ensured that the first electronic device and the second electronic device are in the same vehicle, ensuring that the passenger can receive the second omission prompt information.
In a possible implementation, the second electronic device determining, based on the motion information of the first electronic device, that the motion state of the first electronic device is the same as that of the second electronic device specifically includes: when the second electronic device determines N consecutive times that the motion information of the first electronic device is the same as the motion information of the second electronic device, it determines that the motion state of the first electronic device is the same as that of the second electronic device, where N is a positive integer.
In a possible implementation, the second electronic device determining, based on the motion information of the first electronic device, that the motion state of the first electronic device is the same as that of the second electronic device specifically includes: when, over M determinations of whether the motion information of the first electronic device and the motion information of the second electronic device are the same, the second electronic device determines that the motion information of the two devices is the same at least N times, the second electronic device determines that the motion state of the first electronic device is the same as that of the second electronic device, where N is less than or equal to M, and both M and N are positive integers.
In a possible implementation, when the difference between the motion information of the first electronic device and the motion information of the second electronic device is less than a motion deviation threshold, the motion information of the first electronic device and the motion information of the second electronic device are considered the same.
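The N-of-M motion-state comparison described in the implementations above can be sketched as follows; the use of scalar motion samples and the threshold value are illustrative assumptions:

```python
def motion_matches(a, b, deviation_threshold=0.5):
    """Two motion samples (e.g. acceleration magnitudes) count as 'the same'
    when their difference is below the motion deviation threshold."""
    return abs(a - b) < deviation_threshold

def same_vehicle(samples_a, samples_b, n_required, deviation_threshold=0.5):
    """N-of-M check: the two devices are judged to be in the same vehicle
    when at least n_required of the paired samples match."""
    matches = sum(motion_matches(a, b, deviation_threshold)
                  for a, b in zip(samples_a, samples_b))
    return matches >= n_required
```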
In a possible implementation, after the first electronic device and the second electronic device establish the communication connection, the method further includes: the second electronic device sends motion information of the second electronic device to the first electronic device through the communication connection; when the first electronic device determines, based on the motion information of the second electronic device, that the motion state of the first electronic device is the same as that of the second electronic device, the first electronic device sends a confirmation success signaling to the second electronic device; and the second electronic device receives the confirmation success signaling and maintains the communication connection with the first electronic device.
In a possible implementation, the method further includes: when the second electronic device determines, based on the motion information of the first electronic device, that the motion state of the first electronic device is different from that of the second electronic device, the second electronic device disconnects the communication connection with the first electronic device.
In a possible implementation, the second electronic device disconnecting the communication connection with the first electronic device specifically includes: the second electronic device sends a confirmation failure signaling to the first electronic device; and the first electronic device receives the confirmation failure signaling and disconnects the communication connection with the second electronic device.
In a possible implementation, after the second electronic device disconnects the communication connection with the first electronic device, the method further includes: the second electronic device broadcasts a communication connection request.
In a possible implementation, the first electronic device and the second electronic device establishing a communication connection specifically includes: the second electronic device broadcasts a communication connection request; the first electronic device receives the communication connection request of the second electronic device; the first electronic device sends a communication connection response to the second electronic device; and the second electronic device receives the communication connection response of the first electronic device and establishes a communication connection with the first electronic device.
In a possible implementation, the first electronic device and the second electronic device establishing a communication connection specifically includes: after the first electronic device detects that a passenger has sat down in the vehicle, the first electronic device receives the communication connection request of the second electronic device; and the first electronic device sends a communication connection response to the second electronic device and establishes a communication connection with the second electronic device.
In a fourth aspect, this application provides another detection method, applied to a second communication system, where the second communication system includes a first electronic device, a server, and a charging device. The method includes: the first electronic device receives charging information of one or more charging stations sent by the server; the first electronic device displays one or more charging station options based on the charging information of the one or more charging stations, where the one or more charging station options include a first charging station option; after receiving an input for the first charging station option, the first electronic device displays first navigation information, where the first navigation information is used to indicate a route from the location of the first electronic device to the first charging station corresponding to the first charging station option; after the server detects that the first electronic device has arrived at the first charging station, the server acquires parking location information of the first electronic device in the first charging station; the server sends the parking location information to the charging device; and after the charging device arrives at the location in the first charging station indicated by the parking location information, it charges the first electronic device.
In this way, the user can quickly obtain the charging service provided by the server, and the charging device can find the first electronic device by itself and charge it, reducing the charging operations required of the user.
In a possible implementation, the first electronic device displaying one or more charging station options based on the charging information of the one or more charging stations specifically includes: the first electronic device determines the one or more charging station options based on the charging information of the one or more charging stations and the charging information of the first electronic device.
In a possible implementation, the one or more charging station options include a charging price, a charging time, and an arrival distance, where the charging price is used to indicate the cost required to fully charge the first electronic device, the charging time is used to indicate the time required to fully charge the first electronic device, and the arrival distance is used to indicate the distance between the first electronic device and the charging station corresponding to the charging station option.
In a possible implementation, the first electronic device receiving the charging information of the one or more charging stations sent by the server specifically includes: when the first electronic device detects a to-be-charged scenario, the first electronic device receives the charging information of the one or more charging stations sent by the server, where the to-be-charged scenario includes a low-battery scenario and a parking-lot scenario; the low-battery scenario is a scenario in which the battery level of the first electronic device is below a preset battery threshold, and the parking-lot scenario is a scenario in which the distance between the first electronic device and a nearby parking place is less than a specified distance threshold.
In a possible implementation, the first electronic device receiving the charging information of the one or more charging stations sent by the server specifically includes: the first electronic device acquires destination information of the user, where the destination information includes a destination address and a route to the destination; and after the first electronic device determines that its battery level is lower than the amount of power it would consume driving to the destination address along the route, the first electronic device receives the charging information of the one or more charging stations sent by the server.
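The to-be-charged scenario detection described in the implementations above can be sketched as follows; the parameter names, units, and the percentage-based trip estimate are illustrative assumptions:

```python
def needs_charging(battery_pct, battery_threshold_pct,
                   dist_to_parking_m, parking_threshold_m,
                   energy_to_destination_pct=None):
    """Detect the to-be-charged scenarios: low battery, near a parking
    place, or (when an estimate is available) insufficient charge to reach
    the destination along the planned route."""
    low_battery = battery_pct < battery_threshold_pct
    parking_lot = dist_to_parking_m < parking_threshold_m
    short_for_trip = (energy_to_destination_pct is not None
                      and battery_pct < energy_to_destination_pct)
    return low_battery or parking_lot or short_for_trip
```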
In a possible implementation, the server sending the parking location information to the charging device specifically includes: the server sends a charging start request to the first electronic device; the first electronic device receives the charging start request and displays a charging start control; after the first electronic device receives a fourth input for the charging start control, it sends a charging start response to the server in response to the fourth input; and the server receives the charging start response and sends the parking location information to the charging device.
In a possible implementation, the communication system further includes a second electronic device, and after the charging device arrives at the location in the first charging station indicated by the parking location information and charges the first electronic device, the method further includes: the charging device sends vehicle charging information to the second electronic device, where the vehicle charging information includes the battery level of the first electronic device; and after receiving the vehicle charging information, the second electronic device displays vehicle charging prompt information, which is used to prompt the user about the battery level of the first electronic device.
In a fifth aspect, this application provides a communication apparatus, including one or more processors and one or more memories. The one or more memories are coupled to the one or more processors and are used to store computer program code, where the computer program code includes computer instructions. When the one or more processors execute the computer instructions, the communication apparatus is caused to perform the detection method in any possible implementation of any of the above aspects.
In a sixth aspect, an embodiment of this application provides a computer storage medium including computer instructions. When the computer instructions run on an electronic device, the communication apparatus is caused to perform the detection method in any possible implementation of any of the above aspects.
In a seventh aspect, an embodiment of this application provides a computer program product. When the computer program product runs on a computer, the computer is caused to perform the detection method in any possible implementation of any of the above aspects.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of this application;
FIG. 2 is a schematic diagram of a communication system according to an embodiment of this application;
FIG. 3 is a schematic module diagram of an electronic device 100 according to an embodiment of this application;
FIG. 4 is a schematic diagram of a blood alcohol concentration-time curve according to an embodiment of this application;
FIGS. 5A-5H are schematic diagrams of a group of interfaces according to an embodiment of this application;
FIGS. 6A-6B are schematic diagrams of another group of interfaces according to an embodiment of this application;
FIG. 7 is a schematic flowchart of a detection method according to an embodiment of this application;
FIG. 8 is a schematic diagram of another communication system according to an embodiment of this application;
FIG. 9 is a schematic module diagram of another electronic device 100 according to an embodiment of this application;
FIG. 10 is a schematic flowchart of another detection method according to an embodiment of this application;
FIG. 11 is a schematic diagram of an application scenario according to an embodiment of this application;
FIG. 12 is a schematic diagram of another application scenario according to an embodiment of this application;
FIG. 13 is a schematic structural diagram of another electronic device 100 according to an embodiment of this application;
FIG. 14 is a schematic structural diagram of a vehicle-machine device 900 according to an embodiment of this application;
FIGS. 15A-15E are schematic diagrams of another group of interfaces according to an embodiment of this application;
FIG. 16 is a schematic flowchart of another detection method according to an embodiment of this application;
FIG. 17A is a schematic diagram of an in-vehicle image before boarding according to an embodiment of this application;
FIG. 17B is a schematic diagram of an in-vehicle image after alighting according to an embodiment of this application;
FIGS. 18A-18E are schematic diagrams of another group of interfaces according to an embodiment of this application;
FIG. 19 is a schematic flowchart of another detection method according to an embodiment of this application;
FIG. 20 is a schematic flowchart of another detection method according to an embodiment of this application;
FIG. 21 is a schematic flowchart of another detection method according to an embodiment of this application;
FIG. 22 is a schematic flowchart of another detection method according to an embodiment of this application;
FIG. 23 is a schematic flowchart of another detection method according to an embodiment of this application.
具体实施方式
下面将结合附图对本申请实施例中的技术方案进行清楚、详尽地描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;文本中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为暗示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征,在本申请实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
The electronic device provided by the embodiments of this application is introduced below.

The electronic device 100 may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, an in-vehicle device, a smart home device, and/or a smart city device; the embodiments of this application place no particular restriction on the specific type of the electronic device.

Fig. 1 shows a schematic structural diagram of the electronic device 100.

The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in this embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.

The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be independent devices or may be integrated into one or more processors.

The controller may be the nerve center and command center of the electronic device 100. The controller can generate operation control signals according to instruction opcodes and timing signals to control instruction fetching and instruction execution.

A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache, which may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instructions or data again, it can call them directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 110, thereby improving system efficiency.

In some embodiments, the processor 110 may include one or more interfaces, which may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.

It can be understood that the interface connections between the modules illustrated in this embodiment of the present invention are merely schematic and do not limit the structure of the electronic device 100. In other embodiments of this application, the electronic device 100 may also adopt interface connections different from those in the above embodiment, or a combination of multiple interface connection manners.
The charging management module 140 is configured to receive charging input from a charger, which may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive the charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. While charging the battery 142, the charging management module 140 may also supply power to the electronic device through the power management module 141.

The power management module 141 is configured to connect the battery 142 and the charging management module 140 to the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In some other embodiments, the power management module 141 may also be provided in the processor 110. In still other embodiments, the power management module 141 and the charging management module 140 may be provided in the same device.

The wireless communication functions of the electronic device 100 may be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.

The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single communication band or multiple communication bands. Different antennas may also be multiplexed to improve antenna utilization; for example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 150 can provide wireless communication solutions applied to the electronic device 100, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves via the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 may also amplify a signal modulated by the modem processor and radiate it as electromagnetic waves via the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110; in some embodiments, at least some functional modules of the mobile communication module 150 and at least some modules of the processor 110 may be provided in the same device.

The modem processor may include a modulator and a demodulator. The modulator modulates a low-frequency baseband signal to be transmitted into a medium- or high-frequency signal; the demodulator demodulates a received electromagnetic wave signal into a low-frequency baseband signal and passes it to the baseband processor. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor, which outputs sound signals through audio devices (not limited to the speaker 170A and the receiver 170B) or displays images or video through the display 194. In some embodiments, the modem processor may be an independent device; in other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.

The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite systems (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. It receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 may also receive signals to be transmitted from the processor 110, frequency-modulate and amplify them, and radiate them as electromagnetic waves via the antenna 2.

In some embodiments, the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150 and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements display functions through the GPU, the display 194, the application processor, and the like. The GPU is a microprocessor for image processing, connecting the display 194 and the application processor. The GPU performs mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.

The display 194 is used to display images, video, and the like. The display 194 includes a display panel, which may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light emitting diode (AMOLED), a flex light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light emitting diodes (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N displays 194, where N is a positive integer greater than 1.

The electronic device 100 can implement photographing functions through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.

The ISP is used to process data fed back by the camera 193. For example, when taking a photo, the shutter opens, light is passed through the lens onto the camera photosensitive element, the light signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP for processing and conversion into an image visible to the naked eye. The ISP can also perform algorithmic optimization of image noise and brightness, and can optimize parameters such as exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.

The camera 193 is used to capture still images or video. An object generates an optical image through the lens that is projected onto the photosensitive element, which may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the light signal into an electrical signal and passes it to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts it into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.

The digital signal processor is used to process digital signals; besides digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency-point energy.

The video codec is used to compress or decompress digital video. The electronic device 100 may support one or more video codecs, so that it can play or record video in multiple encoding formats, for example moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.

The NPU is a neural-network (NN) computing processor that processes input information quickly by drawing on the structure of biological neural networks, for example the transfer patterns between human brain neurons, and can also learn continuously by itself. Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, for example image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to connect an external non-volatile memory to expand the storage capability of the electronic device 100. The external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement data storage functions, for example saving files such as music and video in the external non-volatile memory.

The internal memory 121 may be used to store computer-executable program code, which includes instructions. By running the instructions stored in the internal memory 121, the processor 110 executes the various functional applications and data processing of the electronic device 100. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the operating system and applications required by at least one function (for example a sound playback function or an image playback function); the data storage area may store data created during use of the electronic device 100 (for example audio data and a phone book). In addition, the internal memory 121 may include high-speed random access memory and may also include non-volatile memory, for example at least one magnetic disk storage device, a flash memory device, or universal flash storage (UFS).

The electronic device 100 can implement audio functions such as music playback and recording through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and the like.

The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert analog audio input into a digital audio signal.

The speaker 170A, also called the "horn", is used to convert audio electrical signals into sound signals.

The receiver 170B, also called the "earpiece", is used to convert audio electrical signals into sound signals.

The microphone 170C, also called the "mic" or "mouthpiece", is used to convert sound signals into electrical signals.

The pressure sensor 180A is used to sense pressure signals and can convert pressure signals into electrical signals; in some embodiments, the pressure sensor 180A may be provided on the display 194. The gyroscope sensor 180B may be used to determine the motion posture of the electronic device 100. The barometric pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a Hall sensor and can be used to detect the opening and closing of a flip cover. The acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in all directions (generally three axes). The distance sensor 180F is used to measure distance. The proximity light sensor 180G may also be used for automatic unlocking and screen locking in cover mode and pocket mode. The ambient light sensor 180L is used to sense ambient light brightness. The fingerprint sensor 180H is used to collect fingerprints. The temperature sensor 180J is used to detect temperature. The touch sensor 180K, also called a "touch panel", may be provided on the display 194; the touch sensor 180K and the display 194 form a touchscreen. The touch sensor 180K is used to detect touch operations on or near it and can pass a detected touch operation to the application processor to determine the touch event type; visual output related to the touch operation can be provided through the display 194. In other embodiments, the touch sensor 180K may also be provided on a surface of the electronic device 100 at a position different from that of the display 194. The bone conduction sensor 180M can acquire vibration signals. The keys 190 include a power key, volume keys, and the like. The motor 191 can generate vibration alerts. The indicator 192 may be an indicator light, which can be used to indicate charging state and battery-level changes, and can also be used to indicate messages, missed calls, notifications, and the like. The SIM card interface 195 is used to connect a SIM card.
At present, automobiles of all kinds have become an indispensable part of people's travel, and a number of problems arise in many driving scenarios. In some possible application scenarios, after drinking, a user remains intoxicated for a long period of time. If the user works or drives while intoxicated, the user may cause great harm to the life and property of themselves or others.

Therefore, an embodiment of this application provides a detection method. The electronic device 100 may obtain ingested-beverage parameters, physiological information parameters, a blood alcohol concentration parameter, and a collection time parameter. The ingested-beverage parameters include an alcohol-content (ABV) parameter and a beverage-volume parameter. The physiological information parameters are body data that affect the user's alcohol absorption rate and alcohol metabolic rate, for example sleep duration, sleep quality, and body weight. The blood alcohol concentration parameter may be used to indicate the user's blood alcohol concentration, and the collection time parameter indicates the point in time at which the electronic device 100 obtained the blood alcohol concentration parameter. The electronic device 100 may input the alcohol-content parameter and the physiological information parameters into an alcohol prediction model to obtain a predicted metabolic rate and a predicted absorption rate, which are parameters that affect the user's blood alcohol concentration. Based on the physiological information parameters, the ingested-beverage parameters, the predicted metabolic rate, and the predicted absorption rate, the electronic device 100 may obtain the user's blood alcohol concentration-time (C-T) curve. Based on the blood alcohol concentration parameter, the collection time parameter, and the C-T curve, the electronic device 100 may further obtain a predicted sobering time. In this way, the electronic device 100 can derive the user's sobering time from the detected parameters and can prompt the user as to when they will become sober, preventing the user from driving or performing other activities while intoxicated and protecting the life and property of the user and others.

In a possible implementation, the electronic device 100 may receive a drinking start time input by the user and obtain the predicted sobering time based on the physiological information parameters, the ingested-beverage parameters, and the drinking start time. For example, the electronic device 100 may use the physiological information parameters and the ingested-beverage parameters with the alcohol prediction model to obtain the correspondence between blood alcohol concentration and time, and then obtain the predicted sobering time from that correspondence and the drinking start time. As another example, the electronic device 100 may use the physiological information parameters, the ingested-beverage parameters, and the drinking start time as inputs to the alcohol prediction model to obtain the predicted sobering time directly. In this way, the electronic device 100 can obtain the predicted sobering time without an alcohol sensor.

In a possible implementation, the electronic device 100 may also obtain a permissible beverage intake volume based on a desired sobering time input by the user. Specifically, after receiving the desired sobering time, the electronic device 100 may obtain the alcohol-content parameter and the physiological information parameters, input them into the alcohol prediction model to obtain the predicted metabolic rate and predicted absorption rate, and then obtain the permissible intake volume based on the predicted metabolic rate, the predicted absorption rate, the desired sobering time, and the alcohol-content parameter. In this way, the electronic device 100 can inform the user that if they drink no more than the permissible intake volume, they will be sober by the desired sobering time and their subsequent schedule will not be affected.

In a possible implementation, the electronic device 100 may obtain the predicted sobering time based on one or more of the above ingested-beverage parameters, physiological information parameters, blood alcohol concentration parameter, and collection time parameter. For example, the electronic device 100 may determine the predicted sobering time based on the user's physiological information parameters, the blood alcohol concentration parameter, and the collection time parameter of the blood alcohol concentration parameter. The electronic device 100 may also obtain the user's permissible intake volume based on one or more of the alcohol-content parameter and the physiological information parameters together with the desired sobering time. In this way, the electronic device 100 can still obtain the predicted sobering time or the permissible intake volume when only one or more of the above parameters are available.
The following introduces a communication system 10 provided by an embodiment of this application.

As shown in Fig. 2, the communication system 10 may include an electronic device 100 and an electronic device 200. The electronic device 100 may establish a wireless connection with the electronic device 200 through wireless communication (for example wireless fidelity (Wi-Fi) or Bluetooth). The electronic device 100 may receive data transmitted by the electronic device 200, or the electronic device 100 may send an operation instruction input by the user to the electronic device 200, and after receiving the operation instruction, the electronic device 200 may perform the operation indicated by the instruction.

In the communication system 10, the electronic device 100 may be used to store the data needed to train the alcohol prediction model (for example physiological information parameters, beverage-volume parameters, alcohol-content parameters, blood alcohol concentration parameters, collection time parameters, predicted alcohol metabolic rates, predicted alcohol absorption rates, C-T curves, predicted sobering times, and so on). Based on these data, the electronic device 100 may train the alcohol prediction model until its accuracy exceeds a first threshold (for example 90%).

The electronic device 100 may also obtain a prediction result based on the alcohol prediction model and data related to the user's current drinking (that is, the physiological information parameters and the ingested-beverage parameters). The prediction result may include the predicted metabolic rate, the predicted absorption rate, and the blood alcohol concentration-time curve. The electronic device 100 may further obtain a corrected result based on the prediction result, the blood alcohol concentration parameter, and the collection time parameter, where the blood alcohol concentration parameter and the collection time parameter may be obtained through the electronic device 200. The blood alcohol concentration parameter may indicate the concentration of ethanol in the user's blood, in units of mg/100ml; the collection time parameter may indicate the point in time at which the electronic device 200 collected the blood alcohol concentration parameter.

The electronic device 200 may be any electronic device that includes an alcohol sensor. By way of example, the electronic device 200 may be a wearable electronic device (for example smart glasses, a smart watch, or Bluetooth earphones) or an electronic device with an integrated alcohol sensor (for example an electronic device 100 carrying an alcohol sensor, or a seat belt carrying an alcohol sensor). The alcohol sensor may detect the gas exhaled by the user to obtain the user's blood alcohol concentration, and after detecting it, the electronic device 200 may send the blood alcohol concentration to the electronic device 100.

It should be noted that a blood alcohol concentration of at least 20 mg/100ml but below 80 mg/100ml constitutes drink driving, and a blood alcohol concentration of 80 mg/100ml or above constitutes drunk driving. Accordingly, in the embodiments of this application, a blood alcohol concentration below 20 mg/100ml is taken as the criterion that the user has sobered up. It can be understood that this criterion is merely an example: the sobriety criterion may be any blood alcohol concentration less than or equal to 20 mg/100ml, and this application does not limit it. It can also be understood that in some possible application scenarios the electronic device 100 itself includes an alcohol sensor and can obtain the user's blood alcohol concentration directly.

Optionally, the electronic device 200 also includes an actigraph, through which it can detect the user's short-term body data, for example sleep quality and sleep duration. Optionally, the electronic device 200 also includes an acceleration sensor, which can be used to detect short-term body data such as the user's exercise activity. The electronic device 200 may send the user's short-term body data to the electronic device 100.

The electronic device 100 may be used to obtain the predicted sobering time based on the ingested-beverage parameters, the physiological information parameters, the blood alcohol concentration parameter, and the collection time parameter. The electronic device 100 may also be used to obtain the permissible intake volume based on the alcohol-content parameter, the physiological information parameters, and the desired sobering time.
In a possible implementation, the communication system 10 may further include a server 300, not shown in the figure. The server 300 may be a cloud server with a communication connection to the electronic device 100. The server 300 may be used to store the above parameters (for example the physiological information parameters), perform model training based on them to obtain a preset alcohol model with an accuracy above the first threshold (for example 90%), and obtain prediction results (for example the permissible intake volume and the predicted sobering time) based on the preset alcohol model and the parameters related to the user's current drinking.

In a possible implementation, the server 300 may store the ingested-beverage parameters, physiological information parameters, blood alcohol concentration parameters, collection time parameters, predicted absorption rates, predicted metabolic rates, C-T curves, and predicted sobering times of multiple users, and may train the alcohol prediction model based on these data.

In a possible implementation, the electronic device 100 may send the predicted sobering time to the electronic device 200, which may display it. Further, the steps performed by the electronic device 100 may instead be performed by the electronic device 200; this application does not limit this.

The following introduces a module diagram of an electronic device 100 provided by an embodiment of this application.
As shown in Fig. 3, the module diagram provided by this embodiment of the application includes, without limitation, a perception module 310, a storage module 320, a training module 330, a prediction module 340, a correction module 360, and a display module 350. The operations performed by these modules can be divided into a model training procedure and a model prediction procedure. The model training procedure is shown by the dashed arrows in Fig. 3: the electronic device 100 may train the alcohol prediction model with historical parameters after a preset interval (for example one week), or may train it with historical parameters each time a prediction result is obtained, until the model's accuracy reaches the first threshold. The model prediction procedure is shown by the solid arrows in Fig. 3: based on the trained alcohol prediction model, the electronic device 100 may obtain the predicted sobering time or the permissible intake volume.

The perception module 310 may be used to obtain the parameters needed for model training/model prediction. It may obtain parameters through the camera and relevant sensors of the electronic device 100, through other electronic devices with a communication connection to the electronic device 100 (for example the electronic device 200), or from user input.

The perception module 310 may be used to obtain the ingested-beverage parameters. For example, it may obtain the alcohol content (the alcohol-content parameter) and volume (the beverage-volume parameter) of the beverage ingested by the user through the camera: specifically, it may capture an image of the beverage container through the camera and obtain the ingested-beverage parameters from that container image through an image recognition algorithm. As another example, the perception module 310 may obtain ingested-beverage parameters input by the user.

The perception module 310 may also be used to obtain the physiological information parameters (including long-term physiological parameters and short-term physiological parameters). For example, it may detect the user's short-term physiological parameters (for example sleep quality and sleep duration) through an actigraph, or through an acceleration sensor, an inertial measurement unit, and the like (for example exercise activity). It may also obtain some physiological information parameters from user input; the user-input parameters include some long-term parameters (for example sex) and some short-term parameters (for example weight, height, and age).

The perception module 310 may also be used to obtain the user's blood alcohol concentration parameter and the time at which it was obtained (also called the collection time parameter). For example, it may obtain the user's blood alcohol concentration parameter through an alcohol sensor. The perception module 310 may send the blood alcohol concentration parameter and the collection time parameter to the correction module 360 for correcting the prediction result.

Optionally, the perception module 310 may obtain some of the user's short-term physiological parameters (for example weight and body mass index) through a body fat scale or the like connected to the electronic device 100. Optionally, the data collected through the above sensors may instead be entered manually by the user.

The perception module 310 may also send all obtained parameters to the storage module 320 for model training/model prediction.

Optionally, in the model prediction procedure, the perception module 310 may send the obtained parameters directly to the prediction module 340, which may predict the sobering time or the permissible intake volume based on those parameters.

The storage module 320 may be used to store the parameters used for model training/model prediction. It may receive the parameters obtained by the perception module 310 and store them in memory (for example the internal memory 121). It may also receive and store the prediction results sent by the prediction module 340, and the corrected prediction results (also called corrected results) sent by the correction module 360. In the model training procedure, the storage module 320 may send all stored parameters (called historical parameters) to the training module 330; the historical parameters may include, without limitation, stored physiological information parameters, ingested-beverage parameters, blood alcohol concentration parameters, collection time parameters, prediction results, and corrected results. In the model prediction procedure, the storage module 320 may send the parameters from the perception module 310 used to predict the user's sobering time (for example the most recently obtained physiological information parameters, alcohol-content parameter, and beverage-volume parameter) to the prediction module 340.
The training module 330 may use a neural network algorithm (for example a convolutional neural network or recurrent neural network algorithm) to train the model, using part of the historical parameters sent by the storage module 320 (for example the physiological information parameters, ingested-beverage parameters, and prediction results) as the model's input values and another part (for example the blood alcohol concentration parameters, collection time parameters, and corrected results) as the model's output values, to obtain a trained alcohol prediction model. That is, the training module 330 can train an alcohol prediction model with an accuracy above the first threshold from the user's historical parameters. The training module 330 may run in the processor 110 of the electronic device 100; here, the processor 110 may also be an artificial intelligence (AI) chip.

The initial alcohol prediction model may be one trained in advance on data from other similar users, where a similar user matches the current user in one or more of the physiological information parameters and ingested-beverage parameters. For example, suppose a user is male, 178 cm tall, weighs 83 kg, ingested 340 ml of beverage at 20% ABV (that is, 68 ml of alcohol), and slept 7 hours the previous day. Similar users may then be male users 175 cm-185 cm tall, weighing 80 kg-85 kg, who ingested 50 ml-80 ml of alcohol and slept 6-8 hours the previous day. It should be noted that these value ranges for the physiological information parameters and ingested-beverage parameters are merely examples; the ranges may be larger or smaller, and this application does not limit them.

In a possible implementation, the electronic device 100 may determine similar users from more or fewer parameters. For example, it may also determine similar users based on a sleep-quality parameter: the electronic device 100 may classify sleep quality as excellent, good, poor, or very poor based on the durations of the user's deep sleep, light sleep, and rapid eye movement sleep, and may judge users with the same sleep quality to be similar users.

The training module 330 may send the trained alcohol prediction model to the prediction module 340. The alcohol prediction model may be used to obtain the user's alcohol metabolic rate (that is, the predicted metabolic rate) and alcohol absorption rate (that is, the predicted absorption rate).

The prediction module 340 may be used to compute the prediction result. Specifically, it may input the physiological information parameters and alcohol-content parameter sent by the storage module 320 into the alcohol prediction model to obtain the predicted metabolic rate and predicted absorption rate. It may also obtain the blood alcohol concentration-time curve based on the predicted metabolic rate, the predicted absorption rate, the beverage-volume parameter, the alcohol-content parameter, and the user weight parameter among the physiological information parameters. The prediction module 340 may run in the processor 110 of the electronic device 100.

The blood alcohol concentration-time curve can be obtained from the predicted metabolic rate, predicted absorption rate, beverage-volume parameter, alcohol-content parameter, and user weight parameter. The specific formulas are as follows:
dc/dt = k_a × c_0 × e^(−k_a·t) − v_m × c / (k_m + c)    (Formula 1)
where c is the user's blood alcohol concentration and t is the time corresponding to that concentration; k_a is the predicted absorption rate, v_m is the predicted metabolic rate, and k_m is the Michaelis constant, a known fixed value. c_0 is the maximum blood alcohol concentration, which can be obtained from the following formula:
c_0 = (ρ × B_a × V_a) / (r × m)    (Formula 2), where ρ is the density of ethanol (approximately 0.8 g/ml)
where B_a is the alcohol content (ABV) of the ingested beverage, V_a is the volume of beverage ingested, m is the user's body weight, and r is a fixed coefficient that may take the value 0.75.

By way of example, when the user ingests 340 ml of beverage at 20% ABV and weighs 83 kg, the user's maximum blood alcohol concentration is approximately 87.3 mg/100ml. The prediction module 340 may substitute the maximum blood alcohol concentration, the predicted metabolic rate, and the predicted absorption rate into Formula 1 to obtain the C-T curve, which may be as shown in Fig. 4. The time corresponding to the maximum blood alcohol concentration C0 is T0. When the user's blood alcohol concentration falls below the threshold concentration C1, the user is judged sober; the time corresponding to the threshold concentration is T1, which can be used to indicate the predicted sobering time. Here a threshold blood alcohol concentration of 20 mg/100ml is used; of course, the threshold may take other values below 20 mg/100ml, and this application does not limit it.
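The Formula 1/Formula 2 computation above can be sketched as follows. This is a minimal illustration, assuming a first-order-absorption-plus-Michaelis-Menten-elimination reading of Formula 1 and an ethanol density of 0.8 g/ml in Formula 2; the function names, constants, and numerical defaults are illustrative, not the patent's reference implementation.

```python
import math

def widmark_c0(degree, volume_ml, weight_kg, r=0.75, rho=0.8):
    """Peak blood alcohol concentration c_0 in mg/100ml (Formula 2).

    degree    -- alcohol by volume, e.g. 0.20 for 20%
    volume_ml -- ingested beverage volume V_a in ml
    weight_kg -- user body weight m in kg
    r         -- fixed coefficient (0.75 in the text)
    rho       -- assumed ethanol density in g/ml
    """
    grams = degree * volume_ml * rho       # grams of pure ethanol ingested
    g_per_l = grams / (r * weight_kg)      # Widmark distribution: grams per litre
    return g_per_l * 100                   # convert to mg/100ml

def ct_curve(c0, ka, vm, km, hours=12.0, dt_h=0.01):
    """Numerically integrate first-order absorption plus Michaelis-Menten
    elimination (one plausible reading of Formula 1); returns (t_hours, bac)
    samples of the C-T curve."""
    t, c, samples = 0.0, 0.0, []
    while t <= hours:
        samples.append((t, c))
        absorb = ka * c0 * math.exp(-ka * t)          # absorption into blood
        clear = vm * c / (km + c) if c > 0 else 0.0   # metabolic clearance
        c = max(0.0, c + (absorb - clear) * dt_h)
        t += dt_h
    return samples
```

With the worked example above (340 ml at 20% ABV, 83 kg), `widmark_c0` returns approximately 87.4 mg/100ml, matching the text's 87.3 mg/100ml figure up to rounding.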
The prediction module 340 may also obtain the predicted sobering time based on the blood alcohol concentration-time curve, the blood alcohol concentration parameter, and the collection time parameter. Specifically, it may locate the concentration indicated by the blood alcohol concentration parameter on the C-T curve, that is, determine the time point on the C-T curve corresponding to that parameter. It may then compute the time difference between that time point and the time point corresponding to the threshold concentration, and add the difference to the time point indicated by the collection time parameter to obtain the predicted sobering time.

It can be understood that when the electronic device 100 has obtained only one pair of blood alcohol concentration and collection time parameters, the prediction module 340 may find two time points on the C-T curve corresponding to that concentration, that is, two predicted sobering times. The prediction module 340 may send both to the display module 350, which may display them as a time range: for example, on receiving predicted sobering time A and predicted sobering time B, where A is earlier than B, the display module 350 may display the predicted sobering time as "A to B". Alternatively, the display module 350 may display only the latest predicted sobering time, for example only B.

It can also be understood that when the prediction module 340 has obtained two or more pairs of blood alcohol concentration and collection time parameters, it can use the two most recently obtained pairs to determine the unique position of the blood alcohol concentration parameter on the C-T curve and obtain the predicted sobering time.

By way of example, according to the blood alcohol concentration-time curve shown in Fig. 4, when the perception module 310 detects that the user's blood alcohol concentration is C0, it determines that the user will be sober after (T1-T0) hours, that is, 6.6 hours; if the time corresponding to T0 is 19:42 local time, the predicted sobering time is 02:18 the next day. As another example, when the perception module 310 detects a blood alcohol concentration of 78 mg/100ml, since this concentration corresponds to two time points on the curve, the perception module 310 may detect the user's blood alcohol concentration again after a preset time (for example 7 minutes). If the second reading is 76 mg/100ml, the prediction module 340 can determine that the current time point is after T0 and that the user will be sober after 5.1 hours; if the current time is 19:42 local time, the predicted sobering time is 00:45 the next day.
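The curve-lookup step just described can be sketched as below: place the measured concentration on both branches of the predicted C-T curve, then add the remaining time to the sobriety threshold to the collection time. The function name, the sampled-curve representation, and the nearest-sample matching are illustrative assumptions.

```python
def predict_sober_time(curve, measured_bac, collected_at_h, threshold=20.0):
    """curve: (t_hours, bac) samples of the predicted C-T curve, t ascending.

    Returns the predicted sobering clock times (in hours): up to two
    candidates, one per branch, since a single reading can sit either
    before or after the concentration peak."""
    peak_i = max(range(len(curve)), key=lambda i: curve[i][1])
    # First time on the descending branch at or below the sobriety threshold.
    t_sober = next(t for t, c in curve[peak_i:] if c <= threshold)
    candidates = []
    for branch in (curve[:peak_i + 1], curve[peak_i:]):
        # Sample on this branch closest to the measured concentration.
        t_match = min(branch, key=lambda tc: abs(tc[1] - measured_bac))[0]
        eta = collected_at_h + (t_sober - t_match)
        if eta not in candidates:
            candidates.append(eta)
    return sorted(candidates)
```

Displaying the result as a range from the earlier to the later candidate, or keeping only the later one, mirrors the two display options described above.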
In a possible implementation, the prediction module 340 may obtain the blood alcohol concentration-time curve from the blood alcohol concentration parameters and collection time parameters. For example, the training module 330 may use a neural network algorithm (for example a convolutional or recurrent neural network algorithm) with the stored blood alcohol concentration parameters and collection time parameters as the model's inputs and the blood alcohol concentration-time curve as its output, and train the model to obtain a trained alcohol prediction model. The prediction module 340 then uses the blood alcohol concentration parameter and collection time parameter most recently obtained by the perception module 310 as inputs to the alcohol prediction model to obtain the blood alcohol concentration-time curve.

In another possible implementation, the prediction module 340 may fit the blood alcohol concentration-time curve from the blood alcohol concentration parameters and collection time parameters.

In another possible implementation, the prediction module 340 may obtain the blood alcohol concentration-time curve based on one or more of the physiological information parameters, ingested-beverage parameters, blood alcohol concentration parameters, and collection time parameters.

In another possible implementation, the prediction module 340 may obtain the predicted sobering time directly based on one or more of the physiological information parameters, ingested-beverage parameters, blood alcohol concentration parameters, and collection time parameters. For example, the training module 330 may use one or more of these parameters as the alcohol prediction model's input and the predicted sobering time as its output and train the model accordingly. The prediction module 340 may then obtain the predicted sobering time through the alcohol prediction model from the required input parameters most recently obtained by the perception module 310.

After obtaining the predicted sobering time, the prediction module 340 may send it to the display module 350, which may display it.

The prediction module 340 may also send the prediction result to the correction module 360. The prediction result may include, without limitation, the predicted metabolic rate, the predicted absorption rate, and the blood alcohol concentration-time curve. The prediction module 340 may also send the prediction result to the storage module 320. It can be understood that when the prediction module 340 is used only to predict the blood alcohol concentration-time curve, the prediction result includes only that curve.
The correction module 360 may be used to adjust the prediction result based on the blood alcohol concentration parameter and collection time parameter obtained by the perception module 310. After obtaining the collection time parameter and corresponding blood alcohol concentration parameter from after the user drank, it may adjust the prediction result based on them to obtain an adjusted predicted metabolic rate, an adjusted predicted absorption rate, and an adjusted C-T curve.

For example, the correction module 360 may obtain the user's actual blood alcohol concentration-time curve from multiple pairs of blood alcohol concentration parameters and their corresponding collection time parameters, and then adjust the prediction result based on the difference between the actual C-T curve and the C-T curve obtained by the prediction module 340, obtaining the adjusted predicted metabolic rate, adjusted predicted absorption rate, and adjusted C-T curve.

As another example, the correction module 360 may obtain, from multiple pairs of blood alcohol concentration parameters and their corresponding collection time parameters, the error between the concentrations on the predicted blood alcohol concentration-time curve and the actual concentrations. The correction module 360 may add this error to all concentration values on the curve to obtain the corrected blood alcohol concentration-time curve.

The correction module 360 may obtain the predicted sobering time based on the adjusted blood alcohol concentration-time curve.

The correction module 360 may send the corrected result to the display module 350. The corrected result may include, without limitation, the adjusted predicted metabolic rate, the adjusted predicted absorption rate, the predicted sobering time, and the adjusted blood alcohol concentration-time curve. The correction module 360 may also send the corrected result to the storage module 320.

In a possible implementation, the correction module 360 may obtain the predicted sobering time directly based on the prediction result, the blood alcohol concentration parameter, and the collection time parameter; in that case, the corrected result includes only the predicted sobering time.

In another possible implementation, the electronic device 100 does not include the correction module 360, and its prediction module 340 may obtain the predicted sobering time directly based on the prediction result, the blood alcohol concentration parameter, and the collection time parameter.

The display module 350 may be used to display the predicted sobering time on the display 194 of the electronic device 100. Optionally, the display module 350 may also display the predicted metabolic rate and predicted absorption rate, as well as the user's historical metabolic rate and historical absorption rate. It may further display prompt information indicating the difference between this metabolic rate and the historical metabolic rate, and between this absorption rate and the historical absorption rate.
In a possible implementation, when the electronic device 100 is used to predict the user's permissible intake volume, the perception module 310 may obtain the desired sobering time input by the user, the physiological information parameters, and the alcohol-content parameter. For how the perception module 310 obtains the physiological information parameters and alcohol-content parameter, see the embodiments describing the perception module 310 above, which are not repeated here. The perception module 310 may send the desired sobering time, physiological information parameters, and alcohol-content parameter to the prediction module 340, which may use the alcohol-content parameter and physiological information parameters as inputs to the alcohol prediction model to obtain the predicted metabolic rate and predicted absorption rate. The prediction module 340 may then obtain the maximum blood alcohol concentration based on the predicted metabolic rate, the predicted absorption rate, the desired sobering time, the threshold blood alcohol concentration, and Formula 1. It can be understood that once the prediction module 340 has the maximum blood alcohol concentration, it can obtain the C-T curve based on the maximum blood alcohol concentration, the predicted metabolic rate, the predicted absorption rate, the desired sobering time, and the threshold blood alcohol concentration. The prediction module 340 then obtains the permissible intake volume from the maximum blood alcohol concentration, the user weight among the physiological information parameters, and Formula 2, and may send it to the display module 350, which may display it.

In another possible implementation, the prediction module 340 may obtain the permissible intake volume directly based on one or more of the physiological information parameters and the alcohol-content parameter together with the desired sobering time. For example, the training module 330 may use one or more of the physiological information parameters and the alcohol-content parameter as inputs to a permissible-intake-volume prediction model and the permissible intake volume as its output, and train that model. The prediction module 340 may then obtain the permissible intake volume through this model from the required input parameters most recently obtained by the perception module 310.

The following introduces interface diagrams of a detection method provided by an embodiment of this application.
By way of example, as shown in Fig. 5A, the electronic device 100 may display a home screen 501, which may include multiple application icons, for example an alcohol detection application icon 502. The alcohol detection application icon 502 may be used to trigger display of the alcohol detection application's interface (for example the alcohol detection interface 510 shown in Fig. 5B). The alcohol detection application may be used to obtain the predicted sobering time or the permissible intake volume. A status bar may be displayed above the home screen 501, and a Bluetooth icon may be displayed in the status bar to indicate that the electronic device 100 has established a communication connection with the electronic device 200.

Upon receiving a user input (for example a tap) on the alcohol detection application icon 502, the electronic device 100 may, in response, display the alcohol detection interface 510 shown in Fig. 5B.

As shown in Fig. 5B, the alcohol detection interface 510 may include a user parameter bar 511, which includes information such as the user's sex, height, weight, and sleep duration. The sleep-duration parameter in the user parameter bar 511 may be obtained by the electronic device 100 from the electronic device 200; parameters such as sex, height, and weight may be pre-stored by the electronic device 100 or input by the user. The electronic device 100 may receive user input to modify the parameters in the user parameter bar 511. The alcohol detection interface 510 may also include a time prediction control 512 and a volume prediction control 513: the time prediction control 512 may be used to predict the user's sobering time, and the volume prediction control 513 may be used to predict the volume of beverage the user may ingest.

Upon receiving a user input on the time prediction control 512, the electronic device 100 may, in response, display the detection prompt interface 530 shown in Fig. 5C.

As shown in Fig. 5C, the detection prompt interface 530 includes a prompt box 531 displaying prompt information that may be used to prompt the user to exhale toward the alcohol sensor; in this way the electronic device 100 can obtain the user's blood alcohol concentration through the alcohol sensor. The prompt information may include, without limitation, text prompts, animation prompts, picture prompts, and voice prompts. For example, it may include the picture prompt shown in Fig. 5C, which indicates the position of the alcohol sensor, and the text prompt shown in Fig. 5C: "Detecting blood alcohol concentration; please exhale toward the alcohol sensor indicated by the arrow."

It can be understood that the prompt information 531 shown in Fig. 5C is displayed only when the electronic device 100 has established a communication connection with an electronic device 200 that includes an alcohol sensor. In some embodiments, the electronic device 100 itself carries an alcohol sensor and may prompt the user to exhale toward its own alcohol sensor to obtain the user's blood alcohol concentration. In other embodiments, the electronic device 100 has not established a communication connection with any device that includes an alcohol sensor, and it may instead prompt the user to measure the blood alcohol concentration themselves and enter it into the electronic device 100.

After the electronic device 200 detects the user's blood alcohol concentration, it may send that concentration and its collection time to the electronic device 100. Upon receiving them, the electronic device 100 may display the time prediction interface 540 shown in Fig. 5D.

As shown in Fig. 5D, the time prediction interface 540 may display a beverage parameter bar 541, used to display the volume and alcohol content of the beverage the user drank. The beverage parameter bar 541 may include a beverage parameter entry 542 containing a photo-recognition icon 542A, which may trigger the electronic device 100 to start the camera and recognize the captured image to obtain the volume and alcohol content of the beverage drunk. It should be noted that the electronic device 100 may receive user input and display the user-entered volume and alcohol content in the beverage parameter entry 542. The beverage parameter bar 541 may also include an add button, which may trigger the electronic device 100 to display another beverage parameter entry above or below the entry 542; in this way, the electronic device 100 can collect the parameters of multiple types of beverages. The time prediction interface 540 may also include a detection record bar 544 for this session, a re-input button 545, and a start-prediction button 546.
The detection record bar 544 may be used to display the user's blood alcohol concentration, which may have been sent to the electronic device 100 by the electronic device 200 or entered manually by the user. Here the detection record bar 544 may display one or more detection records, including a detection record 544A containing a blood alcohol concentration and its collection time. Optionally, the electronic device 100 may receive user input to change the values in a detection record. The re-input button 545 may be used to trigger the electronic device 100 to notify the electronic device 200 to detect the user's blood alcohol concentration again. It can be understood that when the electronic device 100 includes no alcohol sensor and has no communication connection with an electronic device 200 that does, the re-input button 545 may be used to add a new detection record to the detection record bar 544, in which the user can enter a blood alcohol concentration and its corresponding collection time. The start-prediction button 546 may be used to trigger the electronic device 100 to obtain the predicted sobering time from the parameters obtained above.

The electronic device 100 may receive a user input on the photo-recognition icon 542A shown in Fig. 5D and, in response, display the photo recognition interface 550 shown in Fig. 5E.

As shown in Fig. 5E, the photo recognition interface 550 displays the image captured by the camera of the electronic device 100. It may also include the recognized alcohol content of the beverage; for example, Fig. 5E shows in text beside the bottle that the alcohol content is 20%. It should be noted that when the alcohol content is not marked on the bottle's packaging, the electronic device 100 may recognize the beverage's packaging information (for example brand and name) and obtain the alcohol content of that brand of beverage from the packaging information. The photo recognition interface 550 may also include the recognized capacity of the container holding the beverage; for example, the electronic device 100 recognizes that the bottle's capacity is 220 ml. Optionally, the electronic device 100 may display the number of containers near the capacity information and may receive user input to modify that number; in this way, the total volume of beverage ingested by the user can be obtained. It can be understood that when the capacity is not marked on the bottle's packaging, the electronic device 100 may recognize the beverage's packaging information (for example brand and name) and obtain the bottle capacity of that brand of beverage from it.

The photo recognition interface 550 may also include a re-recognize button 551 and a confirm button 552. The re-recognize button 551 may trigger the electronic device 100 to recognize again the relevant information in the image currently displayed by the photo recognition interface 550, and the confirm button 552 may be used to confirm the recognition result.

The electronic device 100 may receive a user input on the confirm button 552 shown in Fig. 5E and, in response, display the time prediction interface 540 shown in Fig. 5F, whose beverage parameter entry 542 now also displays the values of the volume drunk and the alcohol content drunk. The electronic device 100 may also receive a user input on the re-input button 545 shown in Fig. 5F and, in response, notify the electronic device 200 to collect the user's blood alcohol concentration again. It can be understood that the electronic device 100 may also display prompt information whose purpose and content are as in the prompt shown in Fig. 5C, not repeated here. After receiving the blood alcohol concentration and corresponding collection time collected again by the electronic device 200, the electronic device 100 may display the detection record 544B shown in Fig. 5G. After receiving a user input on the start-prediction button 546 shown in Fig. 5G, the electronic device 100 may, in response, compute the predicted sobering time. The electronic device 100 may obtain the C-T curve based on the physiological information parameters, the ingested alcohol content, and the stored alcohol prediction model, and then obtain the predicted sobering time from the C-T curve, the user's blood alcohol concentration, and the collection time; for details, see the embodiment shown in Fig. 3 above, not repeated here. Here the predicted sobering time obtained by the electronic device 100 is 00:45 the next day. After obtaining it, the electronic device 100 may display the prediction result interface 570 shown in Fig. 5H.

As shown in Fig. 5H, the prediction result interface 570 may include result information 572 containing the predicted sobering time. The result information 572 may be one or more of text, picture, and voice information; for example, it may be the text: "In about 5.1 h your blood alcohol concentration will drop to 20 mg/100ml; sobering time is 00:45 tomorrow morning." Optionally, the prediction result interface 570 may also include a blood alcohol concentration-time graph 571, which may show the user's current blood alcohol concentration, the current time, and the predicted sobering time. In this way, the electronic device 100 can use the graph 571 to display the change in the user's blood alcohol concentration more intuitively and convey the user's sobering time.

Optionally, the prediction result interface 570 may also include the user's alcohol absorption rate and alcohol metabolic rate within a preset period (for example one month) and their change curves. In this way, the user can review how their alcohol absorption and metabolic rates have changed and adjust their daily routine, drinking habits, and so on.

It can be understood that the electronic device 100 may obtain a prediction result from only one detection record. It can also be understood that, since every blood alcohol concentration on the blood alcohol concentration-time curve except the maximum corresponds to two collection times, a prediction obtained from a single detection record carries an error of a period of time.
In some embodiments, the electronic device 100 may obtain the predicted metabolic rate and predicted absorption rate through the alcohol prediction model based on the physiological information parameters and the alcohol-content parameter. It may also obtain the permissible intake volume based on the desired sobering time, the predicted metabolic rate, and the predicted absorption rate.

By way of example, upon receiving a user input on the volume prediction control 513 shown in Fig. 5B, the electronic device 100 may, in response, display the time prediction interface 601 shown in Fig. 6A. The time prediction interface 601 may include an alcohol-content bar 602 for displaying the alcohol content of the beverage to be ingested. The alcohol-content bar 602 includes a photo-recognition icon 602A, which may trigger the electronic device 100 to start the camera and recognize the captured image to obtain the alcohol content of the beverage; for a detailed description of recognizing the alcohol content, see the embodiment of Fig. 5E above, not repeated here. It should be noted that the electronic device 100 may also receive user input and display the user-entered alcohol content in the alcohol-content bar 602. Here the alcohol-content bar 602 displays an alcohol content with a value of 20%.

The time prediction interface 601 may also include a desired-time bar 603, which may be used to display the desired sobering time. The desired-time bar 603 may include a time wheel that can receive user input to adjust the numbers on the wheel and obtain the desired sobering time, and may also display the specific value of the desired sobering time. It should be noted that the desired-time bar is not limited to the form of bar 603; in practice it may take other forms, for example an input box in which the user enters numbers to obtain the desired sobering time, and the embodiments of this application do not limit this. Here the desired-time bar 603 displays a desired sobering time of "19:35 Beijing time".

In a possible implementation, the electronic device 100 may obtain the user's travel or work time by querying the user's calendar or memos and use that time as the desired sobering time.

The time prediction interface 601 may also include a start-prediction button 604, which may trigger the electronic device 100 to predict the permissible intake volume. After receiving a user input on the start-prediction button 604, the electronic device 100 may obtain the maximum blood alcohol concentration based on the desired sobering time, the predicted metabolic rate, the predicted absorption rate, and the threshold blood alcohol concentration, and then obtain the permissible intake volume from the alcohol-content parameter, the user weight parameter, and the maximum blood alcohol concentration through Formula 2 shown in Fig. 3 above. After obtaining the permissible intake volume, the electronic device 100 may display the prediction result interface 610 shown in Fig. 6B.
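The final step, converting a maximum blood alcohol concentration back into a beverage volume via Formula 2, can be sketched as follows. It assumes `c0_max` has already been obtained from the kinetic step; the function name, the ethanol-density value, and the default parameters are illustrative assumptions.

```python
def allowed_volume_ml(c0_max, degree, weight_kg, r=0.75, rho=0.8):
    """Invert Formula 2: the beverage volume V_a (ml) whose peak BAC is c0_max.

    c0_max    -- maximum blood alcohol concentration in mg/100ml
    degree    -- alcohol by volume of the beverage, e.g. 0.20 for 20%
    weight_kg -- user body weight in kg
    r, rho    -- fixed coefficient and assumed ethanol density (g/ml)
    """
    grams = c0_max / 100.0 * r * weight_kg   # ethanol mass implied by c0_max
    return grams / (rho * degree)            # back out the beverage volume
```

Round-tripping the worked example (a peak of about 87.4 mg/100ml at 20% ABV and 83 kg) recovers roughly the original 340 ml.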
As shown in Fig. 6B, the prediction result interface 610 may include result information 611 containing the permissible intake volume. The result information 611 may be one or more of text, picture, and voice information; for example, it may be the text: "To be sober in 3 h, you may drink approximately 82 ml of this beverage." Optionally, the prediction result interface 610 may also include a blood alcohol concentration-time graph showing the predicted blood alcohol concentration-time curve.
The following introduces a flow diagram of a detection method provided by an embodiment of this application.

Based on the detection method provided by this embodiment of the application, the electronic device 100 may obtain the predicted sobering time based on the physiological information parameters, the alcohol-content parameter, the beverage-volume parameter, the blood alcohol concentration parameter, and the collection time parameter of the blood alcohol concentration parameter. The electronic device 100 may also obtain the permissible intake volume based on the desired sobering time, the physiological information parameters, and the alcohol-content parameter. Because the electronic device 100 uses the user's body parameters to obtain the predicted sobering time and permissible intake volume, the predictions are more accurate. Using the predicted sobering time, the user can wait until sober before working or traveling; using the permissible intake volume, the user can drink in moderation without affecting their schedule.

By way of example, as shown in Fig. 7, the method includes:

S701: the electronic device 100 obtains the physiological information parameters, the ingested-beverage parameters, the blood alcohol concentration parameter, and the collection time parameter.

The physiological information parameters may include long-term parameters (for example sex) and short-term parameters (for example height, weight, and sleep duration), and may be obtained from user input. Optionally, the electronic device 100 may establish a connection with an electronic device carrying an actigraph (for example the electronic device 200) and obtain the user's sleep duration through that actigraph, or establish a connection with a body fat scale and obtain the user's weight through it.

The ingested-beverage parameters may include the beverage-volume parameter and the alcohol-content parameter, and may be obtained from user input. Alternatively, the electronic device 100 may recognize the ingested alcohol content and volume by photographing: specifically, it may capture an image of the beverage container through the camera and obtain the ingested-beverage parameters from the container image through an image recognition algorithm. For the specific steps by which the electronic device 100 obtains the ingested-beverage parameters, see the embodiment of Fig. 5E above, not repeated here.

The blood alcohol concentration parameters and collection time parameters correspond one to one, and may be obtained from user input. Alternatively, the electronic device 100 may establish a connection with an electronic device carrying an alcohol sensor (for example the electronic device 200) and obtain the blood alcohol concentration parameter and collection time parameter through that sensor; for details, see the embodiment of Fig. 5C above, not repeated here.

In a possible implementation, after receiving a first input from the user, the electronic device 100 performs step S701 in response to the first input. The first input may include, without limitation, a single tap, a double tap, or a long press; for example, the first input may be an input on the alcohol detection application icon 502 shown in Fig. 5A above.

In a possible implementation, after receiving a second input from the user, the electronic device 100, in response, determines the predicted sobering time based on the physiological information parameters, the ingested-beverage parameters, the blood alcohol concentration parameter, and the collection time parameter. In some embodiments, the electronic device 100 may perform steps S702 and S703 in response to the second input. The second input may include, without limitation, a single tap, a double tap, or a long press; for example, the second input may be an input on the start-prediction control 546 shown in Fig. 5G above.
S702: the electronic device 100 may obtain the blood alcohol concentration-time curve based on the physiological information parameters, the ingested-beverage parameters, and the alcohol prediction model.

Specifically, the electronic device 100 may use the physiological information parameters and the alcohol-content parameter as inputs to the alcohol prediction model to obtain the predicted absorption parameter and the predicted metabolic parameter. It may also obtain the maximum blood alcohol concentration from the alcohol-content parameter, the beverage-volume parameter, and the user weight parameter through Formula 2 shown in Fig. 3 above, and then obtain the blood alcohol concentration-time curve from the maximum blood alcohol concentration, the predicted absorption parameter, and the predicted metabolic parameter through Formula 1 shown in Fig. 3 above.

S703: the electronic device 100 may obtain the predicted sobering time based on the blood alcohol concentration-time curve, the blood alcohol concentration parameter, and the collection time parameter.

The electronic device 100 may locate the concentration indicated by the blood alcohol concentration parameter on the C-T curve, that is, determine the corresponding time point on the C-T curve. The prediction module 340 may then compute the time difference between that time point and the time point corresponding to the threshold blood alcohol concentration, and add the difference to the time point indicated by the collection time parameter to obtain the predicted sobering time.

In a possible implementation, the electronic device 100 may obtain, from multiple pairs of blood alcohol concentration parameters and their corresponding collection time parameters, the error between the concentrations on the predicted blood alcohol concentration-time curve and the actual concentrations. The electronic device 100 may add this error to all concentration values on the curve to obtain a corrected blood alcohol concentration-time curve, and then obtain the predicted sobering time from the corrected curve. In this way, the electronic device 100 can obtain a more accurate predicted sobering time.
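The offset-style correction in this implementation can be sketched as below. Representing the curve as (time, bac) samples and matching each measurement to the nearest sample are illustrative assumptions.

```python
def correct_curve(curve, measurements):
    """Shift the predicted C-T curve by the mean error between measured BAC
    values and the curve's value at each measurement's collection time.

    curve        -- (t_hours, bac) samples of the predicted curve
    measurements -- (t_hours, measured_bac) pairs from the alcohol sensor
    """
    def curve_at(t):
        # Nearest-sample lookup of the predicted concentration at time t.
        return min(curve, key=lambda tc: abs(tc[0] - t))[1]

    err = sum(bac - curve_at(t) for t, bac in measurements) / len(measurements)
    return [(t, c + err) for t, c in curve]
```

The corrected curve can then be searched for the threshold crossing exactly as in step S703.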
It can be understood that the electronic device 100 may also obtain the corrected predicted metabolic rate and corrected predicted absorption rate based on the corrected blood alcohol concentration-time curve.

It should be noted that the electronic device 100 may store the physiological information parameters, ingested-beverage parameters, predicted absorption rate, predicted metabolic rate, blood alcohol concentration-time curve, corrected predicted absorption rate, corrected predicted metabolic rate, and corrected blood alcohol concentration-time curve, and train the alcohol prediction model based on the stored data. That is, the electronic device 100 may adjust the model parameters of the alcohol prediction model based on the error between the corrected predicted metabolic rate and the predicted metabolic rate and the error between the corrected predicted absorption rate and the predicted absorption rate. The electronic device 100 may also compute the accuracy of the model with adjusted parameters and, after determining that the model's accuracy has reached the preset threshold, store the alcohol prediction model.

In some embodiments, the electronic device 100 may retrain the model after a preset interval (for example 1 month), or may train the model each time a predicted sobering time or permissible intake volume is obtained.

S704: the electronic device 100 displays the predicted sobering time.

The electronic device 100 may display the predicted sobering time. Optionally, it may also display the predicted absorption rate and predicted metabolic rate; for details, see the embodiments shown in Fig. 5A-Fig. 5H above, not repeated here.

Optionally, when the electronic device 100 detects a driving operation by the user, it may display prompt information, which may be used to remind the user that they are intoxicated and should not drive. For example, the electronic device 100 may determine the user's driving time by querying the user's calendar or memos.

It can be understood that when the electronic device 100 predicts the permissible intake volume, after obtaining the predicted metabolic rate and predicted absorption rate it may obtain the maximum blood alcohol concentration directly from the desired sobering time, the predicted metabolic rate, and the predicted absorption rate, and then obtain the permissible intake volume from the maximum blood alcohol concentration, the alcohol-content parameter, and the user weight parameter. The desired sobering time may be input by the user. In a possible implementation, the electronic device 100 may obtain the user's travel or work time by querying the user's calendar or memos and use that time as the desired sobering time.

The electronic device 100 may also display the permissible intake volume; for details, see the embodiments shown in Fig. 6A-Fig. 6B above, not repeated here.

Optionally, the electronic device 100 may send the physiological information parameters and ingested-beverage parameters to the server 300, which performs the computation of the predicted sobering time/permissible intake volume and the training of the alcohol prediction model. The server 300 may also be used to store the above parameters. In this way, the computing and storage resources of the electronic device 100 can be saved.

In a possible implementation, the electronic device 100 may obtain the predicted sobering time based on one or more of the ingested-beverage parameters, physiological information parameters, blood alcohol concentration parameter, and collection time parameter. It may also obtain the user's permissible intake volume based on one or more of the alcohol-content parameter and the physiological information parameters together with the desired sobering time. In this way, the electronic device 100 can still obtain the predicted sobering time or permissible intake volume when only one or more of the above parameters are available.
In some possible application scenarios, fatigued driving has become a major cause of traffic accidents: drivers who take to the road while fatigued cause unnecessary casualties and economic losses. Detecting whether a driver is driving while fatigued has become a problem that urgently needs solving.

Therefore, an embodiment of this application provides a detection method. The electronic device 100 may obtain the user's behavior data when the user sets out to drive and, based on the behavior data, obtain the user's pre-driving fatigue level. The electronic device 100 may also obtain in-vehicle driving data and physical-condition data and, based on them, obtain the user's in-driving fatigue level. From the pre-driving and in-driving fatigue levels, the electronic device 100 may obtain the user's current fatigue level (also called the final fatigue level). Based on the final fatigue level, the electronic device 100 may further obtain and display driving advice, which may include, without limitation, a recommended driving duration indicating the total time the user can drive before reaching a preset fatigue level. In this way, the electronic device 100 can combine the user's pre-driving and in-driving data to obtain the user's fatigue level and give corresponding driving advice, reducing the time the user drives while fatigued, lowering the probability of driving accidents, and mitigating the fatigued-driving problem.

For example, suppose the user's fatigue level is classified as mild, moderate, or severe. When the electronic device 100 judges from the final fatigue level that the user is mildly fatigued, the driving advice may include a recommended driving duration obtained by combining the user's previous in-vehicle driving data, that is, the driving duration before the user reaches severe fatigue. When the electronic device 100 judges the user to be moderately fatigued, the driving advice may include the recommended driving duration and alertness prompt information, which may remind the user to lower the in-vehicle temperature, drink a refreshing beverage, play energizing music, and so on. When the electronic device 100 judges the user to be severely fatigued, the driving advice may include a stop prompt, which may be used to remind the user to stop and rest as soon as possible; the driving advice may also include a recommended driving duration, whose value is then zero. Optionally, the electronic device 100 may also plan the nearest stopping place and display navigation information to it.

It should be noted that before the user drives the vehicle, the electronic device 100 has not yet obtained in-vehicle driving data; it may obtain the pre-driving fatigue level from the behavior data and then, combining stored historical data from the user's previous drives (for example final fatigue levels and driving durations), obtain driving advice. The driving advice may include a recommended driving duration indicating how long the user can drive before severe fatigue appears. In this way, the electronic device 100 can recommend how long the user may drive before the user even starts driving, lowering the probability of a traffic accident.
The following introduces a communication system 20 provided by an embodiment of this application.

As shown in Fig. 8, the communication system 20 may include, without limitation, the electronic device 100, an electronic device 500, an electronic device 600, and an electronic device 700. The electronic device 100 may establish communication connections (for example Bluetooth connections) with the electronic device 500 and with the electronic device 600, and the electronic device 600 may establish a communication connection with the electronic device 700.

Specifically, in the communication system 20, the electronic device 700 is an electronic device that includes a camera (for example an in-vehicle camera or a dashcam) and may be used to obtain the user's facial image data, which it may send to the electronic device 600.

The electronic device 600 may be used to obtain driving data; for example, the electronic device 600 may be an in-vehicle head unit, an in-vehicle tablet, or the like. The driving data may reflect the in-vehicle environment during driving, road conditions, and the user's driving state, and may include, without limitation, in-vehicle light, noise, and temperature; the vehicle's speed, acceleration, and the variances of speed and acceleration; the frequency of lane deviations; following distance; road conditions; the user's facial image data; the time at which the user drives the vehicle; and the user's driving duration. The electronic device 600 may send the driving data to the electronic device 100.

Optionally, the electronic device 600 may also receive the facial image data sent by the electronic device 700 and, through image recognition, obtain user facial data from it. The user facial data may include, without limitation, the focus of the user's eyes, head movement (frequency of head lowering), blink frequency, and number of yawns. The electronic device 600 may send the user facial data to the electronic device 100.

Optionally, after obtaining the user's facial image data, the electronic device 700 may derive the user facial data from it and send the user facial data to the electronic device 600.

The electronic device 500 may be used to detect the user's physical condition in real time and obtain user data, which may characterize the user's physical condition and behavior. The electronic device 500 may be a wearable device (for example a smart watch or smart band). It should be noted that the user data may include stable user data and fluctuating user data. Stable user data may indicate body characteristics that do not fluctuate in the short term (for example height, sex, age, and weight); fluctuating user data may indicate physical-condition data that does fluctuate in the short term. That is, the electronic device 500 may be used to obtain fluctuating user data, which may include, without limitation, the user's heart rate, body temperature, blood glucose, sleep quality (for example indicated by sleep duration), exercise activity (including exercise duration and intensity), and blood oxygen saturation. The electronic device 500 may send the obtained fluctuating user data to the electronic device 100.

The electronic device 500 may also be used to obtain user data concerning user behaviors, which may include, without limitation, sleeping, sitting, walking, and running. The following embodiments are written using only these four user behaviors. It can be understood that in practice other user behaviors may be included (for example lying down), or the above behaviors may be subdivided (for example walking into strolling and brisk walking); the embodiments of this application do not limit this. It can also be understood that the electronic device 500 may obtain user behavior through detected user data such as heart rate, body temperature, and exercise activity, and may send these user data to the electronic device 100.

It should be noted that the fluctuating user data may also include the user's weight, body fat, and so on; the electronic device 100 may obtain these through a body fat scale or from user input. It should also be noted that the stable user data obtained by the electronic device 100 may likewise be obtained from user input. Optionally, the electronic device 100 may use the facial image data of the user collected by the electronic device 700 and an image recognition algorithm to predict one or more of the user's sex, age, height, and weight.
The electronic device 100 may obtain the behavior data and part of the physical-condition data from the user data obtained by the electronic device 500, and may obtain another part of the user's condition data itself. The electronic device 100 may also obtain the driving data through the electronic device 600 to obtain the in-vehicle driving data. It should be noted that the user data and driving data obtained by the electronic device 100 may also be obtained from user input.

The electronic device 100 may also be used to obtain the pre-driving fatigue level. Specifically, the electronic device 100 may feed the behavior data in sequence form (also called a behavior sequence) into a first fatigue model to obtain the pre-driving fatigue level. The output of the first fatigue model is determined by the number of user behaviors in the input behavior sequence and their order: a different order of behaviors yields a different pre-driving fatigue level, for example the sequences <run, sleep> and <sleep, run> yield different results; and a different number of behaviors likewise yields a different level, for example the sequences <sit, run, sleep> and <sit, run> yield different results. By way of example, the first fatigue model may be a recurrent neural network (RNN) model, which is suited to processing data with temporal relationships.
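The order sensitivity described for the first fatigue model can be illustrated with a toy recurrence (not the RNN itself): each behavior moves a fatigue score toward 1 or toward 0 in proportion to the remaining headroom, so <run, sleep> and <sleep, run> end at different values. The behavior effects and the update rule below are invented purely for illustration.

```python
# Per-behavior effect: positive values raise fatigue, negative values lower it.
EFFECT = {"sleep": -0.4, "sit": 0.05, "walk": 0.1, "run": 0.3}

def pre_drive_fatigue(behaviors, init=0.5):
    """Toy order-sensitive fatigue score in [0, 1] for a behavior sequence."""
    f = init
    for b in behaviors:
        e = EFFECT[b]
        # Scale by the remaining headroom so later behaviors depend on
        # the state left behind by earlier ones (order matters).
        f = f + e * ((1.0 - f) if e > 0 else f)
        f = min(1.0, max(0.0, f))
    return f
```

`pre_drive_fatigue(["run", "sleep"])` and `pre_drive_fatigue(["sleep", "run"])` differ, mirroring the <run, sleep> vs <sleep, run> example above.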
The electronic device 100 may also be used to obtain the in-driving fatigue level. Specifically, the electronic device 100 may feed the in-vehicle driving data and the physical-condition data into a second fatigue model to obtain the in-driving fatigue level. The second fatigue model may process input data without temporal relationships to obtain an output result; for example, it may be a support vector regression (SVR) model. Because an SVR model is more tolerant of deviations caused by outliers and can ignore small errors between predicted and true results, it is better suited to fatigue-level detection.

The electronic device 100 may also be used to obtain the final fatigue level: it may compute a weighted sum of the pre-driving and in-driving fatigue levels. The electronic device 100 may then obtain driving advice based on the final fatigue level; that is, it may determine from the final fatigue level whether the user is mildly, moderately, or severely fatigued and give corresponding driving advice according to the user's fatigue state. The driving advice may include the recommended driving duration, which is the driving duration at which the user reaches severe fatigue. Optionally, the electronic device 100 may send the driving advice to the electronic device 600, which may display it.

The electronic device 100 may be used to store the behavior data, the physical-condition data, and the in-vehicle driving data and to use them as parameters for model training. Specifically, the electronic device 100 may obtain, from the stored in-vehicle driving data and physical-condition data, the user's fatigue levels while driving and the time intervals between different fatigue levels. For example, the electronic device 100 may label 1-2 yawns within a preset period as mild fatigue, 3-5 yawns as moderate fatigue, and more than 5 yawns as severe fatigue. During training, the electronic device 100 may then input the behavior data, physical-condition data, and in-vehicle driving data into the corresponding models to obtain the final fatigue level, take the user's labeled fatigue level as the true result, compute the error between the obtained final fatigue level and the true result, and adjust the model parameters based on the error until the error falls below a preset threshold, at which point training is complete. The electronic device 100 may use the model whose error is below the preset threshold for detecting the user's fatigue level. It should be noted that labeling the user's fatigue level by yawn count is merely an example; the electronic device 100 may also label it using other data (for example the number of times the user lowers their head), and this application does not limit this.

The electronic device 100 may also store the pre-driving, in-driving, and final fatigue levels obtained from the behavior data, physical-condition data, and in-vehicle driving data. It should be noted that, since the electronic device 100 collects in-vehicle driving data and physical-condition data in real time while the user drives, it may recompute the in-driving and final fatigue levels at preset intervals (for example 15 minutes) from the data collected within that interval. The electronic device 100 may store the physical-condition data, in-vehicle driving data, in-driving fatigue level, and final fatigue level in association with the user's elapsed driving duration; this makes it easy for the electronic device 100 to obtain the relationship between driving duration and fatigue level.
In some embodiments, when the electronic device 100 has not yet obtained in-vehicle driving data (that is, the user has not yet set out), it may obtain the pre-driving fatigue level from the user's current behavior data through the first fatigue model. The electronic device 100 may then use the stored relationship between pre-driving fatigue levels and driving durations to determine the recommended driving duration before the user reaches severe fatigue. Optionally, when the electronic device 100 can determine the total duration of the upcoming drive, it may also judge from the total driving duration and the recommended driving duration whether the user is likely to drive while fatigued.

For example, the electronic device 100 may, based on the current pre-driving fatigue level, determine which of the stored pre-driving fatigue levels (also called historical pre-driving fatigue levels) is closest to it, where the closest historical level is the one whose difference from the current pre-driving fatigue level has the smallest absolute value. The electronic device 100 may then determine, among the multiple historical final fatigue levels corresponding to that closest historical pre-driving level, the one that reached severe fatigue earliest, and take the driving duration corresponding to that historical final fatigue level as the recommended driving duration. It can be understood that the electronic device 100 is not limited to determining the closest historical pre-driving fatigue level from the pre-driving fatigue level as in this example; it may also determine it from one or more of the physical-condition data, driving time, weather, and so on.

In a possible implementation, the communication system 20 further includes the server 300, which may be a cloud server with a communication connection to the electronic device 100. The server 300 may be used to obtain the above data from the electronic device 100 (including physiological information parameters, behavior data, physical-condition data, pre-driving fatigue levels, in-driving fatigue levels, final fatigue levels, and driving durations) and store it. The server 300 may also train the models based on these parameters and compute the fatigue levels and recommended driving durations. Optionally, a communication connection may be established between the server 300 and the electronic device 600; the server 300 may obtain the in-vehicle driving data from the electronic device 600 and may send the driving advice and recommended driving duration to the electronic device 600 for display.

It should be noted that the communication system 20 is not limited to the above connection manners and may include other communication connections. For example, the electronic device 100 may also establish a communication connection with the electronic device 700 and obtain the user facial data from it; the embodiments of this application do not limit this.
The following introduces a module diagram of an electronic device 100 provided by an embodiment of this application.

As shown in Fig. 9, the electronic device 100 may include, without limitation, a user data collection module 910, an in-vehicle data collection module 930, a data preprocessing module 920, a model computation module 940, and a driving advice judgment module 950.

The user data collection module 910 may be used to obtain user data concerning the user's physical condition and behavioral habits. It may run on the processor of the electronic device 100 or on some of its sensors (for example the acceleration sensor); it may obtain parameters through other electronic devices with a communication connection to the electronic device 100 (for example the electronic device 500 and the electronic device 600), or from user input. Optionally, the user data collection module 910 and the perception module 310 may be the same module.

The user data concerning the user's physical condition and behavior may include, without limitation, the user's age, sex, height, weight, body fat, heart rate, body temperature, blood glucose concentration, blood oxygen saturation, sleep quality, sleep duration, exercise duration, and exercise intensity. Optionally, the user data collection module 910 may receive data input by the user and obtain user data from it, or may obtain user data through the corresponding sensors: for example, the user's exercise activity through the acceleration sensor, or the user's heart rate through an optical sensor. The user data collection module 910 may also send the user data to the data preprocessing module 920.

The in-vehicle data collection module 930 may be used to obtain the user's driving data during travel. It may obtain driving data through an electronic device with a communication connection to the electronic device 100 (for example the electronic device 600). The driving data may reflect the in-vehicle environment, road conditions, and the user's driving state during driving, and may include, without limitation, in-vehicle light, noise, and temperature; the vehicle's speed, acceleration, and the variances of speed and acceleration; the frequency of lane deviations; following distance; road conditions; weather conditions; the user's facial image data; the time at which the user drives; and the driving duration. The in-vehicle data collection module 930 may obtain driving data through corresponding software or hardware: for example, the user's eye movement through the camera of the electronic device 700; the road conditions along the route (for example tidal-flow roads or rockfall-prone roads) through a map resource package and the user's real-time location; the weather during driving (for example sunny, rain, or snow) through a weather server; or the vehicle's acceleration through the acceleration sensor. The in-vehicle data collection module 930 may also send the driving data to the data preprocessing module 920.
The data preprocessing module 920 may be used to receive the data collected by the user data collection module 910 and the in-vehicle data collection module 930 and perform preprocessing operations on it to obtain feature data. The preprocessing operations may include, without limitation, outlier removal, missing-value imputation, data normalization, and data classification. The data preprocessing module 920 may send the feature data obtained by preprocessing the received data to the model computation module 940. The feature data may include the behavior data, the physical-condition data, and the in-vehicle driving data. The functional code of the data preprocessing module 920 may run on the electronic device 100, for example on its processor.

Specifically, the data preprocessing module 920 may obtain the behavior data from the user data. The behavior data indicates the behaviors that occurred, in chronological order, within a preset period before the user set out to drive (for example within one hour). For example, if within the preset period the user ran, walked, and then slept, the data preprocessing module 920 may derive the user's behavior data from the heart rate, body temperature, location, and other data in the user data, and that behavior data may be expressed as <run, walk, sleep>.

The data preprocessing module 920 may also obtain the physical-condition data from the user data. The physical-condition data characterizes the user's physical condition and may be divided into stable data and fluctuating data. Stable data characterizes values that do not change much over a period of time, for example age, sex, height, weight, and body fat; fluctuating data characterizes values that fluctuate with the user's behavior and environment, for example heart rate, body temperature, blood glucose, blood oxygen saturation, sleep quality, exercise duration, and exercise intensity.

The data preprocessing module 920 may also obtain the in-vehicle driving data from the driving data. The in-vehicle driving data may characterize the surrounding environment while the user drives and the user's real-time driving situation, and may include surrounding-environment data and user facial data. The surrounding-environment data characterizes the in-vehicle environment (for example temperature and light intensity) and the vehicle's travel (for example speed, acceleration, following distance, and driving duration); the user facial data may characterize the user's driving state, for example the user's yawning frequency and nodding frequency.

In some embodiments, since the time of day at which the user drives also affects the user's driving state (for example, driving at noon or in the early morning makes fatigue more likely), the data preprocessing module 920 may also record the time at which the feature data was obtained.
The model computation module 940 may be used to compute the user's fatigue level. It may run on the processor of the electronic device 100, which may be the processor 110 above or an AI chip, for example. The model computation module 940 may also send the fatigue-level results to the driving advice judgment module 950. Specifically, the model computation module 940 may use the behavior data as the input of the first fatigue model to compute the pre-driving fatigue level. It may determine the second fatigue model from the stable data in the physical-condition data and use the fluctuating data in the physical-condition data together with the in-vehicle driving data as the second fatigue model's input to obtain the in-driving fatigue level. The model computation module 940 may compute a weighted sum of the pre-driving and in-driving fatigue levels to obtain the final fatigue level and send it to the driving advice judgment module 950.

Optionally, the model computation module 940 may determine the weights of the pre-driving and in-driving fatigue levels in the final-fatigue computation based on the user's driving duration. By way of example, the weight of the in-driving fatigue level increases with driving duration while the weight of the pre-driving fatigue level decreases correspondingly: for example, if both weights start at 0.5, the model computation module 940 may add 0.05 to the in-driving weight and subtract 0.05 from the pre-driving weight for every additional 30 minutes of driving. It should be noted that weight adjustment is not limited to the manner described in this example; the model computation module 940 may adjust the weights in other ways, for example setting the pre-driving weight to 0.4 and the in-driving weight to 0.6 when the driving duration reaches 2 hours, and the pre-driving weight to 0.2 and the in-driving weight to 0.8 when it reaches 5 hours; the embodiments of this application do not limit this.
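The weighted fusion with the 0.05-per-30-minutes shift from the example can be sketched as follows; the cap on the in-driving weight is an added assumption so that the pre-driving score never vanishes entirely.

```python
def final_fatigue(pre_drive, in_drive, driving_minutes):
    """Weighted sum of the pre-driving and in-driving fatigue scores.

    Both weights start at 0.5; every full 30 minutes of driving moves 0.05
    of weight from the pre-driving score to the in-driving score."""
    shift = 0.05 * (driving_minutes // 30)
    w_in = min(0.9, 0.5 + shift)   # assumed cap on the in-driving weight
    w_pre = 1.0 - w_in
    return w_pre * pre_drive + w_in * in_drive
```

At zero driving time the result is the plain average of the two scores; as the drive lengthens, the in-driving score dominates, matching the intuition that live in-vehicle observations become more informative than the pre-drive estimate.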
It should be noted that when the user has not yet driven, the model computation module 940 can obtain only the pre-driving fatigue level, and it may send only the pre-driving fatigue level to the driving advice judgment module 950.
The driving advice judgment module 950 may be used to obtain the user's travel information. Upon obtaining the user's travel information, it may, at a trigger time, notify the user data collection module 910 to send the user data to the data preprocessing module 920.

By way of example, the driving advice judgment module 950 may obtain the user's destination and arrival time through the user's calendar, ticket-purchase information (also called ticket information, for example train tickets, plane tickets, show tickets, and movie tickets), and the like. The destination is the place recorded in the calendar or the place where the ticket is used; the arrival time is the time recorded in the calendar, the departure time indicated by the ticket, the show's start time, and so on. For example, the driving advice judgment module 950 may obtain from a plane ticket that the user's destination is the departure airport and the arrival time is the flight's check-in time. Optionally, the arrival time may be set a preset amount of time (for example 30 minutes) earlier than the time recorded on the ticket or in the calendar, so that the user does not miss the trip.

Starting M hours before the arrival time, the driving advice judgment module 950 may determine whether the distance between the user's real-time location and the destination exceeds a distance threshold (for example 1 kilometer), where M is greater than or equal to 0; for example, M may take the value 5. When the driving advice judgment module 950 determines that the distance between the user's current location and the destination exceeds the distance threshold, it may determine the user's departure time based on the driving duration from the current location to the destination and the arrival time. The driving advice judgment module 950 may use the departure time as the trigger time and, at the trigger time, notify the user data collection module 910 to send the user data to the data preprocessing module 920. In some embodiments, the driving advice judgment module 950 may obtain the user's departure time directly from the calendar or from a set alarm. As another example, it may obtain the user's navigation information and determine the departure time from it.

Optionally, the driving advice judgment module 950 may use the time N hours before the departure time as the trigger time, where the trigger time is later than M hours before the arrival time and later than the current time; N is greater than or equal to 0, and may for example take the value 1.

In some embodiments, the driving advice judgment module 950 may determine that the user is about to drive by detecting behaviors that begin driving, such as the user fastening the seat belt, closing the driver's door, releasing the handbrake, starting the ignition, or pressing the accelerator, and may use that moment as the trigger time.

The driving advice judgment module 950 may obtain driving advice based on the received pre-driving fatigue level. The driving advice may include the recommended driving duration, which indicates the total driving duration at which the user reaches severe fatigue. The driving advice judgment module 950 may determine the historical pre-driving fatigue level closest to the currently obtained pre-driving fatigue level; it may then determine, among the multiple historical final fatigue levels corresponding to that closest historical level, the one that reached severe fatigue earliest, and take the driving duration corresponding to that historical final fatigue level as the recommended driving duration.
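The nearest-history lookup just described can be sketched as below; the history representation (integer fatigue-level codes with 3 meaning severe fatigue) is an illustrative assumption.

```python
def recommended_drive_minutes(pre_fatigue, history):
    """history: list of (hist_pre_fatigue, sessions), where sessions is a
    list of (minutes_driven, final_fatigue_level) and level 3 denotes
    severe fatigue.

    Picks the historical pre-driving fatigue with the smallest absolute
    difference from the current one, and returns the earliest driving
    duration at which that history reached severe fatigue, or None if it
    never did."""
    _, sessions = min(history, key=lambda h: abs(h[0] - pre_fatigue))
    severe = [minutes for minutes, level in sessions if level >= 3]
    return min(severe) if severe else None
```

Matching on pre-driving fatigue alone follows the example above; as the text notes, the nearest history could equally be selected on physical-condition data, driving time, or weather.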
Optionally, when the driving advice judgment module 950 can obtain the estimated driving duration from the user's departure point to the destination, the driving advice may also include a travel prompt. Specifically, the driving advice judgment module 950 may determine, from the estimated driving duration, the fatigue levels the user will reach during the drive, and obtain the travel prompt from the current time, the departure time, and those fatigue levels. For example, if the driving advice judgment module 950 determines that the user may become mildly or moderately fatigued during the drive and that the gap between the current time and the departure time exceeds a time threshold (for example 30 minutes), the travel prompt may be used to suggest that the user rest for a while. If it determines that the user may become mildly or moderately fatigued and that the gap between the current time and the departure time is less than or equal to the time threshold (for example 30 minutes), the travel prompt may suggest that the user prepare a refreshing beverage. If it determines that the user may become severely fatigued during the drive, the travel prompt may suggest that the user travel by other means (for example by public transport or with a designated driver).

The driving advice judgment module 950 may also be used to judge whether the user is driving. After receiving the pre-driving fatigue level, it may check at preset judgment intervals (for example 5 minutes) whether the user is driving, and while the user is driving, it may check at preset judgment intervals whether the user is still driving the vehicle. When the driving advice judgment module 950 determines that the user is driving, it may notify the user data collection module 910 to send the user data to the data preprocessing module 920 and notify the in-vehicle data collection module 930 to send the driving data to the data preprocessing module 920.

Optionally, when the driving advice judgment module 950 has already obtained the user's departure time through the calendar or ticket information above, it may determine in real time whether the user is driving within a preset window including the departure time (for example within the departure time plus or minus 5 minutes).

Optionally, the driving advice judgment module 950 may gradually shorten the preset judgment interval as the driving duration increases. It should be noted that the preset judgment interval cannot be reduced to 0.

The driving advice judgment module 950 may obtain driving advice based on the final fatigue level sent by the model computation module 940. The driving advice may be used to inform the user whether they are fatigued. Optionally, the driving advice may also include the recommended driving duration. For example, when the driving advice judgment module 950 judges the user to be mildly or moderately fatigued, the driving advice may include the recommended driving duration, and may also include alertness prompt information reminding the user to lower the in-vehicle temperature, drink a refreshing beverage, play energizing music, and so on. When the driving advice judgment module 950 judges the user to be severely fatigued, the driving advice may include a stop prompt used to remind the user to stop and rest as soon as possible.

The driving advice judgment module 950 may send the driving advice to the display module 350, which may display it. The relevant functional code of the driving advice judgment module 950 may run on the processor of the electronic device 100.

It should be noted that the software modules illustrated in this embodiment of the present invention do not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer software modules than the above, or combine some modules, or split some modules, and so on.
接下来介绍本申请实施例提供的一种检测方法的流程示意图。
基于本申请实施例提供的一种检测方法,电子设备100可以在用户驾车前,获取用户的行为数据。并基于行为数据得到驾驶前疲劳程度和驾驶建议。电子设备100还可以在用户驾车过程中,获取用户的身体状态数据和车上行驶数据,并基于行为数据、身体状态数据和车上行驶数据,得到最终疲劳程度和驾驶建议。这样,电子设备100可以在用户驾车前,提示用户是否可以驾车出行,以及驾驶多久可能会疲劳驾驶。电子设备100还可以在用户驾车过程中,实时检测用户的疲劳程度,在用户达到轻度疲劳或中度疲劳时,提示用户通过调低车内温度等行为,降低疲惫感。在用户达到重度疲劳时,提示用户尽快停车休息等等。极大地降低了用户因为疲劳驾驶出现车祸的概率。
示例性的,如图10所示,该方法包括:
S1001,电子设备100获取用户的出行信息。
其中，出行信息可以包括但不限于出发时刻、到达时刻和触发时刻。其中，出发时刻为用户开始驾驶的时刻，到达时刻为用户停车的时刻，触发时刻为电子设备100获取行为数据的时刻。
其中,电子设备100可以基于出发时刻,得到触发时刻。
在一些实施例中,电子设备100可以通过用户的日程表、购票信息(例如,火车票、飞机票、演出票、电影票等)等获取用户的目的地点和到达时刻。可选的,到达时刻可以比票证或日程表记录的时刻提前预设时间(例如,30分钟),这样,可以避免用户错过行程。
之后，电子设备100可以基于用户的目的地点、到达时刻和用户的实时地点，得到用户的出发时刻。例如，电子设备100可以在到达时刻前X个小时开始，确定用户的实时地点和目的地点之间的距离是否超过距离阈值（例如，1千米），其中，X大于等于0，例如，X的值可以取5。当电子设备100判定出用户的当前地点和目的地点之间的距离超过距离阈值，电子设备100可以基于从当前地点到目的地点的驾驶时长和到达时刻，确定出用户的出发时刻。
在另一些实施例中,电子设备100可以直接从日程表或设置的闹钟获取用户的出发时刻。
在另一些实施例中,驾驶建议判断模块可以获取用户的导航信息,基于导航信息确定出用户的出发时刻。
电子设备100可以将出发时刻作为触发时刻，或者，电子设备100可以将出发时刻前Y个小时作为触发时刻，其中，触发时刻晚于到达时刻前X个小时。其中，Y大于等于0，例如，Y的值可以取1。也就是说，触发时刻早于出发时刻，并且出发时刻和触发时刻相差预设时间，预设时间可以为Y个小时。
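示例性的，结合上述描述，触发时刻的计算可以通过如下代码示意。需要说明的是，该代码仅为假设性示意，X、Y的取值为示例，其中"不早于当前时刻"的约束沿用前述模块描述中"触发时刻晚于当前时刻"的限定：

```python
from datetime import timedelta


def trigger_time(departure, arrival, now, x_hours=5, y_hours=1):
    """计算触发时刻（假设性示意）。

    触发时刻取出发时刻前 y_hours 个小时，但不早于到达时刻前
    x_hours 个小时，也不早于当前时刻。
    """
    candidate = departure - timedelta(hours=y_hours)
    earliest = max(arrival - timedelta(hours=x_hours), now)
    return max(candidate, earliest)
```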
或者,电子设备100可以在检测到用户的开始驾车行为时,将该时刻作为出发时刻及触发时刻。其中,开始驾车行为可以包括但不限于电子设备100和电子设备600建立通信连接、系安全带、关上驾驶座的车门,松开手刹,开车打火,踩油门等。
在一些实施例中,电子设备100在用户驾车过程中,通过用户的定位确定出用户正在驾驶车辆。需要说明的是,电子设备100还未获取到行为数据。电子设备100可以将该检测到用户驾驶行为的时刻,作为触发时刻以及出发时刻。可以理解的是,电子设备100可以在执行步骤S1002-步骤S1004后,直接执行步骤S1006及其后续步骤。
S1002,电子设备100获取用户的行为数据。
当电子设备100判定出当前时刻为触发时刻后,可以获取触发时刻前预设时间内(例如,6小时内)用户的行为数据。
例如,电子设备100可以直接通过电子设备500获取触发时刻前预设时间内的行为数据。再例如,电子设备100可以通过电子设备500获取触发时刻前预设时间内的用户数据,再基于用户数据,得到行为数据。
S1003,电子设备100将行为数据作为第一疲劳模型的输入,得到驾驶前疲劳程度。
具体的,电子设备100可以将行为数据以行为序列的形式作为第一疲劳模型的输入,得到驾驶前疲劳程度。第一疲劳模型的输出由输入的行为序列中用户行为的数量和用户行为的先后顺序决定。其中,行为序列中的用户行为的先后顺序不同,第一疲劳模型得到的驾驶前疲劳程度不同。具体的,可以参见上述图8所示实施例,在此不再赘述。
可选的，当电子设备100的触发时刻在出发时刻之前，电子设备100可以将触发时刻到出发时刻之间用户最频繁的行为，作为行为序列的最后一项行动。例如，电子设备100在触发时刻获取用户的行为序列为<运动、静坐>。电子设备100检测到用户在之前的一段时间内（例如，前一个月内），触发时刻和出发时刻之间，睡觉的次数最多。电子设备100可以得到行为序列为<运动、静坐、睡觉>。电子设备100检测到用户在之前的一段时间内（例如，前一个月内），触发时刻和出发时刻之间，静坐的次数最多。电子设备100可以得到行为序列为<运动、静坐>。或者，电子设备100也可以直接使用该行为序列。
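示例性的，上述补全行为序列最后一项的过程可以通过如下代码示意（仅为假设性示意，函数名为假设）：

```python
from collections import Counter


def complete_sequence(seq, history_between):
    """将触发时刻到出发时刻之间历史上最频繁出现的行为补充为
    行为序列的最后一项（假设性示意）。

    若该行为与序列最后一项相同，则序列保持不变。
    """
    if not history_between:
        return list(seq)
    most_common = Counter(history_between).most_common(1)[0][0]
    if seq and seq[-1] == most_common:
        return list(seq)
    return list(seq) + [most_common]
```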
S1004,电子设备100基于驾驶前疲劳程度,得到并显示驾驶建议。
电子设备100可以基于驾驶前疲劳程度，得到驾驶建议，并显示。具体的，电子设备100可以基于驾驶前疲劳程度，从存储的一个或多个驾驶前疲劳程度（即，历史驾驶前疲劳程度）中，确定出和当前得到的驾驶前疲劳程度最相近的历史驾驶前疲劳程度。在一个或多个历史驾驶前疲劳程度中，最相近的历史驾驶前疲劳程度和当前得到的驾驶前疲劳程度之间的差值的绝对值最小。
可选的，电子设备100还可以基于出发时刻或触发时刻确定出最相近的历史驾驶前疲劳程度。在一个或多个历史驾驶前疲劳程度中，最相近的历史驾驶前疲劳程度对应的出发时刻或触发时刻，和当前的出发时刻或触发时刻最邻近。
电子设备100可以获取最相近的历史驾驶前疲劳程度对应的一个或多个最终疲劳程度，并从该一个或多个最终疲劳程度中，确定出最早达到重度疲劳的最终疲劳程度。电子设备100可以将最早达到重度疲劳的最终疲劳程度对应的驾驶时长，作为推荐驾驶时长。
可选的，电子设备100可以获取最相近的历史驾驶前疲劳程度对应的一个或多个驾驶中疲劳程度，并基于一个或多个驾驶中疲劳程度和当前得到的驾驶前疲劳程度，依次计算得到一个或多个最终疲劳程度。电子设备100可以基于一个或多个最终疲劳程度的值，得到最早达到重度疲劳的最终疲劳程度。电子设备100可以获取该最终疲劳程度对应的驾驶中疲劳程度的驾驶时长，并将该驾驶时长作为推荐驾驶时长。
电子设备100可以显示包括有推荐驾驶时长的驾驶建议。这样,电子设备100可以在用户还未开始驾驶,或驾驶时长不超过预设初始时间(例如,10分钟)时,提示用户可以连续驾驶的最长驾驶时间,改善用户疲劳驾驶问题。
可选的,电子设备100可以基于预计驾驶时长、出发时刻和触发时刻,得到驾驶建议。当电子设备100基于预计驾驶时长,判定出用户在驾驶过程中会出现轻度疲劳或中度疲劳时,若触发时刻比出发时刻早,且触发时刻和出发时刻之间的时间差大于时间阈值(例如,30分钟),驾驶建议中可以包括出行提示,出行提示可以用于提示用户休息一段时间。若触发时刻比出发时刻早,且触发时刻和出发时刻之间的时间差小于或等于时间阈值(例如,30分钟),驾驶建议中可以包括出行提示,出行提示可以用于提示用户准备提神饮料等。当电子设备100基于预计驾驶时长,判定出用户在驾驶过程中会出现重度疲劳时,驾驶建议中可以包括出行提示,出行提示可以用于提示用户通过其他出行方式(例如,公交出行或代驾出行)出行,此时,推荐驾驶时长为零小时。
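示例性的，上述基于预计出现的疲劳程度以及触发时刻和出发时刻的时间差得到出行提示的判断逻辑，可以通过如下代码示意。需要说明的是，该代码仅为假设性示意，提示文案与疲劳程度的取值（1=轻度、2=中度、3=重度）均为示例，且假设疲劳程度已基于预计驾驶时长确定：

```python
def travel_hint(predicted_level, trigger, departure, threshold_minutes=30):
    """基于预计疲劳程度与触发/出发时刻的时间差，得到出行提示（假设性示意）。

    predicted_level: 1=轻度疲劳, 2=中度疲劳, 3=重度疲劳。
    """
    if predicted_level >= 3:
        return "建议通过其他出行方式出行"  # 此时推荐驾驶时长为零
    gap_minutes = (departure - trigger).total_seconds() / 60
    if gap_minutes > threshold_minutes:
        return "建议先休息一段时间再出发"
    return "建议准备提神饮料"
```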
在一种可能的实现方式中,电子设备100可以只执行步骤S1001至步骤S1004。这样,电子设备100可以在用户驾驶车辆出行前,得到推荐驾驶时间,避免疲劳驾驶。
S1005,电子设备100判断用户是否处于驾驶状态。
电子设备100可以在获取到行为数据后,可以判断用户是否处于驾驶状态。具体的,电子设备100可以每隔预设判定时间,判定用户是否在开车。例如,电子设备100可以通过电子设备600获取车辆的速度、加速度、方向盘的转动情况等判定用户是否在开车。再例如,电子设备100可以通过传感器获取用户的位置信息,判断用户是否在开车。
可选的，电子设备100获取的触发时刻早于出发时刻时，电子设备100可以在出发时刻判断用户是否处于驾驶状态。可以理解的是，由于出发时刻时间范围较小，电子设备100判定时可能会出现误差，故而电子设备100可以在包括有出发时刻这一时间点的一段时间内，每隔预设判定时间判断用户是否处于驾驶状态。
当电子设备100判定出用户处于驾驶状态时,可以执行步骤S1006。
需要说明的是,当电子设备100在判定出用户不处于驾驶状态时,相隔预设判定时间,重新判断用户是否处于驾驶状态。
S1006，电子设备100获取车上行驶数据和身体状态数据。身体状态数据包括稳定型数据和波动型数据。
电子设备100可以接收电子设备500发送的用户数据,并基于用户数据,经过预处理操作,得到身体状态数据。电子设备100还可以接收电子设备600发送的驾驶数据,并基于驾驶数据,经过预处理操作,得到车上行驶数据。具体的,可以参见图9所示实施例,在此不再赘述。
S1007，电子设备100基于身体状态数据中的稳定型数据，确定出第二疲劳模型。
在一种可能的实现方式中，服务器300中存储有多个用户的身体状态数据、车上行驶数据和驾驶中疲劳程度。服务器300可以基于身体状态数据中的稳定型数据进行分类，将用户划分为不同类型。在一些实施例中，服务器300可以基于年龄、体重、性别、身高等将用户分为某个年龄段、某个身高范围、某个体重范围的某个性别的用户。例如，服务器300可以将年龄在20岁-35岁，体重在60千克-70千克，身高在170厘米-180厘米，性别为男的用户划分为一个类型。服务器300可以基于该类型的多个用户的身体状态数据、车上行驶数据和驾驶中疲劳程度，训练得到一个第二疲劳模型。这样，服务器300可以得到多个类型的用户及其对应的第二疲劳模型。电子设备100可以基于用户的身体状态数据，确定出用户属于哪一个用户类型，并从服务器中下载该类型用户对应的第二疲劳模型。
进一步的,电子设备100获取到第二疲劳模型后,可以基于存储的用户的身体状态数据、车上行驶数据和驾驶中疲劳程度,针对第二疲劳模型进行训练,并保存训练后的第二疲劳模型。电子设备100可以使用该训练后的第二疲劳模型,计算得到用户的驾驶中疲劳程度。也就是说,电子设备100可以基于历史用户身体状态数据、历史车上行驶数据和历史驾驶中疲劳程度,训练得到第二疲劳模型。
在另一种可能的实现方式中,电子设备100可以直接基于历史用户身体状态数据、历史车上行驶数据和历史驾驶中疲劳程度,训练得到第二疲劳模型。
S1008,电子设备100将车上行驶数据和波动型数据作为第二疲劳模型的输入,得到驾驶中疲劳程度。
模型计算模块可以将身体状态数据中的波动型数据和车上行驶数据作为第二疲劳模型的输入，得到驾驶中疲劳程度。其中，第二疲劳模型可以用于处理无时序关系的输入数据，得到输出结果。
S1009,电子设备100基于驾驶前疲劳程度和驾驶中疲劳程度,得到最终疲劳程度。
电子设备100可以将驾驶前疲劳程度和驾驶中疲劳程度进行加权求和,得到最终疲劳程度。其中,驾驶前疲劳程度和驾驶中疲劳程度的权重都大于零,且驾驶前疲劳程度的权重和驾驶中疲劳程度的权重之和等于1。
可选的,电子设备100可以基于用户的驾驶时长,确定出计算最终疲劳程度时驾驶前疲劳程度和驾驶中疲劳程度的权重。示例性的,电子设备100可以随着驾驶时长的增加,增加驾驶中疲劳程度的权重,并且减少驾驶前疲劳程度的权重。
S1010,电子设备100基于最终疲劳程度,得到并显示驾驶建议。
电子设备100可以基于最终疲劳程度，得到驾驶建议，并显示。其中，驾驶建议可以用于提示用户是否疲劳。可选的，驾驶建议中还可以包括推荐驾驶时长。例如，当驾驶建议判断模块判定出用户为轻度疲劳或中度疲劳时，驾驶建议可以包括推荐驾驶时长。驾驶建议中还可以包括清醒提示信息，清醒提示信息可以用于提醒用户调低车内温度或饮用提神饮料，播放提神音乐等等。可选的，电子设备100可以直接通知车载空调调低车内温度，或/和通知车载音响播放提神的音乐。
当驾驶建议判断模块判定出用户为重度疲劳时,驾驶建议可以包括停车提示信息,该停车提示信息可以用于提示用户尽快停车休息。
可选的,电子设备100判定出用户为重度疲劳时,还可以显示离用户的当前地点最近的停车位置,并且显示当前地点到停车位置的导航信息。
可选的,电子设备100可以将驾驶建议发送至电子设备600,电子设备600可以显示该驾驶建议。进一步可选的,电子设备100还可以将导航信息发送至电子设备600,电子设备600可以显示该导航信息。
S1011,电子设备100判断用户是否处于驾驶状态。
当电子设备100执行步骤S1010后,可以相隔预设判定时间判断用户是否还处于驾驶状态。例如,电子设备100可以通过车辆的速度、加速度判断用户是否处于驾驶状态。当电子设备100判定出用户还处于驾驶状态时,电子设备100可以执行步骤S1006-步骤S1011。当电子设备100判定出用户不处于驾驶状态,电子设备100可以停止执行疲劳检测流程(即,步骤S1006-步骤S1011)。
可选的,电子设备100可以基于驾驶时长,调整预设判定时间。驾驶时长越长,预设判定时间越短,其中,预设判定时间的值大于零。
可选的,上述步骤S1002-步骤S1004、步骤S1007-步骤S1009可以由服务器300执行。
在一种可能的实现方式中,电子设备100可以每隔预设判定时间,判断用户是否处于驾驶状态。并在判定出用户处于驾驶状态后,获取该判定出用户处于驾驶状态的时刻前,预设时间内用户的行为数据。电子设备100可以基于该行为数据得到驾驶前疲劳程度。之后,电子设备100可以直接执行步骤S1006至步骤S1011。这样,电子设备100可以只针对驾驶行为发生过程中,进行用户的疲劳程度的判断,避免用户产生疲劳驾驶的行为。
在一种可能的实现方式中，电子设备100可以直接基于用户的身体状态数据，确定出用户的驾驶中疲劳程度。在一些实施例中，电子设备100可以基于用户的身体状态数据，通过第二疲劳模型，确定出驾驶中疲劳程度。其中，第二疲劳模型可以基于用户的历史身体状态数据训练得到，也可以基于用户的身体状态数据，从服务器300中下载得到。具体的，电子设备100从服务器300获取第二疲劳模型的步骤可以参见上述步骤S1007所示实施例，在此不再赘述。
图11和图12示例性示出了该检测方法的两个应用场景。
当电子设备100获取到用户的出行信息后,电子设备100可以获取用户的行为数据,并基于行为数据,得到驾驶前疲劳程度。电子设备100可以基于出行信息和驾驶前疲劳程度,得到驾驶建议。
如图11所示，图11示例性示出了用户所处的室内环境，其中，用户正在使用电子设备100。电子设备100可以基于用户的机票信息，获取到用户的出行信息。例如，电子设备100检测到用户的起飞时刻为“13:30”，用户的乘机地点为“深圳宝安机场T3”。电子设备100获取到的用户的出行信息包括当前地点到用户的乘机地点的预计驾驶时长、出发时刻和到达时刻。在此，若电子设备100得到预计驾驶时长为60分钟，由于电子设备100检测到航空公司要求提前至少半个小时值机，电子设备100可以将到达时刻确定为“13:00”，出发时刻确定为“12:00”。在此，电子设备100可以将触发时刻设置为“11:00”。电子设备100可以获取用户在“9:00-11:00”的行为数据，例如，电子设备100可以通过电子设备500获取用户的行为数据。电子设备100可以基于用户行为数据和第一疲劳模型，得到驾驶前疲劳程度。电子设备100还可以基于驾驶前疲劳程度，得到驾驶建议。在此，若电子设备100判定出用户在驾驶过程中会出现轻度疲劳或中度疲劳，电子设备100可以显示包括有出行提示信息的驾驶建议。出行提示信息可以为文字类提示信息、图片类提示信息、语音类提示信息中的一种或多种。在此，出行提示信息可以为文字类提示信息：“用户您好，根据您的航班信息，您接下来可能需要驾车去机场。在驾驶过程中，您可能会感觉疲惫，建议您午休半小时，再驾车出行”。这样，用户在驾驶前，可以根据驾驶建议降低自己的疲劳程度，改善疲劳驾驶问题。
当电子设备100检测到用户正在开车时,电子设备100可以获取用户的身体状态数据和车上行驶数据,并基于身体状态数据和车上行驶数据,得到驾驶中疲劳程度。电子设备100可以基于图11所示的驾驶前疲劳程度和驾驶中疲劳程度,得到最终疲劳程度。电子设备100可以基于最终疲劳程度,得到驾驶建议。
如图12所示,图12示例性示出了车内环境,当用户开车时,电子设备100可以和电子设备600建立通信连接。电子设备100还可以通过电子设备600获取车上行驶数据。电子设备100可以基于车上行驶数据等,得到最终疲劳程度及驾驶建议。具体的,可以参见图10所示实施例,在此不再赘述。在此,若电子设备100判定出用户处于轻度疲劳或中度疲劳,电子设备100可以得到包括清醒提示信息的驾驶建议。电子设备100可以将该驾驶建议发送至电子设备600。电子设备600可以显示包括有清醒提示信息的驾驶建议。清醒提示信息可以为文字类提示信息、图片类提示信息、语音类提示信息中的一种或多种。在此,清醒提示信息可以为文字类提示信息:“驾驶员,您好,您目前比较疲劳,建议您调低车内温度,或者播放提神的音乐,避免疲劳驾驶”。这样,用户在驾驶过程中,可以根据驾驶建议降低自己的疲劳程度,改善疲劳驾驶问题。
在一些应用场景中，通过手机打车出行已经成为许多用户的出行方式，例如，当用户饮酒之后，或者比较疲惫，或者车辆在充电时，用户可以通过打车软件叫车出行。但用户可能在乘车时将随身物品遗失在车上。若乘客将物品遗失在车上，乘客需要找到司机取回丢在车上的物品，耽误乘客和司机的行程。并且乘客找回遗落在车上的物品的可能性不高。因此，本申请实施例提供了一种检测方法。电子设备100可以和车机设备900建立蓝牙连接。车机设备900可以在检测到乘客的开门操作后，获取乘客上车前的车内图像（又称为上车前车内图像）。车机设备900还可以在检测到乘客下车后，获取乘客下车后的车内图像（又称为下车后车内图像）。车机设备900可以基于上车前车内图像和下车后车内图像，判断乘客下车后车内是否还包括乘客的物品。当车机设备900判定出车内还包括有乘客的物品，可以播报物品遗漏提示信息，该物品遗漏提示信息可以用于提示司机乘客的物品遗留在车上。同时，车机设备900也可以向电子设备100发送物品遗漏提示信息，电子设备100可以在接收到物品遗漏提示信息后，显示物品遗漏提示信息。该物品遗漏提示信息用于提示乘客有物品遗留在车上。这样，可以防止乘客物品遗留在车上。
其中,电子设备100可以为手机、平板电脑、可穿戴式设备等等。电子设备100的硬件结构可以参见图1所示的电子设备100的结构示意图,在此不再赘述。车机设备900可以用于获取车辆的数据,例如,车机设备900可以用于检测车门的打开和关闭,获取车内图像,检测车辆的速度、加速度等等。
接下来介绍本申请实施例提供的一种电子设备100的结构示意图。
如图13所示,该电子设备100可以包括但不限于蓝牙模块1302、加速度传感器1301和处理器1303。
其中,加速度传感器1301可以用于获取电子设备100的加速度。加速度传感器1301还可以用于将加速度发送至处理器1303。加速度传感器1301还可以将加速度发送至蓝牙模块1302。
蓝牙模块1302可以用于和车机设备900建立访客蓝牙连接。其中,访客蓝牙连接可以用于电子设备100和车机设备900建立不需要用户输入,即可实现配对和密钥验证的蓝牙连接。电子设备100可以通过调用相关函数设置蓝牙功能,实现访客蓝牙连接。
例如，电子设备100可以通过createBond()函数直接创建配对请求，向车机设备900发送配对请求。电子设备100还可以通过调用setPin()函数进行密钥设置，将密钥设置为指定数值。电子设备100还可以通过cancelPairingUserInput()函数取消密钥输入。这样，电子设备100可以和车机设备900创建不需要配对和密钥的蓝牙连接（即，访客蓝牙连接）。还需要说明的是，电子设备100的用户在后续描述中可以称为乘客。
蓝牙模块1302还可以用于向车机设备900发送数据（例如，电子设备100的加速度等）。蓝牙模块1302还可以用于接收车机设备900的加速度。
处理器1303可以用于判断是否断开和车机设备900的访客蓝牙连接。例如,处理器1303可以为图1所示的处理器110。也就是说,处理器1303可以基于电子设备100的加速度和车机设备900的加速度来判断是否断开访客蓝牙连接。当处理器1303判定出电子设备100的加速度和车机设备900的加速度相同时,处理器1303可以通过蓝牙模块1302向车机设备900发送确认成功信令,该确认成功信令可以用于指示车机设备900不断开访客蓝牙连接。当处理器1303判定出电子设备100的加速度和车机设备900的加速度不同时,处理器1303可以断开和车机设备900的蓝牙连接。处理器1303还可以通过蓝牙模块1302向车机设备900发送确认失败信令,该确认失败信令可以用于指示车机设备900断开访客蓝牙连接。
需要说明的是，由于电子设备100和车机设备900的传感器不同，电子设备100的加速度和车机设备900的加速度可能有偏差。故而，处理器1303可以在电子设备100的加速度和车机设备900的加速度之间的差值的绝对值不超过加速度偏差阈值时，判定电子设备100的加速度和车机设备900的加速度相同。其中，加速度偏差阈值可以为固定值（例如，0.001m/s²）。可选的，加速度偏差阈值可以基于传感器的最大误差值得到。其中，传感器的最大误差值可以由传感器的生产厂家提供。电子设备100和车机设备900存储有各自的传感器的最大误差值。电子设备100和车机设备900在传输加速度之前，可以进行各自的传感器的最大误差值的传输。加速度偏差阈值可以为电子设备100的传感器的最大误差值和车机设备900的传感器的最大误差值之和。这样，可以基于不同的电子设备，得到适用的加速度偏差阈值。
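示例性的，上述基于加速度偏差阈值判断两个设备的加速度是否视为相同的过程可以通过如下代码示意（仅为假设性示意，默认的最大误差值为示例）：

```python
def same_acceleration(a_phone, a_car, err_phone=0.0005, err_car=0.0005):
    """判断电子设备与车机设备的加速度是否视为相同（假设性示意）。

    偏差阈值取两个传感器最大误差值之和；当差值的绝对值不超过
    该阈值时，判定两个加速度相同。
    """
    threshold = err_phone + err_car
    return abs(a_phone - a_car) <= threshold
```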
接下来介绍本申请实施例提供的一种车机设备900的结构示意图。
如图14所示，该车机设备900包括但不限于加速度传感器1401，蓝牙模块1402，摄像头1403和处理器1404。
其中,加速度传感器1401可以用于获取车机设备900的加速度。加速度传感器1401还可以用于将加速度发送至处理器1404。加速度传感器1401还可以将加速度发送至蓝牙模块1402。
蓝牙模块1402可以用于和电子设备100建立访客蓝牙连接。蓝牙模块1402还可以用于接收电子设备100发送的数据(例如,电子设备100的加速度,确认成功信令,确认失败信令,物品遗漏提示信息等)。蓝牙模块1402还可以用于将车机设备900的数据(例如,车机设备900的加速度)发送至电子设备100。
摄像头1403可以用于获取车内的图像。其中,摄像头1403获取的车内的图像包括有上车前车内图像和下车后车内图像。
处理器1404可以用于基于车内的图像判断乘客的物品是否遗留在车内。也就是说,处理器1404可以在检测到乘客上车的操作后,通过摄像头1403获取上车前车内图像。处理器1404可以在检测到乘客下车的操作后,通过摄像头1403获取下车后车内图像。其中,处理器1404可以通过摄像头获取的图像,通过图像识别算法(例如,卷积神经网络算法)检测乘客的上车操作和下车操作。处理器1404可以通过上车前车内图像,确定出乘客上车前,车内的物品信息。处理器1404可以通过下车后车内图像,确定出乘客下车后车内的物品信息。处理器1404可以对比乘客上车前的物品信息和乘客下车后的物品信息,确定出乘客下车后车内的物品是否和乘客上车前车内的物品相同。若相同,处理器1404可以确定出乘客的物品没有遗留在车内。若不同,处理器1404可以提示司机乘客的东西遗留在车上。例如,处理器1404可以指示车载蓝牙播报第一遗漏提示信息。该第一遗漏提示信息可以提示司机乘客的东西遗留在车上。或者,处理器1404可以指示车辆中控显示屏显示该第一遗漏提示信息。处理器1404还可以通过蓝牙模块1402向电子设备100发送物品遗漏指示信息。该物品遗漏指示信息可以用于指示电子设备100显示第二遗漏提示信息,该第二遗漏提示信息可以用于提示乘客有物品遗留在车内。
在一些实施例中,车机设备900还包括车门传感器和压力传感器。其中,车门传感器可以用于检测乘客打开车门的操作。压力传感器可以用于检测乘客是否在座位上。这样,车机设备900可以通过车门传感器和压力传感器检测到乘客的上车操作和下车操作。
接下来介绍本申请实施例提供的一组界面示意图。
示例性的,如图15A所示,电子设备100可以显示桌面1501。其中,桌面1501可以包括多个应用图标,例如,打车应用图标1502等等。其中,该打车应用图标1502可以用于触发显示打车应用的界面(例如,图15B所示的打车应用界面1510)。打车应用可以用于向司机发送乘客的出发地点和目的地点。打车应用还可以用于向乘客发送司机的信息(位置信息、车牌号码、车辆颜色等)。其中,桌面1501的上方还可以显示状态栏,该状态栏中可以显示蓝牙图标。该蓝牙图标用于指示电子设备100开启了蓝牙功能。
电子设备100可以接收到乘客针对打车应用图标1502的输入,响应于该输入,显示如图15B所示的打车应用界面1510。
如图15B所示,打车应用界面1510可以包括文本框1511、文本框1512和呼叫车辆控件1513。其中,文本框1511可以用于获取并显示用户的出发地点。文本框1512可以用于获取并显示用户的目的地点。呼叫车辆控件1513可以用于将出发地点和目的地点发送给司机的电子设备(例如,车机设备900)。例如,文本框1511可以显示有出发地点“AA街道”,文本 框1512可以显示有目的地点“BB大厦”。
电子设备100可以在接收到乘客针对呼叫车辆控件1513的输入后,响应于该输入,显示如图15C所示的打车应用界面1520。同时,电子设备100可以将出发地点和目的地点发送给车机设备900。当车机设备900收到电子设备100的出发地点和目的地点后,可以将车辆信息(例如,车辆位置信息、车牌号、司机名称、车辆颜色等)发送至电子设备100。电子设备100在接收到车辆信息后,可以显示如图15D所示的打车应用界面1530。
如图15D所示,打车应用界面1530可以包括有车辆信息栏1531。该车辆信息栏1531可以用于显示车辆的信息。该车辆信息栏1531还可以用于显示车牌号为“A123”的车辆到达出发地点“AA街道”的时间。可选的,打车应用界面1530还可以包括地图动画,该地图动画可以用于显示车辆和用户的位置。
电子设备100在接收到车辆信息时,可以打开访客蓝牙功能,并广播访客蓝牙连接请求。其中,访客蓝牙功能可以用于电子设备100和车机设备900建立访客蓝牙连接。其中,该访客蓝牙连接可以用于电子设备100和车机设备900之间传输加速度信息,还可以用于车机设备900向电子设备100发送有物品遗留在车上的提示信息。需要说明的是,电子设备100可以通过设置蓝牙功能,进行密钥设置,取消密钥信息输入设置,取消配对请求创建设置。这样,电子设备100可以和车机设备900创建不需要配对和密钥的蓝牙连接(即,访客蓝牙连接)。还需要说明的是,电子设备100的用户在后续描述中可以称为乘客。
车机设备900可以在检测到乘客打开车门的操作时,通过摄像头获取上车前车内图像。其中,车机设备900可以通过车门传感器获取打开车门的操作,或者通过摄像头采集的画面识别出乘客打开车门的操作。车机设备900还可以在检测乘客上车后,开启访客蓝牙功能。其中,车机设备900可以通过摄像头采集的画面识别出乘客,确定出乘客上车。或者,车机设备900可以通过压力传感器判断乘客是否上车。车机设备900可以在开启访客蓝牙功能后,收到电子设备100的访客蓝牙连接请求。车机设备900接收到电子设备100的访客蓝牙连接请求后,可以向电子设备100发送访客蓝牙连接响应。电子设备100收到访客蓝牙连接响应后,电子设备100和车机设备900建立访客蓝牙连接。
电子设备100和车机设备900可以通过访客蓝牙连接交换各自的加速度。该加速度可以用于判断电子设备100和车机设备900是否处于同一辆车。若电子设备100和车机设备900确定出电子设备100和车机设备900的加速度不同,即不处于同一辆车,电子设备100和车机设备900可以断开该访客蓝牙连接。
在一种可能的实现方式中,电子设备100和车机设备900确定出电子设备100和车机设备900的加速度不同后,电子设备100还可以记录车机设备900的标识信息(例如,车机设备900的蓝牙设备名称)。在预设禁止接入时间内(例如,1小时内),电子设备100可以在基于标识信息确定出建立访客蓝牙连接的设备为车机设备900时,不与车机设备900建立访客蓝牙连接。这样,电子设备100可以记录不处于同一辆车的电子设备的标识,防止再次接入错误的电子设备,提高电子设备100和处于同一辆车的电子设备建立访客蓝牙连接的可能性。可以理解的是,电子设备100可以在预设禁止接入时间后,删除车机设备900的标识信息。
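示例性的，上述记录标识信息并在预设禁止接入时间内不与该车机设备建立访客蓝牙连接的过程可以通过如下代码示意（仅为假设性示意，类名与默认时长为示例）：

```python
import time


class GuestBluetoothBlocklist:
    """记录加速度校验失败的车机设备标识，在预设禁止接入时间内
    不再与其建立访客蓝牙连接（假设性示意，默认 1 小时）。"""

    def __init__(self, ban_seconds=3600):
        self.ban_seconds = ban_seconds
        self._banned = {}  # 标识 -> 记录时刻

    def ban(self, device_id, now=None):
        self._banned[device_id] = time.time() if now is None else now

    def is_banned(self, device_id, now=None):
        now = time.time() if now is None else now
        t = self._banned.get(device_id)
        if t is None:
            return False
        if now - t >= self.ban_seconds:
            del self._banned[device_id]  # 超过预设禁止接入时间后删除标识信息
            return False
        return True
```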
若电子设备100和车机设备900确定出电子设备100和车机设备900的加速度相同,即处于同一辆车,电子设备100和车机设备900可以不断开该访客蓝牙连接。接下来本申请实施例将以电子设备100和车机设备900处于同一辆车进行撰写。
车机设备900可以在检测到乘客打开车门的操作时，触发车机设备900判断乘客是否下车，并在判定出乘客下车时，通过摄像头获取下车后车内图像。其中，车机设备900可以通过车门传感器检测乘客打开车门的操作，或者通过摄像头采集的画面识别出乘客打开车门的操作。在一些实施例中，车机设备900可以在检测到乘客打开车门的操作后，获取摄像头采集的车内图像，并基于车内图像判断乘客是否下车。当车机设备900基于车内图像判定出乘客下车后，可以将乘客下车后的车内图像作为下车后车内图像。例如，车机设备900可以在识别出图像中座位区域内没有乘客的图像，判定出乘客下车。在另一些实施例中，车机设备900可以通过座位处的压力传感器判定乘客是否离开座位，当车机设备900判定出乘客离开座位后，通过车内摄像头获取下车后车内图像。在另一些实施例中，车机设备900可以通过摄像头和压力传感器共同判断乘客是否下车。例如，车机设备900可以在通过压力传感器判定出乘客离开座位后，再基于摄像头采集的图像判断乘客是否离开座位区域。这样，车机设备900获取的下车后车内图像不包括乘客，更便于识别车内的物品。
车机设备900获取到下车后车内图像之后,可以基于上车前车内图像和下车后车内图像判断乘客的物品是否遗留在车上。车机设备900可以通过图像识别算法,识别得到上车前车内图像里的物品,以及下车后车内图像里的物品。车机设备900可以对比上车前车内物品和下车后车内物品是否相同,判断乘客的物品是否遗留在车内。当车机设备900判定出下车后车内物品和上车前车内物品不同时,即可判定出乘客的物品遗留在车内。车机设备900可以在判定出乘客的物品遗留在车内后,通过车载音响播报第一遗漏提示信息。例如,该第一遗漏提示信息可以为:“乘客的东西遗留在车内,请提醒乘客取回”。
车机设备900还可以将物品遗漏指示信息发送给电子设备100。电子设备100收到该物品遗漏指示信息后,可以显示第二遗漏提示信息。
示例性的,电子设备100可以在收到该物品遗漏提示信息后,显示如图15E所示的提示框1541。该提示框1541可以包括第二遗漏提示信息。该第二遗漏提示信息可以以文字,动画,图片等形式显示。例如,第二遗漏提示信息可以为文字类提示信息:“您的物品遗失在车上,司机还未驶离下车地点,请尽快取回”。可选的,电子设备100可以在图15E所示的打车应用界面1540上显示提示框1541,该打车应用界面1540可以用于显示乘客的打车费用。
需要说明的是,电子设备100可以不仅限于以文字的形式显示该第二遗漏提示信息,电子设备100还可以以语音播报的形式显示该第二遗漏提示信息。进一步的,电子设备100还可以通过震动机身,提示用户查看第二遗漏提示信息。
进一步的，车机设备900可以在检测到乘客关闭车门的操作时，再次通过上述实施例所述的方法（例如，压力传感器）判定乘客是否下车。当车机设备900判定出乘客下车后，再播报第一遗漏提示信息，并向电子设备100发送物品遗漏指示信息。这样，可以避免乘客暂时下车的场景中（例如，乘客下车让行的场景），错误提醒乘客有物品遗留在车内。
下面介绍本申请实施例提供的一种检测方法的流程示意图。
示例性的,如图16所示,该方法包括:
S1601,电子设备100接收到乘客针对第一应用的输入。
其中,第一应用可以为打车类应用(例如,上述图15A所示的打车应用)。第一应用可以用于接收乘客的输入,并从乘客的输入中获取乘客的打车信息。该打车信息中可以包括触发地和目的地。第一应用还可以用于将乘客打车信息发送给司机。
其中，针对第一应用的输入可以为针对第一应用的图标的输入（例如，上述针对图15A所示的打车应用图标1502的输入），或者，针对第一应用提供的打车页面的打车控件的输入。
电子设备100可以在接收到乘客针对第一应用的输入后,向附近的电子设备广播访客蓝牙连接请求。
可选的,当电子设备100接收到用户针对第一应用的图标的输入时,电子设备100可以在接收到针对第一应用的图标的输入后,相隔预设时间(例如,2分钟)再广播访客蓝牙连接请求。
S1602,车机设备900检测到乘客的上车操作,获取乘客上车前的车内图像。
车机设备900可以在检测到乘客的上车操作时,通过车内摄像头获取乘客上车前的车内图像(又称为上车前车内图像)。车机设备900还可以通过图像识别算法,识别得到上车前车内图像中的物品信息。
其中,乘客的上车操作可以为乘客的开车门操作,或者司机的刹车操作等等。例如,车机设备900可以通过车门传感器,检测乘客的上车操作。再例如,车机设备900可以通过加速度传感器,检测乘客的上车操作。再例如,车机设备900可以通过车内摄像头获取的图像,检测乘客的上车操作。
示例性的,图17A示出了车机设备900获取的上车前车内图像。在乘客已经打开车门,还未上车时,车内仅包括司机的物品。车机设备900可以通过上车前车内图像得到车内的物品清单为{<瓶子,1>},其中,瓶子为物品的标识,1为物品的数量。需要说明的是,图17A所示的上车前车内图像及得到的物品清单仅为示例,不会对实际应用中车机设备900获取的上车前车内图像构成具体限定。例如,该物品清单中物品的标识可以标注为物品A。
S1603,车机设备900检测到乘客的坐下操作,开启访客蓝牙功能。
车机设备900可以在检测到乘客的坐下操作(即,检测到乘客在车内坐下)时,开启访客蓝牙功能,并接收电子设备100发送的访客蓝牙连接请求。
其中,车机设备900可以通过压力传感器、车内摄像头等检测乘客的坐下操作。这样,可以避免车机设备900将司机临时下车的场景作为乘客上车坐下的场景。
可选的,车机设备900还可以在检测到乘客的关门操作时,将该关门操作作为乘客的坐下操作。
可选的,车机设备900可以在检测到乘客的上车操作后,直接开启访客蓝牙功能。
S1604,电子设备100向车机设备900发送访客蓝牙连接请求。
电子设备100可以在接收到乘客针对第一应用的输入后,广播访客蓝牙连接请求。车机设备900可以在开启访客蓝牙功能后,接收电子设备100广播的访客蓝牙连接请求。
需要说明的是,电子设备100和车机设备900之间的通信连接不限于上述访客蓝牙连接,还可以为其他通信连接,例如,Wi-Fi直连等等。本申请对此不做限定。
S1605,车机设备900向电子设备100发送访客蓝牙连接响应。
车机设备900收到电子设备100发送的访客蓝牙连接请求后,向电子设备100发送访客蓝牙连接响应,与电子设备100建立访客蓝牙连接。可以理解的是,电子设备100收到访客蓝牙连接响应,与车机设备900建立访客蓝牙连接。
在一种可能的实现方式中，为了增加电子设备100和目标车机设备建立访客蓝牙连接的可能性，电子设备100可以在收到的一个或多个访客蓝牙连接响应中，确定出蓝牙信号最强的车机设备，并和该车机设备建立访客蓝牙连接。其中，目标车机设备为乘客上车后和电子设备100在同一辆车上的车机设备。可以理解的是，蓝牙信号越强的车机设备和电子设备100的距离越近。
在一种可能的实现方式中，为了保护电子设备100和车机设备900之间通过访客蓝牙连接传输的数据的安全性，访客蓝牙连接仅可用于传输运动信息请求，运动信息，物品遗漏指示信息和校准信息（例如，传感器的最大误差值，指定获取时间点，获取加速度的时间）。其中，运动信息可以包括但不限于加速度，速度等等。也就是说，当运动信息为加速度时，运动信息请求为加速度请求。
在一些实施例中,电子设备100可以向车机设备900发送包括有指定包头的访客蓝牙连接请求。车机设备900也可以向电子设备100发送包括有指定包头的访客蓝牙连接响应。之后,电子设备100和车机设备900可以继续通过包括有指定包头的数据包,传输加速度。其中,数据包中的数据为加密后的加速度。电子设备100和车机设备900的加解密方式相同。
例如，电子设备100发送的访客蓝牙连接请求为：1001 0000。其中，1001为指定包头。0000为数据包中的数据，可以理解的是，数据包中的数据可以为任意值。在此，以数据值为0000进行撰写。车机设备900接收到访客蓝牙连接请求后，确定出包头为1001，向电子设备100回复访客蓝牙连接响应。例如，该访客蓝牙连接响应可以为：1001 0000。其中，1001为指定包头，0000为数据包中的数据。之后，电子设备100可以向车机设备900发送加速度。例如，电子设备100发送的加速度为：1001 5001。其中，1001为指定包头，5001为加密后的加速度。当电子设备100和车机设备900的加密方式为将原数据进行倒序排列时，车机设备900可以基于5001得到加速度为1.005m/s²。需要说明的是，上述访客蓝牙连接之间的数据包结构、数据加解密方式仅为示例，不对本申请实施例构成限定。
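示例性的，上述包括指定包头和倒序加密数据的数据包的构造与解析可以通过如下代码示意。需要说明的是，该代码仅为假设性示意，包头取值与"倒序排列"的加密方式沿用上文示例，函数名以及"按1/1000 m/s²取整"的编码精度为假设：

```python
HEADER = "1001"  # 指定包头，仅沿用上文示例取值


def encrypt(acc):
    """示例加密：将加速度按 1/1000 m/s² 为单位取整为4位数字后倒序排列。"""
    return f"{round(acc * 1000):04d}"[::-1]


def build_packet(acc):
    """构造包括指定包头和加密后加速度的数据包。"""
    return HEADER + " " + encrypt(acc)


def parse_packet(packet):
    """校验包头并解密数据，还原加速度；包头不符时返回 None。"""
    header, data = packet.split()
    if header != HEADER:
        return None
    return int(data[::-1]) / 1000  # 倒序还原后换算回 m/s²
```

例如，加速度1.005m/s²加密后为5001，对应的数据包即为“1001 5001”。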
在另一些实施例中,电子设备100可以向车机设备900发送包括有指定包头和指定数据段的访客蓝牙连接请求。其中,指定包头为电子设备100和车机设备900从服务器中获取的相同的固定数据段。其中,指定数据段可以为电子设备100随机生成的指定长度的数据段。车机设备900收到该访客蓝牙连接请求后,可以基于加密算法将指定数据段进行加密,并将加密后的指定数据段作为访客蓝牙连接响应的包头。之后,电子设备100可以在确定出访客蓝牙连接响应的包头为加密后的数据段后,和车机设备900建立访客蓝牙连接。之后,电子设备100和车机设备900都可以使用该加密后的数据段作为传输加速度的数据包的包头。可以理解的是,用于传输加速度的数据包中的数据为加密后的加速度。需要说明的是,电子设备100和车机设备900中的加解密算法相同。
例如，电子设备100发送的访客蓝牙连接请求为：1001 0000。其中，1001为指定包头。0000为数据包中的数据。车机设备900接收到访客蓝牙连接请求后，确定出包头为1001，向电子设备100回复访客蓝牙连接响应。当电子设备100和车机设备900针对数据的加密方式为在原数据的值上加1时，车机设备900可以得到访客蓝牙连接响应的包头为0001。该访客蓝牙连接响应可以为：0001 0000。其中，0001为指定包头，0000为数据包中的数据。可以理解的是，数据包中的数据可以为任意值。之后，电子设备100或车机设备900生成的用于发送加速度的数据包的包头为0001，数据为加密后的加速度。例如，当加速度为1.005m/s²时，数据包为0001 1006。需要说明的是，上述访客蓝牙连接之间的数据包结构、数据加解密方式仅为示例，不对本申请实施例构成限定。
电子设备100和车机设备900建立访客蓝牙连接后,可以验证电子设备100和车机设备900是否在同一辆车上,当电子设备100和车机设备900在同一辆车上时,车机设备900才可以通过访客蓝牙连接向电子设备100发送物品遗漏指示信息。
在一种可能的实施例中，电子设备100和车机设备900的运动状态相同时，电子设备100和车机设备900在同一辆车上。具体的，电子设备100可以通过电子设备100的运动信息以及车机设备900的运动信息判断电子设备100的运动状态和车机设备900的运动状态是否相同。当电子设备100的运动信息和车机设备900的运动信息之间的差值小于运动偏差阈值时，判定电子设备100的运动状态和车机设备900的运动状态相同。其中，运动偏差阈值可以为预设的，也可以为基于传感器的误差值得到的，该传感器为用于获取运动信息的传感器。其中，运动信息可以包括但不限于加速度、速度等等。在图13-图16所述的实施例中，运动信息可以通过加速度的形式体现，运动偏差阈值为加速度偏差阈值。示例性的，电子设备100和车机设备900可以通过执行步骤S1606-步骤S1610，确定电子设备100和车机设备900的运动状态是否相同。可以理解的是，不限于加速度，电子设备100和车机设备900之间可以基于电子设备100和车机设备900的速度，确定电子设备100和车机设备900的运动状态是否相同。
S1606,电子设备100向车机设备900发送加速度请求。
电子设备100收到访客蓝牙连接响应后,可以向车机设备900发送加速度请求。该加速度请求可以用于指示车机设备900将获取的加速度发送至电子设备100。
S1607,车机设备900基于加速度请求,获取车机设备900的第一加速度。
车机设备900可以在收到加速度请求后,获取车机设备900的第一加速度。
S1608,车机设备900向电子设备100发送第一加速度。
S1609,电子设备100获取电子设备100的第二加速度。
电子设备100可以在向车机设备900发送加速度请求后,获取第二加速度。
S1610,电子设备100判断第一加速度和第二加速度是否相同。
电子设备100在得到第一加速度和第二加速度后,可以比较第一加速度和第二加速度的值是否相同。当电子设备100确定出第一加速度和第二加速度相同时,可以执行步骤S1611。当电子设备100确定出第一加速度和第二加速度不同时,可以向车机设备900发送确认失败信令。电子设备100还可以断开和车机设备900之间的访客蓝牙连接。需要说明的是,电子设备100可以在断开和车机设备900之间的访客蓝牙连接后,继续广播访客蓝牙连接请求,直至电子设备100和目标车机设备建立访客蓝牙连接。
在一种可能的实现方式中,电子设备100和车机设备900确定出电子设备100和车机设备900的加速度不同后,电子设备100还可以记录车机设备900的标识信息(例如,车机设备900的蓝牙设备名称)。在预设禁止接入时间内(例如,1小时内),电子设备100可以在基于标识信息确定出建立访客蓝牙连接的设备为车机设备900时,不与车机设备900建立访客蓝牙连接。
进一步的,电子设备100和车机设备900获取加速度时,可以记录获取该加速度的时间。这样,可以比较同一时间点获取的加速度,避免由于获取加速度的时间点不同导致电子设备100和车机设备900的加速度不同。
可选的,加速度请求中可以包括有指定获取时间点。其中,指定获取时间在电子设备100发送加速度请求的时间点之后。这样,电子设备100和车机设备900可以在指定获取时间点获取加速度,让判定结果更加准确。
在一些实施例中,电子设备100和车机设备900的时间不同步。电子设备100和车机设备900在传输加速度之前,还可以进行时间的校准。例如,电子设备100和车机设备900可以通过卫星或蜂窝网络进行时间的同步。
需要说明的是，由于电子设备100和车机设备900的传感器不同，电子设备100的加速度和车机设备900的加速度可能有偏差。故而，电子设备100可以在电子设备100的加速度和车机设备900的加速度之间的差值的绝对值不超过加速度偏差阈值时，判定电子设备100的加速度和车机设备900的加速度相同。
进一步的，电子设备100可以向车机设备900发送多次加速度请求，直到电子设备100发送加速度请求的次数达到预设次数（例如，3次）。电子设备100可以在每一次获取到车机设备900的第一加速度后，判断第一加速度和第二加速度是否相同。当电子设备100判定出第一加速度和第二加速度相同，并且电子设备100已经发送的加速度请求的次数小于预设次数时，可以相隔预设时长（例如，1分钟）后，再次向车机设备900发送加速度请求。当电子设备100判定出第一加速度和第二加速度相同，并且电子设备100已经发送的加速度请求的次数等于预设次数时，可以向车机设备900发送确认成功信令。当电子设备100判定出第一加速度和第二加速度不同时，直接向车机设备900发送确认失败信令，并断开访客蓝牙连接。
或者,电子设备100可以向车机设备900发送预设次数(例如,3次)加速度请求。电子设备100可以在判定出第一加速度和第二加速度相同的次数达到预设次数阈值(例如,2次)时,判定电子设备100的加速度和车机设备900的加速度相同,预设次数阈值的值小于或等于预设次数。可选的,电子设备100每次加速度请求的时间点之间相隔预设时长。
也就是说,电子设备100和/或车机设备900可以在连续N次判定出电子设备100的加速度和车机设备900的加速度相同时,确定出电子设备100和车机设备900的运动状态相同。或者,电子设备100和/或车机设备900可以连续M次判断电子设备100的加速度和车机设备900的加速度是否相同,并在确定出电子设备100的加速度和车机设备900的加速度相同的次数大于等于N次时,确定出电子设备100和车机设备900的运动状态相同。
更进一步的,电子设备100的加速度请求之间相隔的预设时长不同。具体的,电子设备100可以在发送第1次加速度请求后,相隔预设时长A发送第2次加速度请求,再相隔预设时长B发送第3次加速度请求,预设时长B的值和预设时长A的值不同。例如,预设时长A的值为1分钟,预设时长B的值为2分钟。
可选的,车机设备900可以在接收到加速度请求后,向电子设备100发送第一加速度列表。电子设备100也可以获取第二加速度列表。电子设备100可以基于第一加速度列表和第二加速度列表,判断电子设备100的加速度和车机设备900的加速度是否相同。其中,第一加速度列表包括有多个加速度。同理,第二加速度列表包括有多个加速度。电子设备100可以将第一加速度列表中的多个加速度和第二加速度列表中的多个加速度依次进行对比,并记录对比相同的次数。电子设备100可以使用对比相同的次数除以对比的总次数,得到通过率。当通过率达到预设通过阈值(例如,0.8)时,电子设备100判定第一加速度列表和第二加速度列表相同。可以理解的是,电子设备100可以向车机设备900发送多次加速度请求。
例如,第一加速度列表为{1.005,1.343,1.532,1.793,1.935},第二加速度列表为{1.005,1.343,1.532,1.789,1.935}时,通过率为0.8,判定第一加速度列表和第二加速度列表相同。
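示例性的，上述逐项对比两个加速度列表并计算通过率的过程可以通过如下代码示意（仅为假设性示意，预设通过阈值沿用上文示例的0.8）：

```python
def lists_match(list_a, list_b, pass_threshold=0.8):
    """逐项对比两个加速度列表，计算通过率并与预设通过阈值比较（假设性示意）。

    通过率 = 对比相同的次数 / 对比的总次数。
    """
    same = sum(1 for a, b in zip(list_a, list_b) if a == b)
    total = min(len(list_a), len(list_b))
    return total > 0 and same / total >= pass_threshold
```

例如，对上文示例中仅有一项不同的两个列表，通过率为0.8，判定两个列表相同。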
进一步可选的，第一加速度列表和第二加速度列表还包括每个加速度对应的获取时间。示例性的，第一加速度列表可以为{<193532,1.005>,<193537,1.343>,<193542,1.532>,……,<193603,1.935>}。其中，<193532,1.005>中的193532用于指示获取时间为“19:35:32”，1.005用于指示车机设备900获取的加速度为1.005m/s²。电子设备100可以只对比加速度列表中时间相同的加速度，并计算得到通过率。可选的，加速度请求中可以包括指定获取时间点。例如，加速度请求可以包括多个指定获取时间点，电子设备100和车机设备900可以在多个指定获取时间点获取加速度。再例如，加速度请求可以包括开始获取时间点、结束获取时间点和获取时间间隔。其中，开始获取时间点和结束获取时间点之间的时间差为获取时间间隔的整数倍。电子设备100和车机设备900可以在开始获取时间点和结束获取时间点之间，每相隔获取时间间隔获取加速度，得到加速度列表。
可选的,上述发送加速度请求和判断电子设备100的加速度和车机设备900的加速度是否相同的操作可以由车机设备900执行。
在一种可能的实现方式中,电子设备100和车机设备900可以在建立访客蓝牙连接后,每相隔预设时间,向对方传输加速度(即,电子设备100每相隔预设时间向车机设备900发送第二加速度,车机设备900每相隔预设时间向电子设备100发送第一加速度)。电子设备100和车机设备900都判断双方的加速度是否相同。当电子设备100和车机设备900在连续判定出第一加速度和第二加速度相同的次数达到预设次数时,或者,当电子设备100和车机设备900在连续判断预设次数后,得到的判定结果中第一加速度和第二加速度相同的次数达到预设次数阈值时,判定电子设备100和车机设备900的加速度相同。电子设备100和车机设备900都可以在判定出电子设备100和车机设备900的加速度不同时,断开访客蓝牙连接。
需要说明的是,上述实施例中描述的判断电子设备100的加速度和车机设备900的加速度是否相同的多种方式可以相互结合使用,本申请实施例对此不做限定。
S1611,电子设备100向车机设备900发送确认成功信令。
当电子设备100判定出第一加速度和第二加速度相同时，电子设备100可以确定电子设备100和车机设备900的加速度相同，即，电子设备100和车机设备900处于同一辆车。电子设备100可以向车机设备900发送确认成功信令。确认成功信令可以用于指示车机设备900不断开访客蓝牙连接。
需要说明的是，上述判断电子设备100和车机设备900的运动状态是否相同的步骤可以由车机设备900执行。也就是说，车机设备900可以接收电子设备100的运动信息，并基于电子设备100的运动信息和车机设备900的运动信息，判断电子设备100和车机设备900的运动状态是否相同。当车机设备900基于电子设备100的运动信息，判定出电子设备100和车机设备900的运动状态相同时，保持和电子设备100的通信连接。进一步的，当车机设备900基于电子设备100的运动信息，判定出电子设备100和车机设备900的运动状态相同时，车机设备900可以向电子设备100发送确认成功信令，确认成功信令可以用于指示电子设备100保持和车机设备900之间的通信连接。当车机设备900基于电子设备100的运动信息，判定出电子设备100和车机设备900的运动状态不同时，断开和电子设备100的通信连接。进一步的，当车机设备900基于电子设备100的运动信息，判定出电子设备100和车机设备900的运动状态不同时，车机设备900可以向电子设备100发送确认失败信令，确认失败信令可以用于指示电子设备100断开通信连接。
在一种可能的实现方式中,电子设备100可以通过第一应用的服务器,获取目标车机设备(例如,车机设备900)的蓝牙标识,并在广播访客蓝牙连接请求中携带该蓝牙标识。目标车机设备收到访客蓝牙连接请求后,可以在确定出访客蓝牙连接请求中携带的蓝牙标识和目标车机设备的蓝牙标识相同时,向电子设备100发送访客蓝牙连接响应。电子设备100接收到目标车机设备的访客蓝牙连接响应后,可以和目标车机设备建立访客蓝牙连接,并通过该访客蓝牙连接接收物品遗漏指示信息。
在另一种可能的实现方式中，电子设备100可以通过第一应用的服务器，将电子设备100的蓝牙标识发送给目标车机设备（例如，车机设备900）。目标车机设备收到访客蓝牙连接请求后，可以将携带有电子设备100的蓝牙标识的访客蓝牙连接响应发送至电子设备100。电子设备100可以在确定出访客蓝牙连接响应携带的蓝牙标识为电子设备100的蓝牙标识时，和发送该访客蓝牙连接响应的目标车机设备建立访客蓝牙连接。电子设备100可以通过该访客蓝牙连接接收物品遗漏指示信息。
S1612,车机设备900检测到乘客的下车操作,获取乘客下车后的车内图像。
车机设备900可以在检测到乘客下车后,获取乘客下车后的车内图像(又称为下车后车内图像)。
在一些实施例中,车机设备900包括车门传感器时,车机设备900可以在通过车门传感器检测到乘客的开门操作后,触发车机设备900通过压力传感器和/或车内摄像头检测乘客是否下车。车机设备900可以在检测到乘客下车后,执行步骤S1613。
可选的,当车机设备900不包括车门传感器时,车机设备900可以在接收到确认成功信令后,相隔预设时间(例如,1ms)获取车内图像,并基于车内图像判断乘客是否下车。也就是说,车机设备900可以在识别出车内图像中不包括乘客的图像时,判定出乘客下车。当车机设备900判断出乘客下车后,可以执行步骤S1613。
S1613,车机设备900基于上车前车内图像和下车后车内图像,判断是否有物品遗漏。
车机设备900可以通过图像识别算法,识别得到下车后车内图像中的物品信息。车机设备900可以对比上车前车内图像中的物品和下车后车内图像中的物品是否相同。当车机设备900确定出上车前车内图像中的物品和下车后车内图像中的物品相同时,判定出没有物品遗漏。当车机设备900确定出上车前车内图像中的物品和下车后车内图像中的物品不同时,判定出有物品遗漏(即,乘客的物品遗留在车内)。
示例性的,图17B示出了车机设备900获取的下车后车内图像。在乘客已经打开车门,已经下车时,车内不仅包括司机的物品,还包括乘客的物品。车机设备900可以通过下车后车内图像得到车内的物品清单为{<瓶子,1>,<袋子,1>}。车机设备900可以确定出上车前车内图像的物品清单(参见上述图17A所示实施例)和下车后车内图像的物品清单不相同,判定出有物品遗漏。
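示例性的，上述对比上车前和下车后物品清单、判断是否有物品遗漏的过程可以通过如下代码示意（仅为假设性示意，物品清单以“标识-数量”的形式组织，沿用上文物品清单示例）：

```python
from collections import Counter


def leftover_items(before, after):
    """对比上车前与下车后车内图像识别出的物品清单，
    得到乘客可能遗留在车内的物品（假设性示意）。

    before/after: {物品标识: 数量}；返回下车后多出的物品及数量，
    为空时判定没有物品遗漏。
    """
    return dict(Counter(after) - Counter(before))
```

例如，上车前物品清单为{<瓶子,1>}，下车后物品清单为{<瓶子,1>,<袋子,1>}时，判定出袋子为遗留物品。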
可选的,车机设备900可以直接通过图像对比方法(例如,像素对比),对比上车前车内图像和下车后车内图像是否相同。当车机设备900判定出上车前车内图像和下车后车内图像相同时,判定出没有物品遗漏。当车机设备900判定出上车前车内图像和下车后车内图像不同时,判定出有物品遗漏。
车机设备900判定出有物品遗漏时,可以执行步骤S1614和步骤S1616。
车机设备900判定出没有物品遗漏时,可以和电子设备100断开访客蓝牙连接。
S1614,车机设备900向电子设备100发送物品遗漏指示信息。
其中,物品遗漏指示信息用于指示电子设备100执行步骤S1615。
可选的,车机设备900可以在判定出物品遗漏后,在检测到乘客的关门操作后,才执行步骤S1614。或者,车机设备900可以在判定出物品遗漏后,检测乘客的关门操作。并在检测到乘客的关门操作后,获取车内图像,并在判定出车内图像中不包括乘客的图像时,执行步骤S1614。其中,车机设备900检测乘客的关门操作的详细描述可以参见上述步骤S1603所示的检测乘客的开门操作的描述,在此不再赘述。
可选的,车机设备900可以将上车前车内图像和下车后车内图像通过访客蓝牙连接发送给电子设备100,电子设备100再基于上车前车内图像和下车后车内图像判断乘客的物品是否遗留在车上。
S1615,电子设备100显示第二遗漏提示信息。
电子设备100在接收到物品遗漏指示信息后,可以显示第二遗漏提示信息。第二遗漏提示信息可以用于提示乘客有物品遗留在车内。例如,电子设备100可以在接收到物品遗漏指示信息后,显示如图15E所示的提示框1541。
可选的,电子设备100在接收到物品遗漏指示信息后,可以通过显示文字,振动,播放动画,播报语音,显示图片等方式中的一种或多种方式,提示乘客有物品遗留在车内。
S1616,车机设备900播报第一遗漏提示信息。
其中,第一遗漏提示信息用于提示司机乘客的物品遗留在车内。
在一种可能的实现方式中,车机设备900可以在检测到乘客的开门操作时,获取上车前车内图像。车机设备900可以在检测到乘客下车后,获取下车后车内图像。车机设备900可以在基于上车前车内图像和下车后车内图像,判定出乘客的物品遗留在车内时,播报第一遗漏提示信息。这样,可以不用和电子设备100建立访客蓝牙连接。
在一些应用场景中,由于电动汽车节能、环保的特点,电动汽车成为很多用户出行的选择。但是,当充电汽车电量低时,用户需要自行搜索各个充电站的信息,再基于充电站信息筛选出合适的充电站,去该充电站给电动汽车充电。这样,用户给电动汽车充电的操作繁琐,并且会耗费用户的时间。因此,本申请实施例提供了一种检测方法,当电子设备100检测到待充电场景时,可以通过服务器1000获取充电站信息,通过车机设备900获取充电汽车信息。电子设备100可以基于充电站信息和充电汽车信息,得到充电服务信息。充电服务信息包括一个或多个充电站选项,一个充电站选项对应一个充电站,充电站选项指示的充电站为包括有车机设备900可以使用的充电设备,并且车机设备900可以在电量消耗完之前到达的充电站。充电站选项包括充电价格,充电时间等信息。其中,一个或多个充电站选项中包括第一充电站选项。当电子设备100接收到用户针对第一充电站选项的输入后,可以显示到第一充电站的导航信息。电子设备100还可以向服务器1000发送充电服务预约请求。这样,用户可以快速选中并到达可用的充电站。
进一步的,服务器1000检测到车机设备900驶入第一充电站后,可以获取车机设备900的停泊位置信息。该停泊位置信息可以用于指示车机设备900所处的停车区域。服务器1000还可以向电子设备100发送确认充电提示,电子设备100收到确认充电提示后,可以显示开始充电控件。电子设备100可以在接收到用户针对开始充电控件的输入后,向服务器1000发送开始充电请求。服务器1000可以在收到开始充电请求后,向充电设备1100发送停泊位置信息。充电设备1100可以基于停泊位置信息,抵达车机设备900所在位置,并给车机设备900充电。当充电设备1100开始给车机设备900充电后,可以通过服务器1000给电子设备100发送车辆充电信息。车辆充电信息可以包括车机设备900的电量。电子设备100可以显示该车辆充电信息。这样,用户可以实时查看车机设备900的充电情况。
接下来介绍本申请实施例提供的通信系统30。通信系统30包括有电子设备100和车机设备900。其中，电子设备100和车机设备900之间建立有通信连接（例如，蓝牙连接）。电子设备100和车机设备900之间可以通过该通信连接传输数据。其中，车机设备900为电动汽车或组成电动汽车的装置。车机设备900可以包括但不限于车载摄像头等。车机设备900可以用于获取电动汽车的数据（例如，充电汽车的剩余电量，车头前方的图像等）。其中，电子设备100可以为手持式电子设备，可穿戴式设备等等，具体的，电子设备100的硬件结构可以参见图1所示实施例，在此不再赘述。需要说明的是，在下述实施例中，本申请实施例将以车机设备900作为充电汽车进行撰写。
接下来介绍本申请实施例提供的一组界面示意图。
示例性的,如图18A所示,电子设备100可以显示有桌面1801,该桌面1801包括多个应用图标(例如,汽车充电应用图标)。该桌面1801中还可以包括一个或多个卡片组件(例如,充电服务卡片1802)。其中,卡片组件(又称为卡片)可以用于显示指定功能信息,该指定功能信息可以用于触发电子设备100执行该功能信息指示的操作(例如,触发电子设备100在卡片组件中显示指定功能信息对应的页面)。卡片可以显示在桌面或其他指定快捷界面(例如负一屏、服务中心等等)。其中,充电服务卡片1802上可以显示有用于提供汽车充电服务的功能信息。例如,充电服务卡片1802中可以用于触发电子设备100显示车机设备900的电量信息、充电服务信息等等。
当电子设备100检测到待充电场景时,可以通过服务器1000获取充电站信息,通过车机设备900获取充电汽车信息。电子设备100可以基于充电站信息和充电汽车信息,得到充电服务信息。充电服务信息包括一个或多个充电站选项,其中,一个或多个充电站选项中包括第一充电站选项。电子设备100得到充电服务信息后,可以显示如图18B所示的充电信息栏1804。
其中,电子设备100的得到充电服务信息的具体实施例可以参见图19所示实施例,在此不再赘述。其中,充电站选项可以包括但不限于充电站的标识信息,预计充电时长,预计充电费用和待行驶距离。其中,充电站的标识信息可以用于指示充电站。预计充电时长可以用于表征车机设备900充电的时间,预计充电费用可以用于表征给车机设备900充满电所需要的费用。待行驶距离可以用于指示车机设备900到充电站的距离。
需要说明的是,电子设备100还可以基于预计充电费用,预计充电时长,待行驶距离这些参数中的一个或多个,得到各个充电站选项的优先级。电子设备100可以按照优先级的高低从最靠近状态栏的位置到最远离状态栏的位置依次显示各个充电站选项。其中,电子设备100可以在最接近状态栏的位置显示优先级最高的充电站选项。例如,电子设备100可以设置预计充电时长最短的充电站选项的优先级最高。
示例性的,如图18B所示,充电服务卡片1802上显示有剩余电量信息1803,充电站信息栏1804。其中,剩余电量信息1803可以用于指示车机设备900的剩余电量。充电站信息栏1804可以包括一个或多个充电站选项。该一个或多个充电站选项包括充电站选项1804A。充电站选项可以包括但不限于充电站的名称,预计充电时长,预计充电费用,待行驶距离等。在此,电子设备100可以接收用户针对充电站信息栏1804的滑动输入(例如,上滑),显示不同的充电站选项。其中,充电站选项1804A可以用于指示充电站A,例如,充电站选项1804A中显示有充电站A的名称,充电站A的预计充电时长为1小时,充电站A的预计充电费用为20元,车机设备900到充电站A之间的待行驶距离为1.2km。可选的,充电服务卡片1802还可以包括充电提示信息,该充电提示信息可以用于提示用户车机设备900需要充电。该充电提示信息可以为文字类提示信息,动画类提示信息,语音类提示信息中的一种或多种。例如,充电提示信息可以为文字类提示:“当前电量较低,请尽快充电”。
可选的,电子设备100可以在充电服务卡片1802中仅显示优先级最高的充电站选项。电子设备100还可以在充电服务卡片1802上显示更多控件。该更多控件可以用于触发电子设备100跳转显示充电服务界面,该充电服务界面可以用于显示充电站选项。
电子设备100可以在接收到用户针对充电站选项1804A的输入后,响应于该输入,向服务器1000发送充电服务预约请求。该充电服务预约请求包括汽车标识信息和充电站标识信息。其中,汽车标识信息用于指示车机设备900,充电站标识信息用于指示充电站A。服务器1000收到充电服务预约请求后,可以基于充电站标识信息确定出充电设备1100。服务器1000可以将汽车标识信息发送至充电设备1100,充电设备1100可以在车机设备900到达充电站A后,给车机设备900充电。
电子设备100还可以在接收到用户针对充电站选项1804A的输入(例如单击)后,响应于该输入,显示如图18C所示的导航图像1813。如图18C所示,充电服务卡片1802可以显示有预约成功提示信息1811,待行驶距离信息1812和导航图像1813。其中,预约成功提示信息1811可以用于提示用户可以到充电站A给车机设备900充电。例如,预约成功提示信息1811可以为文字类提示信息:“预约充电服务成功”。其中,待行驶距离信息1812可以用于提示用户当前位置到充电站A的距离(例如,1km)。其中,导航图像1813可以用于显示当前位置到充电站A的行驶路线。
可选的,电子设备100可以在接收到用户针对充电站选项1804A的输入后,响应于该输入,跳转显示地图应用的地图界面,并在该地图界面中显示当前位置到充电站A的导航地图。
当服务器1000检测到车机设备900到达充电站A后,可以获取车机设备900的停泊位置信息,该停泊位置信息可以用于指示车机设备900在充电站A中的位置。服务器1000还可以向电子设备100发送开始充电请求,电子设备100收到该开始充电请求后,可以显示如图18D所示的开始充电控件1822。
如图18D所示,电子设备100可以在充电服务卡片1802上显示开始充电控件1822。其中,该开始充电控件1822可以用于触发电子设备100向服务器1000发送开始充电响应。可选的,充电服务卡片1802上还可以显示有确认充电提示1821。该确认充电提示1821可以用于提示用户是否开始充电。例如,该确认充电提示1821可以为文字类提示信息:“到达充电站A,是否开始充电”。可选的,充电服务卡片1802上还可以显示有稍后询问控件。该稍后询问控件可以用于触发电子设备100显示如图18A所示的充电服务卡片1802,并在预设时间后(例如,5分钟后),再次显示如图18D所示的充电服务卡片1802。可选的,充电服务卡片1802上还可以显示有拒绝充电控件,拒绝充电控件可以用于触发电子设备100向服务器1000发送拒绝充电响应,服务器1000可以通知充电设备1100取消给车机设备900充电。
当电子设备100接收到用户针对开始充电控件1822的输入后,响应于该输入,可以向服务器1000发送开始充电响应。服务器1000接收到开始充电响应后,可以将停泊位置信息发送给充电设备1100。充电设备1100接收到停泊位置信息后,可以前往停泊位置信息指示的地点。充电设备1100到达停泊位置信息指示的地点后,还可以通过汽车标识信息确认该地点停泊的车辆是否为车机设备900。当充电设备1100确定出车机设备900后,可以开始给车机设备900充电。充电设备1100开始给车机设备充电后,可以通过服务器1000给电子设备100发送车辆充电信息,该车辆充电信息包括车机设备900的电量。电子设备100收到车辆充电信息后,可以显示如图18E所示的充电服务卡片1802。
如图18E所示，充电服务卡片1802中显示有车辆充电提示信息1831，车辆充电提示信息1831可以包括文字类提示信息，图片类提示信息，动画类提示信息，语音类提示信息中的一种或多种。车辆充电提示信息1831可以用于提示用户车机设备900正在充电。可选的，车辆充电提示信息1831还可以用于提示用户车机设备900的实时电量。可选的，车辆充电提示信息1831还可以用于提示用户车机设备900的充电时间。例如，车辆充电提示信息1831可以包括文字类提示信息：“正在充电，预计1h后完成充电”，车辆充电提示信息1831还可以包括文字类提示信息：“当前电量：20%”。可选的，充电服务卡片1802中还可以显示有取消充电控件1832，该取消充电控件1832可以用于触发电子设备100向服务器1000发送取消充电信息。服务器1000收到取消充电信息后，可以通知充电设备1100停止给车机设备900充电。
需要说明的是,电子设备100可以不限于以卡片的形式显示如图18A-图18E所示的充电服务卡片1802中显示的内容。例如,电子设备100可以在汽车充电应用的界面中显示充电服务卡片1802中显示的内容,本申请实施例对此不作限定。
可以理解的是,为了电子设备100可以在充电服务卡片1802中显示车机设备900的电量,充电设备1100可以每隔预设时间(例如,1s),向电子设备100发送车机设备900的电量。或者,车机设备900可以在电量的数值发生变化时,例如,从20%变化至21%,向电子设备100发送车机设备900的电量。
这样,电子设备100可以显示用户可以使用的充电站对应的充电站选项,并在用户选中某个充电站选项后,显示到该充电站的导航信息。电子设备100还可以实时显示车机设备900的电量,用户可以实时查看车机设备900的充电情况。
在一种可能的实现方式中,当车机设备900包括有显示屏和输入装置(例如,触摸屏,机械按键等)时,上述电子设备100执行的操作可以由车机设备900执行。这样,用户可以直接在车载显示屏上查看充电服务相关信息。可选的,当电子设备100检测到用户离开车机设备900时,可以向车机设备900获取上述车辆充电信息,并基于车辆充电信息显示车辆充电提示信息。这样,用户可以在车机设备900充电的过程中,离开车机设备900所在的充电站。并且,用户可以通过电子设备100了解车机设备900的充电情况。
接下来介绍本申请实施例提供的一种检测方法的流程示意图。
示例性的,如图19所示,该方法包括:
S1901,电子设备100检测到待充电场景。
其中,待充电场景可以包括但不限于低电量场景,停车场场景,目的地场景等等。
电子设备100可以相隔预设时间(例如,1秒)获取车机设备900的电量,当电子设备100确定出车机设备900的电量低于预设电量阈值(例如,20%)时,确定出当前场景为低电量场景。
电子设备100还可以通过车机设备900的车载摄像头(例如,行车记录仪)获取前方道路图像,并通过图像识别算法,识别前方道路图像中是否包括停车场入口信息(例如,停车场标志等等)。当电子设备100识别得到前方道路图像中包括停车场入口信息时,可以确定出当前场景为停车场场景。或者,电子设备100可以通过全球导航定位系统获取车机设备900的位置信息,还可以通过地图服务器获取车机设备900附近的停车场位置信息。当电子设备100确定出车机设备900和停车场的距离小于指定距离阈值(例如,10米)时,确定出当前场景为停车场场景。
电子设备100还可以存储有用户的历史停车地点(例如,工作地点)。当电子设备100检测到车机设备900距离历史停车地点的距离小于指定距离阈值时,确定出当前场景为目的地场景。或者,电子设备100可以获取用户输入的目的地地址,当电子设备100检测到车机设备900距离目的地的距离小于指定距离阈值时,确定出当前场景为目的地场景。
可选的，电子设备100获取到用户输入的目的地地址后，可以判断车机设备900到达目的地消耗的电量，并比较消耗的电量是否大于车机设备900剩余电量，当电子设备100确定出车机设备900到达目的地所消耗的电量大于车机设备900的剩余电量时，可以计算得到消耗电量和剩余电量的差值。电子设备100可以在车机设备900已消耗的电量大于该消耗电量和剩余电量的差值时，获取车机设备900的行驶路线附近的充电站信息，并基于该行驶路线附近的充电站信息和充电汽车信息得到并显示充电服务信息。或者，电子设备100可以在车机设备900的剩余电量小于车机设备900在剩余路程上消耗的电量时，获取车机设备900的行驶路线附近的充电站信息，并基于该行驶路线附近的充电站信息和充电汽车信息得到并显示充电服务信息。
在一种可能的实现方式中,电子设备100可以获取用户前往的目的地信息,该目的地信息包括目的地地址和到达目的地的路线。例如,该目的地路线可以为电子设备100基于电子设备100的位置和目的地地址,从地图服务器获取的。电子设备100确定出车机设备900的电量低于所述车机设备900按照到达目的地的路线行驶至目的地地址消耗的电量后,电子设备100获取一个或多个充电站的充电信息。
电子设备100检测到上述待充电场景后,可以执行步骤S1902和S1903。需要说明的是,本申请实施例对步骤S1902和步骤S1903的执行顺序不做限定,例如,电子设备100可以先执行步骤S1902,或者,电子设备100可以先执行步骤S1903,或者,电子设备100可以同步执行步骤S1902和步骤S1903。
在一种可能的实现方式中,电子设备100可以不执行步骤S1901,并直接执行步骤S1902-步骤S1904。
S1902,电子设备100从服务器1000处获取充电站信息(包括第一充电站的信息)。
其中,服务器1000可以为任意存储有多个充电站的充电站信息的服务器,例如,服务器1000可以为上述汽车充电应用对应的服务器。其中,多个充电站包括第一充电站。充电站信息可以包括但不限于充电站的标识信息(例如,名称),充电站中未工作的充电设备的数量,充电站中未工作的充电设备的充电功率,充电站中未工作的充电设备的充电接口型号(例如,五孔三针,九孔两针等),充电站的位置,单位电量的充电费用等等。
服务器1000可以将充电站信息(即,一个或多个充电站的充电信息)发送给电子设备100。
可选的,服务器1000可以只将包括有未工作的充电设备的充电站对应的充电站信息发送至电子设备100。
可选的，电子设备100还可以通过和充电站之间的历史交易记录、基于位置服务（location based services，LBS）、无线信标（Beacon）扫描等，获取充电站信息。
S1903,电子设备100从车机设备900处获取充电汽车信息。
其中,充电汽车信息可以包括但不限于车机设备900的充电接口型号,车机设备900的剩余电量,车机设备900的电池容量,车机设备900的位置,历史充电记录等等。
S1904，电子设备100基于充电站信息和充电汽车信息，得到并显示充电服务信息；充电服务信息包括有一个或多个充电站选项，一个或多个充电站选项中包括第一充电站选项，第一充电站选项对应第一充电站。
其中，充电站选项包括有充电站的标识信息，预计充电时长，预计充电费用和待行驶距离。其中，充电站的标识信息可以用于指示充电站。预计充电时长可以用于表征车机设备900充电的时间，预计充电费用可以用于表征给车机设备900充满电所需要的费用。待行驶距离可以用于指示车机设备900到充电站的距离。具体的，电子设备100得到充电站选项的描述如下：
首先,电子设备100可以基于充电站的未工作的充电设备的数量、充电设备的充电接口型号和车机设备900的充电接口型号,筛选出未工作的充电设备的数量大于零,并且充电设备的充电接口型号包括有车机设备900的接口型号的一个或多个充电站。
之后,电子设备100再在筛选得到的一个或多个充电站中,基于该一个或多个充电站的位置和车机设备900的位置,得到电子设备100和该一个或多个充电站的距离(又称为待行驶距离)。电子设备100还可以基于车机设备900的剩余电量,计算得到车机设备900在没电之前可以行驶的距离(又称为可行驶距离)。电子设备100可以在该一个或多个充电站中筛选出待行驶距离小于可行驶距离的充电站。在此,可以将筛选得到待行驶距离小于可行驶距离的充电站称为预选充电站。
之后,电子设备100可以基于预选充电站的未工作的充电设备的充电功率和单位电量的充电费用,以及车机设备900的剩余电量和电池容量,计算得到在每个预选充电站充电所需时间(即,预计充电时长)和费用(即,预计充电费用)。
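示例性的，上述筛选充电站并计算预计充电时长和预计充电费用的过程可以通过如下代码示意。需要说明的是，该代码仅为假设性示意，其中的字段名均为假设；充电时长按"所需电量/充电功率"估算，充电费用按"所需电量×单位电量的充电费用"估算：

```python
def charging_options(car, stations):
    """筛选充电站并估算预计充电时长与费用（假设性示意）。

    car: {'connector': 充电接口型号, 'battery_kwh': 电池容量,
          'remaining_kwh': 剩余电量, 'range_km': 可行驶距离}
    stations: [{'name', 'idle_count', 'connectors', 'power_kw',
                'price_per_kwh', 'distance_km'}, ...]
    """
    need_kwh = car['battery_kwh'] - car['remaining_kwh']  # 充满所需电量
    options = []
    for s in stations:
        if s['idle_count'] <= 0:
            continue  # 无未工作的充电设备
        if car['connector'] not in s['connectors']:
            continue  # 充电接口型号不匹配
        if s['distance_km'] >= car['range_km']:
            continue  # 电量消耗完之前无法到达
        options.append({
            'name': s['name'],
            'distance_km': s['distance_km'],          # 待行驶距离
            'hours': need_kwh / s['power_kw'],        # 预计充电时长
            'fee': need_kwh * s['price_per_kwh'],     # 预计充电费用
        })
    return options
```

例如，按上述假设的字段组织数据时，可以筛选出包括可用充电设备、接口匹配且可在电量耗尽前到达的充电站，并得到与图18B所示充电站选项类似的预计充电时长和预计充电费用。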
In this way, the electronic device 100 obtains one or more charging station options and may display them. For example, the electronic device 100 may display the one or more charging station options through the charging service card 1802 shown in FIG. 18B.
In one possible implementation, based on the location of the in-vehicle device 900 and the location of each charging station, the electronic device 100 may compute the distance to be traveled from the in-vehicle device 900 to each station. Then, based on that distance and the driving speed of the in-vehicle device 900, it computes the arrival time at each station, and obtains from the server 1000 the information of charging stations that will still have unused charging devices after that arrival time. The electronic device 100 then derives the charging station options from this charging station information and the charging vehicle information. In this way, the electronic device 100 can recommend stations that will have an unused charging device available when the vehicle arrives, which improves the utilization of the charging devices.
It should be noted that the electronic device 100 may derive a priority for the one or more charging station options based on one or more of the estimated charging cost, the estimated charging duration, and the distance to be traveled, and arrange the one or more options on the display of the electronic device 100 according to that priority. For example, the higher an option's priority, the closer it is placed to the status bar on the display of the electronic device 100.
For example, the electronic device 100 may set the priority of the one or more charging station options based on the estimated charging cost: the lower an option's estimated charging cost, the higher its priority.
In some embodiments, the electronic device 100 stores historical charging records, or may obtain historical charging records from the in-vehicle device 900. The historical charging records include information about the stations at which the in-vehicle device 900 previously charged (for example, each station's name and location, and the number of charging sessions at that station). The electronic device 100 may give the highest priority to the option corresponding to the station charged at most often within the area near the in-vehicle device 900 (for example, within a radius of 1 km centered on the in-vehicle device 900).
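One hedged way to realize the prioritization just described is to sort options by estimated cost, breaking ties by how often the vehicle has charged at the station in the past. The dictionary field names here are assumptions for illustration only.

```python
def rank_options(options, history_counts):
    """Sort charging station options by estimated cost (ascending),
    then by historical charging sessions nearby (descending).

    options: list of dicts with 'name' and 'cost' keys (illustrative).
    history_counts: {station name: past charging sessions}."""
    return sorted(
        options,
        key=lambda o: (o["cost"], -history_counts.get(o["name"], 0)),
    )
```

The highest-priority option then sits first in the list, i.e. closest to the status bar in the layout described above.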
It can be understood that the first charging station option includes the identification information of the first charging station, the estimated charging duration, and so on. The first charging station option may be used to trigger the electronic device 100 to select a charging device of the first charging station (for example, the charging device 1100).
S1905: The electronic device 100 receives a user input on the first charging station option.
The input on the first charging station option may include, but is not limited to, a tap, a double tap, a long press, and so on. For example, the input may be an input on the charging station option 1804A shown in FIG. 18B above.
S1906: The electronic device 100 sends a charging service reservation request to the server 1000. The charging service reservation request includes vehicle identification information and charging station identification information, where the vehicle identification information may be used to indicate the in-vehicle device 900, and the charging station identification information may be used to indicate the first charging station.
After receiving the user input on the first charging station option, the electronic device 100 may, in response to that input, send the charging service reservation request to the server 1000. The charging service reservation request includes the vehicle identification information and the charging station identification information, where the vehicle identification information may be used to indicate the in-vehicle device 900. For example, the vehicle identification information may include, but is not limited to, the vehicle's license plate number, model, color, and so on. The charging station identification information indicates the first charging station corresponding to the first charging station option.
S1907: The server 1000 sends the vehicle identification information to the charging device 1100.
After receiving the charging service reservation request sent by the electronic device 100, the server 1000 may determine, based on the charging station identification information, that the in-vehicle device 900 is going to charge at an unused charging device of the first charging station. The server 1000 may send the vehicle identification information to an unused charging device of the first charging station, for example, the charging device 1100.
It should be noted that after the charging device 1100 receives the vehicle identification information, vehicles other than the in-vehicle device 900 cannot use the charging device 1100.
S1908: The electronic device 100 displays navigation information.
After receiving the user input on the first charging station option, the electronic device 100 may, in response to that input, display navigation information to the first charging station corresponding to that option (for example, a navigation route from the location of the electronic device 100 to the first charging station). For example, the first charging station may be charging station A, and the electronic device 100 may display the navigation image 1813 shown in FIG. 18C above after receiving the input on the first charging station option.
S1909: Upon detecting that the in-vehicle device 900 has entered the first charging station, the server 1000 may obtain the parking position information of the in-vehicle device 900.
The server 1000 may detect in various ways whether the in-vehicle device 900 has entered the first charging station. In some embodiments, the server 1000 may use the cameras of the first charging station or an electronic toll collection (ETC) system to detect whether the in-vehicle device 900 has entered the first charging station. Specifically, the server 1000 may obtain, through a camera at the entrance of the first charging station, an image of a vehicle entering the station. Using an image recognition algorithm, the server 1000 may extract the vehicle identification information from the vehicle image and, based on it, confirm whether the vehicle in the image is the in-vehicle device 900. When the server 1000 determines that the vehicle in the image is the in-vehicle device 900, it can conclude that the in-vehicle device 900 has entered the first charging station. Alternatively, the server 1000 may automatically recognize, through ETC, the license plate number of a vehicle entering the first charging station, determine from the plate number whether that vehicle is the in-vehicle device 900, and, when it is, conclude that the in-vehicle device 900 has entered the first charging station.
In other embodiments, the server 1000 may obtain the location of the in-vehicle device 900 through the electronic device 100 at preset intervals and, upon determining that the location of the in-vehicle device 900 overlaps the location of the first charging station, conclude that the in-vehicle device 900 has entered the first charging station. Alternatively, after the in-vehicle device 900 enters the first charging station, the electronic device 100 may send the server 1000 signaling indicating that the in-vehicle device 900 has entered the first charging station; upon receiving this signaling, the server 1000 can conclude that the in-vehicle device 900 has entered the first charging station.
After detecting that the in-vehicle device 900 has entered the first charging station, the server 1000 may obtain the parking position information of the in-vehicle device 900. The parking position information may be used to indicate the position of the in-vehicle device 900 within the first charging station. For example, the parking position information may include one or more of a parking area number, a parking spot number, an indoor positioning fingerprint, an indoor GPS signal, and the like.
The server 1000 may obtain the parking position information of the in-vehicle device 900 in various ways. For example, the server 1000 may obtain, through the cameras of the first charging station, the parking area number and parking spot number of the position where the in-vehicle device 900 is parked. As another example, the electronic device 100 may obtain the parking area number, parking spot number, and so on through the camera of the in-vehicle device 900. As yet another example, the server 1000 may send a position query to the electronic device 100; upon receiving it, the electronic device 100 may display a position prompt, which may be used to ask the user to enter the parking position information (for example, the parking spot number). The electronic device 100 may receive the parking position information entered by the user and send it to the server 1000.
S1910: The server 1000 may send a start-charging request to the electronic device 100.
After detecting that the in-vehicle device 900 has entered the first charging station, the server 1000 may send a start-charging request to the electronic device 100. The start-charging request may be used to instruct the electronic device 100 to display a start-charging control.
S1911: The electronic device 100 may display the start-charging control.
After receiving the start-charging request sent by the server 1000, the electronic device 100 may display the start-charging control. The start-charging control may be used to trigger the electronic device 100 to send a start-charging response to the server 1000.
S1912: The electronic device 100 receives a user input on the start-charging control.
The input on the start-charging control may be a tap, a double tap, a long press, and so on. For example, the input may be an input on the start-charging control 1822 shown in FIG. 18D above.
S1913: The electronic device 100 sends a start-charging response to the server 1000.
After receiving the user input on the start-charging control, the electronic device 100 may, in response to that input, send a start-charging response to the server 1000. The start-charging response may be used to instruct the server 1000 to notify the charging device 1100 to charge the in-vehicle device 900.
Optionally, the electronic device 100 may display the start-charging control while displaying the navigation information. In that case the server 1000 need not detect whether the in-vehicle device 900 has entered the first charging station: the electronic device 100 may send a start-charging request to the server 1000 upon receiving the user input on the start-charging control, and the server 1000 may, upon receiving the start-charging request, determine that the in-vehicle device 900 has entered the first charging station. After making that determination, the server 1000 may obtain the parking position information of the in-vehicle device 900; for how the server 1000 obtains the parking position information, refer to the embodiment described in step S1909 above, which is not repeated here.
S1914: The server 1000 sends the parking position information to the charging device 1100.
After receiving the start-charging response, the server 1000 may send the parking position information to the charging device 1100.
S1915: After arriving at the parking area indicated by the parking position information, the charging device 1100 charges the in-vehicle device 900.
Upon receiving the parking position information, the charging device 1100 may derive from it the position of the in-vehicle device 900 within the first charging station. The charging device 1100 may move to the position of the in-vehicle device 900 and charge the in-vehicle device 900.
Further, after arriving at the position indicated by the parking position information, the charging device 1100 may confirm, based on the vehicle identification information, that the vehicle parked at that position is the in-vehicle device 900, and then charge the in-vehicle device 900.
S1916: The charging device 1100 sends the vehicle charging information to the server 1000.
After connecting its charging connector to the charging connector of the in-vehicle device 900, the charging device 1100 may obtain the vehicle charging information of the in-vehicle device 900 and send the vehicle charging information to the server 1000. The vehicle charging information includes the battery level of the in-vehicle device 900 and may be used to indicate that the in-vehicle device 900 is charging.
S1917: The server 1000 sends the vehicle charging information to the electronic device 100.
After receiving the vehicle charging information sent by the charging device 1100, the server 1000 may forward the vehicle charging information to the electronic device 100.
S1918: The electronic device 100 displays the vehicle charging information.
After receiving the vehicle charging information, the electronic device 100 may display a vehicle charging prompt. The vehicle charging prompt may be used to tell the user that the in-vehicle device 900 is charging, and may also be used to show the user the real-time battery level of the in-vehicle device 900. For example, for the vehicle charging prompt, refer to the embodiment shown in FIG. 18E above, which is not repeated here.
It can be understood that, so that the electronic device 100 can display the battery level of the in-vehicle device 900 in real time, the charging device 1100 may send the vehicle charging information to the electronic device 100 at preset intervals (for example, every 1 s). Alternatively, the in-vehicle device 900 may send the vehicle charging information to the electronic device 100 whenever its battery level value changes, for example, from 20% to 21%.
Optionally, the electronic device 100 may obtain the battery level information directly from the in-vehicle device 900 and display it.
The following describes a detection method provided in the embodiments of this application.
FIG. 20 is a schematic flowchart of a detection method provided in the embodiments of this application.
As shown in FIG. 20, the detection method includes the following steps:
S2001: Obtain a physiological information parameter, a blood alcohol concentration parameter, and a collection time parameter at which the blood alcohol concentration parameter was collected.
S2002: Determine a predicted sober-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter.
S2003: Display the predicted sober-up time.
The above steps may be performed by the electronic device 100 shown in FIGS. 2 to 7 above. For a detailed description of how the electronic device 100 determines the predicted sober-up time, refer to the embodiments shown in FIGS. 2 to 7, which are not repeated here.
Optionally, the step of determining the predicted sober-up time is not limited to the electronic device 100 shown in FIGS. 2 to 7 and may be performed by another electronic device, for example, a cloud server. Optionally, the step of obtaining the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter is likewise not limited to the electronic device 100 shown in FIGS. 2 to 7 and may be performed by another electronic device, for example, the electronic device 200 shown in FIG. 2.
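The embodiments do not fix a particular prediction formula; as one hedged illustration, a linear-elimination (Widmark-style) extrapolation from the measured concentration and its collection time might look as follows. The elimination rate and threshold defaults are assumptions for the sketch, not values from the application.

```python
def predicted_sober_time(bac_mg_per_100ml, collected_at_h,
                         elimination_rate=15.0, threshold=20.0):
    """Hour (on the same clock as collected_at_h) at which blood alcohol
    concentration is predicted to fall below `threshold` mg/100 ml,
    assuming a constant elimination rate in mg/100 ml per hour."""
    if bac_mg_per_100ml <= threshold:
        return collected_at_h  # already below the threshold at collection time
    return collected_at_h + (bac_mg_per_100ml - threshold) / elimination_rate
```

For example, a reading of 80 mg/100 ml collected at 22:00 extrapolates to a predicted sober-up time of 02:00 under these assumed parameters.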
The following describes another detection method provided in the embodiments of this application.
FIG. 21 is a schematic flowchart of a detection method provided in the embodiments of this application.
As shown in FIG. 21, the detection method includes the following steps:
S2101: Obtain behavior data of a user.
S2102: Determine the user's pre-driving fatigue level based on the user's behavior data.
S2103: Determine the user's first recommended driving duration based on the user's pre-driving fatigue level.
S2104: Display the first recommended driving duration.
The above steps may be performed by the electronic device 100 shown in FIGS. 8 to 12 above. For a detailed description of how the electronic device 100 determines the first recommended driving duration, refer to the embodiments shown in FIGS. 8 to 12, which are not repeated here.
Optionally, the step of determining the first recommended driving duration is not limited to the electronic device 100 shown in FIGS. 8 to 12 and may be performed by another electronic device, for example, a cloud server. Optionally, the step of obtaining the user's behavior data is likewise not limited to the electronic device 100 shown in FIGS. 8 to 12 and may be performed by another electronic device, for example, the electronic device 500 shown in FIG. 8. Optionally, the step of displaying the first recommended driving duration is likewise not limited to the electronic device 100 shown in FIGS. 8 to 12 and may be performed by another electronic device, for example, the electronic device 500 shown in FIG. 8.
In one possible implementation, obtaining the user's behavior data specifically includes: obtaining the user's departure time, and obtaining the user's behavior data at a first time before the departure time, where the departure time and the first time differ by a preset duration. Here, the departure time is the departure time shown in FIGS. 8 to 12 above, and the first time is the trigger time shown in FIGS. 8 to 12 above.
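The timing rule in the implementation above can be sketched directly: the behavior data is collected at a first (trigger) time that precedes the departure time by a preset offset. The 30-minute default offset is an assumption for illustration only.

```python
from datetime import datetime, timedelta

def trigger_time(departure: datetime,
                 preset: timedelta = timedelta(minutes=30)) -> datetime:
    """First time at which behavior data is collected: the departure
    time minus a preset offset."""
    return departure - preset
```

With a 09:00 departure and the assumed 30-minute offset, the behavior data would be collected at 08:30.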
The following describes another detection method provided in the embodiments of this application.
FIG. 22 is a schematic flowchart of a detection method provided in the embodiments of this application.
As shown in FIG. 22, the detection method includes the following steps:
S2201: A first electronic device detects a passenger's boarding operation and obtains an in-vehicle image from before the passenger boarded.
S2202: The first electronic device establishes a communication connection with a second electronic device.
S2203: The first electronic device detects the passenger's alighting operation and obtains an in-vehicle image from after the passenger alighted.
S2204: When the first electronic device determines, based on the in-vehicle image from before the passenger boarded and the in-vehicle image from after the passenger alighted, that an item of the passenger has been left in the vehicle, it announces a first left-behind item prompt.
The first left-behind item prompt is used to remind the passenger that an item has been left in the vehicle.
S2205: The first electronic device sends left-behind item indication information to the second electronic device over the communication connection.
S2206: The second electronic device displays a second left-behind item prompt.
The second left-behind item prompt is used to remind the passenger that an item has been left in the vehicle.
The first electronic device may be the in-vehicle device 900 shown in FIGS. 13 to 17B above. For a detailed description of how the in-vehicle device 900 performs the above steps, refer to the embodiments shown in FIGS. 13 to 17B, which are not repeated here.
The second electronic device may be the electronic device 100 shown in FIGS. 13 to 17B above. For a detailed description of how the electronic device 100 performs the above steps, refer to the embodiments shown in FIGS. 13 to 17B, which are not repeated here.
The first electronic device and the second electronic device may form a first communication system.
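The comparison in S2204 can be sketched as a simple frame difference. A real implementation would use object detection over camera frames; as a hedged illustration, images here are plain 2-D lists of gray values, and both thresholds are assumptions, not values from the application.

```python
def item_left_behind(before, after, pixel_thresh=30, count_thresh=5):
    """Return True when enough pixels differ between the pre-boarding and
    post-alighting cabin images to suggest an item was left behind."""
    changed = sum(
        1
        for row_b, row_a in zip(before, after)
        for pb, pa in zip(row_b, row_a)
        if abs(pb - pa) > pixel_thresh
    )
    return changed >= count_thresh
```

Identical before/after images yield False; a region of changed pixels where an item now sits trips the detector.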
The following describes another detection method provided in the embodiments of this application.
FIG. 23 is a schematic flowchart of a detection method provided in the embodiments of this application.
As shown in FIG. 23, the detection method includes the following steps:
S2301: A first electronic device obtains charging information of one or more charging stations.
S2302: Based on the charging information of the one or more charging stations, the first electronic device displays one or more charging station options, the one or more charging station options including a first charging station option.
S2303: Upon receiving an input on the first charging station option, the first electronic device displays first navigation information, where the first navigation information is used to indicate a route from the first electronic device to the charging station corresponding to the first charging station option.
S2304: A server detects that the first electronic device has arrived at the first charging station and obtains parking position information of the first electronic device within the first charging station.
S2305: The server sends the parking position information to a charging device.
S2306: The charging device arrives at the position within the first charging station indicated by the parking position information and charges the first electronic device.
The first electronic device may be the in-vehicle device 900 shown in FIGS. 18A to 19 above. For a detailed description of how the in-vehicle device 900 performs the above steps, refer to the embodiments shown in FIGS. 18A to 19, which are not repeated here.
The server may be the server 1000 shown in FIGS. 18A to 19 above. For a detailed description of how the server 1000 performs the above steps, refer to the embodiments shown in FIGS. 18A to 19, which are not repeated here.
The charging device may be the charging device 1100 shown in FIGS. 18A to 19 above. For a detailed description of how the charging device 1100 performs the above steps, refer to the embodiments shown in FIGS. 18A to 19, which are not repeated here.
The first electronic device, the server, and the charging device may form a second communication system.
In one possible implementation, the first electronic device may be the electronic device 100 shown in FIGS. 18A to 19; in that case, after the charging device arrives at the position within the first charging station indicated by the parking position information, it charges the in-vehicle device 900 shown in FIGS. 18A to 19.
The detection methods shown in FIGS. 20 to 21 above may be used in combination with each other. For example, the electronic device 100 described with respect to FIGS. 20 to 21 may be the same electronic device, and the electronic device 100 may perform the steps of the embodiments shown in FIGS. 20 to 21; this application places no limitation on this.
The above embodiments are intended only to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of this application.

Claims (39)

  1. A detection method, comprising:
    obtaining a physiological information parameter, a blood alcohol concentration parameter, and a collection time parameter at which the blood alcohol concentration parameter was collected;
    determining a predicted sober-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter, wherein the predicted sober-up time is used to indicate the point in time at which a user's blood alcohol concentration falls below a threshold blood alcohol concentration; and
    displaying the predicted sober-up time.
  2. The method according to claim 1, wherein the physiological information parameter comprises one or more of weight, height, age, sex, sleep time, and sleep quality.
  3. The method according to claim 1 or 2, wherein determining the predicted sober-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter specifically comprises:
    determining the predicted sober-up time through an alcohol prediction model based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter.
  4. The method according to any one of claims 1 to 3, wherein before the obtaining of the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter, the method further comprises:
    receiving a first input;
    and obtaining the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter specifically comprises:
    obtaining the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter in response to the first input.
  5. The method according to any one of claims 1 to 4, wherein before the determining of the predicted sober-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter, the method further comprises:
    receiving a second input;
    and determining the predicted sober-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter specifically comprises:
    determining the predicted sober-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter in response to the second input.
  6. The method according to any one of claims 1 to 5, wherein determining the predicted sober-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the collection time parameter specifically comprises:
    obtaining an alcohol intake parameter; and
    determining the predicted sober-up time based on the physiological information parameter, the alcohol intake parameter, the blood alcohol concentration parameter, and the collection time parameter.
  7. The method according to claim 6, wherein obtaining the alcohol intake parameter specifically comprises:
    obtaining, through a camera, an image of the container of the alcohol consumed; and
    determining the alcohol intake parameter based on the container image.
  8. The method according to claim 6 or 7, wherein the alcohol intake parameter comprises an alcohol strength parameter and an alcohol volume parameter, the alcohol strength parameter being used to indicate the strength of the alcohol the user consumed, and the alcohol volume parameter being used to indicate the volume of the alcohol the user consumed.
  9. The method according to any one of claims 6 to 7, wherein determining the predicted sober-up time based on the physiological information parameter, the alcohol intake parameter, the blood alcohol concentration parameter, and the collection time parameter specifically comprises:
    obtaining a predicted alcohol absorption rate and a predicted alcohol metabolism rate through the alcohol prediction model based on the alcohol intake parameter and the physiological information parameter;
    obtaining a correspondence between blood alcohol concentration and time based on the physiological information parameter, the alcohol intake parameter, the predicted alcohol absorption rate, and the predicted alcohol metabolism rate; and
    determining the predicted sober-up time based on the blood alcohol concentration parameter, the collection time parameter, and the correspondence between blood alcohol concentration and time.
  10. A detection method, comprising:
    obtaining behavior data of a user;
    determining the user's pre-driving fatigue level based on the user's behavior data;
    determining the user's first recommended driving duration based on the user's pre-driving fatigue level; and
    displaying the first recommended driving duration.
  11. The method according to claim 10, wherein obtaining the behavior data of the user specifically comprises:
    obtaining the user's departure time; and
    obtaining the user's behavior data at a first time before the departure time, wherein the first time and the departure time differ by a preset duration.
  12. The method according to claim 11, wherein obtaining the user's departure time specifically comprises:
    obtaining the user's schedule information, the schedule information comprising one or more of the user's ticket information, meeting information, and agenda information; and
    obtaining the user's departure time based on the user's schedule information.
  13. The method according to any one of claims 10 to 12, further comprising:
    obtaining the user's physical state data while the vehicle is being driven;
    determining the user's in-driving fatigue level based on the user's physical state data;
    determining the user's final fatigue level based on the user's pre-driving fatigue level and the user's in-driving fatigue level;
    determining a second recommended driving duration based on the user's final fatigue level; and
    displaying the second recommended driving duration.
  14. The method according to claim 13, wherein determining the user's in-driving fatigue level based on the user's physical state data specifically comprises:
    determining the in-driving fatigue level through a second fatigue model based on the user's physical state data, the second fatigue model being trained on the user's historical physical state data.
  15. The method according to claim 13, wherein determining the user's in-driving fatigue level based on the user's physical state data specifically comprises:
    obtaining the user's on-vehicle driving data while the vehicle is being driven; and
    determining the user's in-driving fatigue level based on the user's physical state data and the user's on-vehicle driving data.
  16. The method according to claim 15, wherein determining the user's in-driving fatigue level based on the user's physical state data and the user's on-vehicle driving data specifically comprises:
    determining a second fatigue model based on the user's physical state data; and
    determining the in-driving fatigue level through the second fatigue model based on the user's on-vehicle driving data and the user's physical state data.
  17. The method according to any one of claims 10 to 16, wherein obtaining the behavior data of the user specifically comprises:
    obtaining the user's user data, the user's user data comprising one or more of exercise duration, exercise intensity, and sleep duration; and
    determining the user's behavior data based on the user data.
  18. The method according to any one of claims 10 to 17, wherein determining the user's pre-driving fatigue level based on the user's behavior data specifically comprises:
    determining the user's pre-driving fatigue level through a first fatigue model based on the user's behavior data, wherein the first fatigue model is trained on the user's historical behavior data.
  19. A detection method, applied to a first communication system, wherein the first communication system comprises a first electronic device and a second electronic device, and the method comprises:
    the first electronic device detecting a passenger's boarding operation and obtaining an in-vehicle image from before the passenger boarded;
    the first electronic device establishing a communication connection with the second electronic device;
    the first electronic device detecting the passenger's alighting operation and obtaining an in-vehicle image from after the passenger alighted;
    when the first electronic device determines, based on the in-vehicle image from before the passenger boarded and the in-vehicle image from after the passenger alighted, that an item of the passenger has been left in the vehicle, announcing a first left-behind item prompt, wherein the first left-behind item prompt is used to remind the passenger that an item has been left in the vehicle;
    the first electronic device sending left-behind item indication information to the second electronic device over the communication connection; and
    the second electronic device displaying, based on the left-behind item indication information, a second left-behind item prompt, the second left-behind item prompt being used to remind the passenger of the left-behind item.
  20. The method according to claim 19, wherein the second electronic device is the electronic device with the strongest signal among all electronic devices detected by the first electronic device.
  21. The method according to claim 19 or 20, wherein after the first electronic device establishes the communication connection with the second electronic device, the method further comprises:
    the first electronic device sending motion information of the first electronic device to the second electronic device over the communication connection;
    when the second electronic device determines, based on the motion information of the first electronic device, that the motion state of the first electronic device is the same as the motion state of the second electronic device, the second electronic device sending confirmation-success signaling to the first electronic device; and
    the first electronic device receiving the confirmation-success signaling and maintaining the communication connection with the second electronic device.
  22. The method according to claim 21, wherein the second electronic device determining, based on the motion information of the first electronic device, that the motion state of the first electronic device is the same as the motion state of the second electronic device specifically comprises:
    when the second electronic device determines N consecutive times that the motion information of the first electronic device is the same as the motion information of the second electronic device, determining that the motion state of the first electronic device is the same as the motion state of the second electronic device, wherein N is a positive integer.
  23. The method according to claim 21, wherein the second electronic device determining, based on the motion information of the first electronic device, that the motion state of the first electronic device is the same as the motion state of the second electronic device specifically comprises:
    when the second electronic device, in M determinations of whether the motion information of the first electronic device and the motion information of the second electronic device are the same, finds that the motion information of the first electronic device and the motion information of the second electronic device are the same at least N times, the second electronic device determining that the motion state of the first electronic device is the same as the motion state of the second electronic device, wherein N is less than or equal to M, and M and N are positive integers.
  24. The method according to claim 22 or 23, wherein the motion information of the first electronic device and the motion information of the second electronic device are the same when the difference between the motion information of the first electronic device and the motion information of the second electronic device is smaller than a motion deviation threshold.
  25. The method according to claim 19 or 20, wherein after the first electronic device establishes the communication connection with the second electronic device, the method further comprises:
    the second electronic device sending motion information of the second electronic device to the first electronic device over the communication connection;
    when the first electronic device determines, based on the motion information of the second electronic device, that the motion state of the first electronic device is the same as the motion state of the second electronic device, the first electronic device sending confirmation-success signaling to the second electronic device; and
    the second electronic device receiving the confirmation-success signaling and maintaining the communication connection with the first electronic device.
  26. The method according to claim 21, further comprising:
    when the second electronic device determines, based on the motion information of the first electronic device, that the motion state of the first electronic device differs from the motion state of the second electronic device, the second electronic device disconnecting the communication connection with the first electronic device.
  27. The method according to claim 26, wherein the second electronic device disconnecting the communication connection with the first electronic device specifically comprises:
    the second electronic device sending confirmation-failure signaling to the first electronic device; and
    the first electronic device receiving the confirmation-failure signaling and disconnecting the communication connection with the second electronic device.
  28. The method according to claim 26 or 27, wherein after the second electronic device disconnects the communication connection with the first electronic device, the method further comprises:
    the second electronic device broadcasting a communication connection request.
  29. The method according to any one of claims 19 to 28, wherein the first electronic device establishing the communication connection with the second electronic device specifically comprises:
    the second electronic device broadcasting a communication connection request;
    the first electronic device receiving the communication connection request of the second electronic device;
    the first electronic device sending a communication connection response to the second electronic device; and
    the second electronic device receiving the communication connection response of the first electronic device and establishing the communication connection with the first electronic device.
  30. The method according to any one of claims 19 to 29, wherein the first electronic device establishing the communication connection with the second electronic device specifically comprises:
    after the first electronic device detects that the passenger has sat down in the vehicle, the first electronic device receiving the communication connection request of the second electronic device; and
    the first electronic device sending a communication connection response to the second electronic device and establishing the communication connection with the second electronic device.
  31. A detection method, applied to a second communication system, wherein the second communication system comprises a first electronic device, a server, and a charging device, and the method comprises:
    the first electronic device receiving charging information of one or more charging stations sent by the server;
    the first electronic device displaying, based on the charging information of the one or more charging stations, one or more charging station options, the one or more charging station options comprising a first charging station option;
    the first electronic device, after receiving an input on the first charging station option, displaying first navigation information, the first navigation information being used to indicate a route from the location of the first electronic device to the first charging station corresponding to the first charging station option;
    after the server detects that the first electronic device has arrived at the first charging station, the server obtaining parking position information of the first electronic device within the first charging station;
    the server sending the parking position information to the charging device; and
    the charging device, after arriving at the position of the first charging station indicated by the parking position information, charging the first electronic device.
  32. The method according to claim 31, wherein the first electronic device displaying the one or more charging station options based on the charging information of the one or more charging stations specifically comprises:
    the first electronic device determining the one or more charging station options based on the charging information of the one or more charging stations and the charging information of the first electronic device.
  33. The method according to claim 31 or 32, wherein the one or more charging station options comprise a charging price, a charging time, and an arrival distance, the charging price being used to indicate the fee required to fully charge the first electronic device, the charging time being used to indicate the time required to fully charge the first electronic device, and the arrival distance being used to indicate the distance between the first electronic device and the charging station corresponding to the charging station option.
  34. The method according to any one of claims 31 to 33, wherein the first electronic device receiving the charging information of the one or more charging stations sent by the server specifically comprises:
    when the first electronic device detects a to-be-charged scenario, the first electronic device receiving the charging information of the one or more charging stations sent by the server, the to-be-charged scenario comprising a low-battery scenario and a parking lot scenario, wherein the low-battery scenario is a scenario in which the battery level of the first electronic device is below a preset battery threshold, and the parking lot scenario is a scenario in which the distance between the first electronic device and a nearby parking location is smaller than a specified distance threshold.
  35. The method according to any one of claims 31 to 33, wherein the first electronic device receiving the charging information of the one or more charging stations sent by the server specifically comprises:
    the first electronic device obtaining destination information of the user's trip, the destination information comprising a destination address and a route to the destination; and
    after the first electronic device determines that the battery level of the first electronic device is lower than the power the first electronic device would consume driving to the destination address along the route to the destination, the first electronic device receiving the charging information of the one or more charging stations sent by the server.
  36. The method according to any one of claims 31 to 35, wherein the server sending the parking position information to the charging device specifically comprises:
    the server sending a start-charging request to the first electronic device;
    the first electronic device receiving the start-charging request and displaying a start-charging control;
    the first electronic device, after receiving a fourth input on the start-charging control, sending a start-charging response to the server in response to the fourth input; and
    the server receiving the start-charging response and sending the parking position information to the charging device.
  37. The method according to any one of claims 31 to 36, wherein the communication system further comprises a second electronic device, and after the charging device arrives at the position of the first charging station indicated by the parking position information and charges the first electronic device, the method further comprises:
    the charging device sending vehicle charging information to the second electronic device, the vehicle charging information comprising the battery level of the first electronic device; and
    the second electronic device, after receiving the vehicle charging information, displaying a vehicle charging prompt, the vehicle charging prompt being used to inform the user of the battery level of the first electronic device.
  38. An electronic device, comprising one or more processors, a display, and one or more memories, wherein the display and the one or more memories are coupled to the one or more processors, the one or more memories are configured to store computer program code, the computer program code comprises computer instructions, and when the one or more processors execute the computer instructions, the electronic device is caused to perform the method according to any one of claims 1 to 37.
  39. A computer-readable storage medium comprising instructions, wherein when the instructions are run on a first electronic device, the first electronic device is caused to perform the method according to any one of claims 1 to 37.
PCT/CN2022/141989 2021-12-30 2022-12-26 Detection method and apparatus WO2023125431A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111667026.8A CN116416192A (zh) 2021-12-30 2021-12-30 Detection method and apparatus
CN202111667026.8 2021-12-30

Publications (1)

Publication Number Publication Date
WO2023125431A1 true WO2023125431A1 (zh) 2023-07-06

Family

ID=86997836

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/141989 WO2023125431A1 (zh) 2021-12-30 2022-12-26 一种检测的方法及装置

Country Status (2)

Country Link
CN (1) CN116416192A (zh)
WO (1) WO2023125431A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009162721A (ja) * 2008-01-10 2009-07-23 Toyota Motor Corp Driving assistance device
CN105391867A (zh) * 2015-12-06 2016-03-09 科大智能电气技术有限公司 Charging pile operating method based on mobile phone app reservation authentication and guided payment
US20160349239A1 (en) * 2015-05-29 2016-12-01 Hon Hai Precision Industry Co., Ltd. Electronic device and method for detecting and controlling driving under the influence
CN109927655A (zh) * 2019-04-16 2019-06-25 东风小康汽车有限公司重庆分公司 Driving parameter adjustment method and device, and automobile
CN110505837A (zh) * 2017-04-14 2019-11-26 Sony Corporation Information processing device, information processing method, and program
CN111415347A (zh) * 2020-03-25 2020-07-14 上海商汤临港智能科技有限公司 Left-behind object detection method and apparatus, and vehicle
CN111703368A (zh) * 2020-06-28 2020-09-25 Daimler AG System and method for detecting and reminding of items left in a vehicle


Also Published As

Publication number Publication date
CN116416192A (zh) 2023-07-11


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22914685

Country of ref document: EP

Kind code of ref document: A1