CN116416192A - Detection method and device

Publication number: CN116416192A
Application number: CN202111667026.8A
Authority: CN (China)
Prior art keywords: electronic device, user, vehicle, information, time
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 于金正, 高翔宇, 解文博, 薛波, 卓晓燕, 陈维, 詹舒飞, 朱智超
Assignee (current and original): Huawei Technologies Co., Ltd.
Priority application: CN202111667026.8A; international application PCT/CN2022/141989, published as WO2023125431A1

Classifications

    • G06T 7/0002: Image analysis; inspection of images, e.g. flaw detection
    • A61B 5/145: Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; measuring characteristics of body fluids or tissues
    • A61B 5/14546: Measuring analytes not otherwise provided for, e.g. ions, cytochromes
    • G01N 33/48: Investigating or analysing biological material, e.g. blood, urine
    • G06N 3/02: Computing arrangements based on biological models; neural networks
    • G06T 7/00: Image analysis
    • G06V 20/59: Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06T 2207/20081: Indexing scheme for image analysis; training, learning
    • G06T 2207/30248: Indexing scheme for image analysis; vehicle exterior or interior
    • G06T 2207/30268: Indexing scheme for image analysis; vehicle interior

Abstract

The application discloses a detection method. When a user intends to drive, an electronic device can acquire physiological information parameters, alcohol intake parameters, blood alcohol concentration parameters, and acquisition time parameters, and predict the user's sobering-up time based on these parameters, thereby preventing drunk driving. The electronic device can also acquire the user's behavior data, physical condition data, and vehicle driving data, and derive a recommended driving duration from these data, thereby preventing fatigue driving. After detecting a to-be-charged scene, the electronic device can further obtain charging station information and vehicle charging information, derive a first charging station option from this information, and display navigation information to the first charging station so that the user can charge conveniently. When the user travels by car, an in-vehicle device can capture an in-car image before a passenger boards and another after the passenger alights, and prompt the user that an item has been left in the vehicle when the two images indicate so.

Description

Detection method and device
Technical Field
The present disclosure relates to the field of sensor technologies, and in particular, to a detection method and apparatus.
Background
Currently, with the development of society, traveling by car is becoming more and more common. Driving can reach a destination conveniently and quickly and better meets people's daily travel needs. However, various driving-related problems, such as drunk driving, fatigue driving, inconvenient charging of electric vehicles, and lost property, have occurred frequently in recent years.
At present, for some driving scenarios, electronic devices cannot accurately or intelligently provide effective reminders to users, resulting in a poor user experience. Reminders and services for driving travel therefore need to be improved.
Disclosure of Invention
The detection method and device provided by the application give travel prompts or services for common driving-related problems (such as drunk driving, fatigue driving, inconvenient charging of electric vehicles, and lost property), improving user experience.
In a first aspect, the present application provides a detection method, including: acquiring a physiological information parameter, a blood alcohol concentration parameter, and an acquisition time parameter indicating when the blood alcohol concentration parameter was acquired; determining a predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the acquisition time parameter, where the predicted sobering-up time indicates the point in time at which the user's blood alcohol concentration falls below a threshold blood alcohol concentration; and displaying the predicted sobering-up time.
In this way, after drinking, the user can determine through the detection method provided by the application how long it will take to sober up, which helps prevent drunk driving and the resulting loss of life and property for the user and others.
In one possible implementation, the physiological information parameters include one or more of weight, height, age, gender, sleep time, and sleep quality.
In this way, the sobering-up time can be predicted more accurately based on the user's physiological information.
In one possible implementation, determining the predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the acquisition time parameter specifically includes: determining the predicted sobering-up time through an alcohol prediction model based on the physiological information parameter, the blood alcohol concentration parameter, and the acquisition time parameter.
In one possible implementation, before acquiring the physiological information parameter, the blood alcohol concentration parameter, and the acquisition time parameter of the blood alcohol concentration parameter, the method further includes: receiving a first input. Acquiring the physiological information parameter, the blood alcohol concentration parameter, and the acquisition time parameter then specifically includes: acquiring these parameters in response to the first input.
In one possible implementation, before determining the predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the acquisition time parameter, the method further includes: receiving a second input. Determining the predicted sobering-up time then specifically includes: determining it, in response to the second input, based on the physiological information parameter, the blood alcohol concentration parameter, and the acquisition time parameter.
In one possible implementation, determining the predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the acquisition time parameter specifically includes: acquiring an alcohol intake parameter; and determining the predicted sobering-up time based on the physiological information parameter, the alcohol intake parameter, the blood alcohol concentration parameter, and the acquisition time parameter.
In this way, a more accurate predicted sobering-up time can be obtained based on the amount of alcohol the user has consumed.
In one possible implementation, acquiring the alcohol intake parameter specifically includes: acquiring, through a camera, an image of the container holding the consumed drink; and determining the alcohol intake parameter based on the container image.
In one possible implementation, the alcohol intake parameter includes an alcohol content parameter indicating the strength of the drink consumed by the user and an alcohol volume parameter indicating the volume of the drink consumed by the user.
In one possible implementation, determining the predicted sobering-up time based on the physiological information parameter, the alcohol intake parameter, the blood alcohol concentration parameter, and the acquisition time parameter specifically includes: obtaining a predicted alcohol absorption rate and a predicted alcohol metabolism rate through the alcohol prediction model based on the alcohol intake parameter and the physiological information parameter; obtaining a correspondence between blood alcohol concentration and time based on the physiological information parameter, the alcohol intake parameter, the predicted alcohol absorption rate, and the predicted alcohol metabolism rate; and determining the predicted sobering-up time based on the blood alcohol concentration parameter, the acquisition time parameter, and the correspondence between blood alcohol concentration and time.
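To make the last step above concrete, the following is a minimal sketch, in Python, of how the predicted sobering-up time could be read off a precomputed blood alcohol concentration versus time curve, given the measured blood alcohol concentration parameter and its acquisition time. The curve representation, the threshold value, and all names are illustrative assumptions, not the patent's actual implementation.

```python
from datetime import datetime, timedelta

# Assumed threshold (jurisdiction-dependent), in mg/100 ml.
SOBER_THRESHOLD_MG_PER_100ML = 20.0

def predicted_sober_up_time(curve: list[tuple[float, float]],
                            measured_bac: float,
                            acquisition_time: datetime) -> datetime:
    """curve: (hours_since_drinking, bac) samples ordered in time."""
    # Find the point on the declining part of the curve closest to the
    # measured BAC; this aligns curve time with wall-clock acquisition time.
    peak_idx = max(range(len(curve)), key=lambda i: curve[i][1])
    decline = curve[peak_idx:]
    t_measured = min(decline, key=lambda p: abs(p[1] - measured_bac))[0]
    # Walk forward until the predicted concentration drops below the threshold.
    for t, bac in decline:
        if t >= t_measured and bac < SOBER_THRESHOLD_MG_PER_100ML:
            return acquisition_time + timedelta(hours=t - t_measured)
    return acquisition_time + timedelta(hours=decline[-1][0] - t_measured)
```

Anchoring the curve at the measured point rather than at the drinking start time is what lets the acquisition time parameter correct for uncertainty in when drinking actually began.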
In a second aspect, the present application provides another detection method, including: acquiring behavior data of a user; determining the user's pre-driving fatigue degree based on the behavior data; determining a first recommended driving duration based on the pre-driving fatigue degree; and displaying the first recommended driving duration.
In this way, the user can obtain a recommended driving duration before driving and determine how long it is safe to drive. It will be appreciated that a recommended driving duration of zero may indicate that the user is not fit to drive. Fatigue driving, and the resulting harm to the life and property of the user or others, can thus be avoided.
In one possible implementation, acquiring the behavior data of the user specifically includes: acquiring the user's travel time; and acquiring the user's behavior data at a first moment before the travel time, where the first moment precedes the travel time by a preset duration.
In one possible implementation, acquiring the user's travel time specifically includes: acquiring the user's schedule information, which includes one or more of the user's bill information, meeting information, and calendar information; and acquiring the user's travel time based on the schedule information.
In one possible implementation, the method further includes: acquiring physical state data of the user while the vehicle is running; determining the user's in-driving fatigue degree based on the physical state data; determining the user's final fatigue degree based on the pre-driving fatigue degree and the in-driving fatigue degree; determining a second recommended driving duration based on the final fatigue degree; and displaying the second recommended driving duration.
In this way, the user obtains a recommended driving duration while driving, avoiding fatigue driving.
In one possible implementation, determining the user's in-driving fatigue degree based on the physical state data specifically includes: determining the in-driving fatigue degree through a second fatigue model based on the physical state data, where the second fatigue model is trained on the user's historical physical state data.
In one possible implementation, determining the user's in-driving fatigue degree based on the physical state data specifically includes: acquiring vehicle driving data while the vehicle is running; and determining the in-driving fatigue degree based on the physical state data and the vehicle driving data.
In one possible implementation, determining the in-driving fatigue degree based on the physical state data and the vehicle driving data specifically includes: determining a second fatigue model based on the physical state data; and determining the in-driving fatigue degree through the second fatigue model based on the vehicle driving data and the physical state data.
In one possible implementation, acquiring the behavior data of the user specifically includes: acquiring user data including one or more of exercise time, exercise intensity, and sleep time; and determining the user's behavior data based on the user data.
In one possible implementation, determining the user's pre-driving fatigue degree based on the behavior data specifically includes: determining the pre-driving fatigue degree through a first fatigue model based on the behavior data, where the first fatigue model is trained on the user's historical behavior data.
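As an illustration of how a fatigue degree might be turned into a recommended driving duration, the following sketch assumes the fatigue model outputs a score in [0, 1]; the mapping, the cutoff, and the maximum duration are assumptions, since the patent does not specify the models' internals.

```python
# Illustrative sketch only: the "first fatigue model" is stood in for by any
# regressor trained on the user's historical behavior data; the recommended
# driving duration is a simple monotone mapping of the predicted fatigue degree.

def recommended_driving_minutes(fatigue_degree: float,
                                max_minutes: int = 240) -> int:
    """Map a fatigue degree in [0, 1] to a recommended driving duration.

    A degree near 1 means the user is too fatigued to drive (returns 0),
    which the display layer can render as "driving not recommended".
    """
    fatigue_degree = min(max(fatigue_degree, 0.0), 1.0)
    if fatigue_degree >= 0.9:  # assumed cutoff for "unfit to drive"
        return 0
    return int(max_minutes * (1.0 - fatigue_degree))
```

The same mapping can serve both the first and second recommended driving durations, fed with the pre-driving or the final fatigue degree respectively.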
In a third aspect, the present application provides another detection method applied to a first communication system, where the first communication system includes a first electronic device and a second electronic device. The method includes: the first electronic device detects a passenger's boarding operation and acquires an in-car image before the passenger boards; the first electronic device establishes a communication connection with the second electronic device; the first electronic device detects the passenger's alighting operation and acquires an in-car image after the passenger alights; when the first electronic device determines, based on the in-car image before boarding and the in-car image after alighting, that an item belonging to the passenger has been left in the vehicle, it broadcasts first omission prompt information, which prompts that the passenger's item has been left in the vehicle; the first electronic device sends item omission indication information to the second electronic device over the communication connection; and the second electronic device displays, based on the item omission indication information, second omission prompt information prompting the passenger that an item has been left behind.
In this way, when a passenger's item is left in the vehicle, both the driver and the passenger can be prompted, preventing the item from being lost and sparing the passenger and the driver the time otherwise spent retrieving it.
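One plausible (assumed) realization of the image comparison step is simple frame differencing between the in-car image captured before boarding and the one captured after alighting; a production system would more likely use a trained object detector. The thresholds below are illustrative.

```python
import numpy as np

# Hedged sketch: flag a left-behind item when a persistent blob of changed
# pixels appears between the before/after images of the same seat area.

def item_left_behind(before: np.ndarray, after: np.ndarray,
                     pixel_thresh: int = 30, area_ratio: float = 0.005) -> bool:
    """before/after: grayscale images of the same seat area, same size."""
    diff = np.abs(after.astype(np.int16) - before.astype(np.int16))
    changed = (diff > pixel_thresh).mean()  # fraction of pixels that changed
    return changed > area_ratio             # a persistent blob suggests an item
```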
In one possible implementation, the second electronic device is the electronic device with the strongest signal among all the electronic devices detected by the first electronic device.
In one possible implementation, after the first electronic device and the second electronic device establish a communication connection, the method further includes: the first electronic device sends its motion information to the second electronic device over the communication connection; when the second electronic device determines, based on the motion information of the first electronic device, that the motion state of the first electronic device is the same as its own, it sends a confirmation-success signaling to the first electronic device; and the first electronic device receives the confirmation-success signaling and maintains the communication connection with the second electronic device.
This ensures that the first and second electronic devices are in the same vehicle and that the passenger can receive the second omission prompt information.
In one possible implementation, the second electronic device determining, based on the motion information of the first electronic device, that the motion states of the two devices are the same specifically includes: when the second electronic device determines N consecutive times that the motion information of the first electronic device is the same as its own, it determines that the motion states are the same, where N is a positive integer.
In one possible implementation, the second electronic device determining, based on the motion information of the first electronic device, that the motion states of the two devices are the same specifically includes: when, out of M comparisons of the two devices' motion information, the second electronic device finds the motion information the same at least N times, it determines that the motion states are the same, where N is less than or equal to M, and both M and N are positive integers.
In one possible implementation, the motion information of the first electronic device and that of the second electronic device are considered the same when the difference between them is less than a motion deviation threshold.
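The following sketch illustrates the matching rules above: over a sliding window of the last M comparisons, the motion states are declared the same once at least N comparisons fall within the motion deviation threshold; setting N equal to M recovers the stricter N-consecutive-matches variant. The scalar motion sample (e.g., acceleration magnitude) and the default values are assumptions.

```python
from collections import deque

# Hedged sketch of the N-of-M matching rule described above.

class MotionMatcher:
    def __init__(self, n: int = 5, m: int = 8, deviation_threshold: float = 0.2):
        assert 0 < n <= m
        self.n = n
        self.deviation_threshold = deviation_threshold
        self.results = deque(maxlen=m)  # sliding window of the last M outcomes

    def update(self, sample_first: float, sample_second: float) -> bool:
        """Feed one pair of motion samples; return True once the states match."""
        self.results.append(
            abs(sample_first - sample_second) < self.deviation_threshold)
        return sum(self.results) >= self.n
```

In the symmetric implementation below, the first electronic device would run the same matcher on the second device's motion information.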
In one possible implementation, after the first electronic device and the second electronic device establish a communication connection, the method further includes: the second electronic device sends its motion information to the first electronic device over the communication connection; when the first electronic device determines, based on the motion information of the second electronic device, that the motion states of the two devices are the same, it sends a confirmation-success signaling to the second electronic device; and the second electronic device receives the confirmation-success signaling and maintains the communication connection with the first electronic device.
In one possible implementation, the method further includes: when the second electronic device determines, based on the motion information of the first electronic device, that the motion states of the two devices are different, it disconnects the communication connection with the first electronic device.
In one possible implementation, the second electronic device disconnecting the communication connection with the first electronic device specifically includes: the second electronic device sends a confirmation-failure signaling to the first electronic device; and the first electronic device receives the confirmation-failure signaling and disconnects the communication connection with the second electronic device.
In one possible implementation, after the second electronic device disconnects the communication with the first electronic device, the method further includes: the second electronic device broadcasts a communication connection request.
In one possible implementation, establishing the communication connection between the first electronic device and the second electronic device specifically includes: the second electronic device broadcasts a communication connection request; the first electronic device receives the communication connection request and sends a communication connection response to the second electronic device; and the second electronic device receives the communication connection response and establishes a communication connection with the first electronic device.
In one possible implementation, establishing the communication connection between the first electronic device and the second electronic device specifically includes: after detecting that a passenger is seated in the vehicle, the first electronic device receives the communication connection request of the second electronic device, sends a communication connection response to the second electronic device, and establishes a communication connection with the second electronic device.
In a fourth aspect, the present application provides another detection method applied to a second communication system, where the second communication system includes a first electronic device, a server, and a charging device. The method includes: the first electronic device receives charging information of one or more charging stations sent by the server; the first electronic device displays one or more charging station options, including a first charging station option, based on the charging information; after receiving an input for the first charging station option, the first electronic device displays first navigation information indicating a route from its current position to the first charging station corresponding to the first charging station option; after detecting that the first electronic device has arrived at the first charging station, the server acquires parking position information of the first electronic device within the station and sends it to the charging device; and after the charging device reaches the position indicated by the parking position information, it charges the first electronic device.
In this way, the user can quickly obtain the charging service provided by the server, and the charging device can find the first electronic device on its own and charge it, reducing the charging operations required of the user.
In one possible implementation, the first electronic device displays one or more charging station options based on charging information of one or more charging stations, specifically including: the first electronic device determines one or more charging station options based on the charging information of the one or more charging stations and the charging information of the first electronic device.
In one possible implementation, each of the one or more charging station options includes a charging price, a charging time, and an arrival distance, where the charging price indicates the fee required to charge the first electronic device, the charging time indicates the time required to charge the first electronic device, and the arrival distance indicates the distance between the first electronic device and the charging station corresponding to the option.
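The patent does not define how the first charging station option is ranked among the displayed options; one simple (assumed) approach combines the three fields above into a weighted score, as sketched below. Field names and weights are illustrative; a real ranking would likely normalize each field first.

```python
from dataclasses import dataclass

# Illustrative ranking sketch, not the patent's actual selection logic.

@dataclass
class ChargingStation:
    name: str
    price_per_kwh: float   # charging price
    charge_minutes: float  # time needed to charge the vehicle
    distance_km: float     # distance from the vehicle to the station

def first_charging_station(stations: list[ChargingStation],
                           w_price: float = 0.3, w_time: float = 0.3,
                           w_dist: float = 0.4) -> ChargingStation:
    def score(s: ChargingStation) -> float:
        return (w_price * s.price_per_kwh
                + w_time * s.charge_minutes
                + w_dist * s.distance_km)
    return min(stations, key=score)  # lowest combined cost shown first
```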
In one possible implementation, the first electronic device receiving the charging information of one or more charging stations from the server specifically includes: when the first electronic device detects a to-be-charged scene, it receives the charging information of one or more charging stations from the server, where the to-be-charged scene includes a low-power scene and a parking lot scene; the low-power scene is one in which the battery level of the first electronic device is below a preset threshold, and the parking lot scene is one in which the distance between the first electronic device and a nearby parking place is below a specified distance threshold.
In one possible implementation, the first electronic device receiving the charging information of one or more charging stations from the server specifically includes: the first electronic device obtains destination information including a destination address and a route to the destination; and after determining that its battery level is lower than the energy it would consume driving to the destination address along the route, the first electronic device receives the charging information of one or more charging stations from the server.
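A compact sketch of the trigger conditions described in the last two implementations: the low-power scene, the parking lot scene, and the destination-based check. All thresholds and the consumption model are assumptions.

```python
# Hedged sketch of the to-be-charged scene detection described above.

def needs_charging(battery_pct: float,
                   distance_to_parking_m: float,
                   route_remaining_km: float | None = None,
                   consumption_pct_per_km: float = 0.5,
                   battery_threshold: float = 20.0,
                   parking_threshold_m: float = 200.0) -> bool:
    # Low-power scene: battery below the preset threshold.
    if battery_pct < battery_threshold:
        return True
    # Parking lot scene: a nearby parking place within the specified distance.
    if distance_to_parking_m < parking_threshold_m:
        return True
    # Destination-based check: remaining charge cannot cover the remaining route.
    if route_remaining_km is not None:
        return battery_pct < route_remaining_km * consumption_pct_per_km
    return False
```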
In one possible implementation, the server sending the parking position information to the charging device specifically includes: the server sends a start-charging request to the first electronic device; the first electronic device receives the request and displays a start-charging control; after receiving a fourth input for the start-charging control, the first electronic device sends a start-charging response to the server in response to the fourth input; and the server receives the start-charging response and sends the parking position information to the charging device.
In one possible implementation, the communication system further includes a second electronic device, and after the charging device reaches the position indicated by the parking position information and charges the first electronic device, the method further includes: the charging device sends vehicle charging information, including the battery level of the first electronic device, to the second electronic device; and after receiving the vehicle charging information, the second electronic device displays a vehicle charging prompt informing the user of the battery level of the first electronic device.
In a fifth aspect, the present application provides a communication apparatus including one or more processors and one or more memories. The one or more memories are coupled to the one or more processors and store computer program code comprising computer instructions that, when executed by the one or more processors, cause the communication apparatus to perform the detection method in any of the possible implementations of the above aspects.
In a sixth aspect, embodiments of the present application provide a computer storage medium comprising computer instructions that, when run on an electronic device, cause the electronic device to perform the detection method in any of the possible implementations of the above aspects.
In a seventh aspect, embodiments of the present application provide a computer program product that, when run on a computer, causes the computer to perform the detection method in any of the possible implementations of the above aspects.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a communication system according to an embodiment of the present application;
Fig. 3 is a schematic block diagram of an electronic device 100 according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a blood alcohol concentration versus time curve according to an embodiment of the present application;
Figs. 5A-5H are a set of interface diagrams according to an embodiment of the present application;
Figs. 6A-6B are another set of interface diagrams according to an embodiment of the present application;
Fig. 7 is a schematic flow chart of a detection method according to an embodiment of the present application;
Fig. 8 is a schematic diagram of another communication system according to an embodiment of the present application;
Fig. 9 is a schematic block diagram of another electronic device 100 according to an embodiment of the present application;
Fig. 10 is a flow chart of another detection method according to an embodiment of the present application;
Fig. 11 is a schematic view of an application scenario according to an embodiment of the present application;
Fig. 12 is a schematic view of another application scenario according to an embodiment of the present application;
Fig. 13 is a schematic structural diagram of another electronic device 100 according to an embodiment of the present application;
Fig. 14 is a schematic structural diagram of a vehicle-mounted device 900 according to an embodiment of the present application;
Figs. 15A-15E are another set of interface diagrams according to an embodiment of the present application;
Fig. 16 is a flow chart of another detection method according to an embodiment of the present application;
Fig. 17A is a schematic view of an in-car image captured before a passenger boards according to an embodiment of the present application;
Fig. 17B is a schematic view of an in-car image captured after a passenger alights according to an embodiment of the present application;
Figs. 18A-18E are another set of interface diagrams according to an embodiment of the present application;
Fig. 19 is a flow chart of another detection method according to an embodiment of the present application;
Fig. 20 is a flow chart of another detection method according to an embodiment of the present application;
Fig. 21 is a flow chart of another detection method according to an embodiment of the present application;
Fig. 22 is a flow chart of another detection method according to an embodiment of the present application;
Fig. 23 is a flow chart of another detection method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and thoroughly below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" merely describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: only A exists, both A and B exist, or only B exists.
The terms "first", "second", and the like are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. A feature qualified by "first" or "second" may thus explicitly or implicitly include one or more such features. In the description of the embodiments of the present application, unless otherwise indicated, "a plurality of" means two or more.
The electronic device provided by the embodiments of the present application is introduced below.
The electronic device 100 may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular telephone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device; the specific type of the electronic device is not particularly limited in the embodiments of the present application.
Fig. 1 shows a schematic configuration of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present invention is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate a low-frequency baseband signal to be transmitted into a medium- or high-frequency signal. The demodulator is used to demodulate a received electromagnetic wave signal into a low-frequency baseband signal, which it then transmits to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor, which outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also perform algorithm optimization on noise and brightness of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs, so that it can play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect external non-volatile memory to enable expansion of the memory capabilities of the electronic device 100. The external nonvolatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external nonvolatile memory.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals.
The pressure sensor 180A is used to sense a pressure signal and may convert it into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The gyro sensor 180B may be used to determine the motion gesture of the electronic device 100. The air pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a Hall sensor; the opening and closing of a flip cover can be detected through the magnetic sensor 180D. The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The distance sensor 180F is used to measure distance. The proximity light sensor 180G may be used in holster mode or pocket mode to automatically unlock and lock the screen. The ambient light sensor 180L is used to sense the ambient light level. The fingerprint sensor 180H is used to collect a fingerprint. The temperature sensor 180J is used to detect temperature. The touch sensor 180K, also referred to as a "touch panel", may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form a touch screen. The touch sensor 180K is used to detect a touch operation acting on or near it and may pass the detected touch operation to the application processor to determine the touch event type; visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from the display 194. The bone conduction sensor 180M may acquire a vibration signal. The keys 190 include a power key, volume keys, and the like. The motor 191 may generate a vibration cue. The indicator 192 may be an indicator light and may be used to indicate a charging state, a change in charge, a message, a missed call, a notification, and the like. The SIM card interface 195 is used to connect a SIM card.
At present, automobiles have become an indispensable part of people's travel, and problems can arise in many automobile scenarios. In some possible application scenarios, a user may remain drunk for a long period of time after drinking. If the user works or drives while drunk, great harm may be caused to the life and property of the user or others.
Accordingly, embodiments of the present application provide a detection method. The electronic device 100 may obtain alcohol intake parameters, physiological information parameters, a blood alcohol concentration parameter, and an acquisition time parameter. The alcohol intake parameters include an alcohol content parameter and an alcohol volume parameter. The physiological information parameters are body data that affect the user's alcohol absorption rate and alcohol metabolism rate, such as sleep time, sleep quality, and body weight. The blood alcohol concentration parameter indicates the user's blood alcohol concentration, and the acquisition time parameter indicates the point in time at which the electronic device 100 acquired the blood alcohol concentration parameter. The electronic device 100 may input the alcohol intake parameters and the physiological information parameters into an alcohol prediction model to obtain a predicted metabolic rate and a predicted absorption rate, which are parameters affecting the user's blood alcohol concentration. Based on the physiological information parameters, the alcohol intake parameters, the predicted alcohol metabolism rate, and the predicted absorption rate, the electronic device 100 may derive the user's blood alcohol concentration-time (C-T) curve, and, based on the blood alcohol concentration parameter, the acquisition time parameter, and this curve, derive a predicted sobering-up time. The electronic device 100 can thus obtain the user's sobering-up time from the detected parameters and prompt the user as to when they will be sober, preventing activities such as driving while drunk and protecting the life and property of the user and others.
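The patent does not disclose the functional form of the C-T curve. As a hedged illustration, the following sketch uses one classical pharmacokinetic shape, first-order absorption combined with zero-order (Widmark-style) elimination, driven by the predicted absorption and metabolism rates; the model structure and constants are assumptions, not the actual alcohol prediction model.

```python
import numpy as np

# Illustrative C-T curve generator. Units: time in hours, BAC in mg/100 ml.

def bac_curve(alcohol_grams: float, body_weight_kg: float,
              absorption_rate_per_h: float, metabolism_mg_per_100ml_h: float,
              widmark_r: float = 0.68, hours: float = 24.0, step_h: float = 0.1):
    """Return (t, bac) samples for a single drinking episode."""
    times = np.arange(0.0, hours, step_h)
    # Peak BAC if all alcohol were absorbed at once (Widmark formula,
    # g/kg converted to roughly mg/100 ml).
    peak = alcohol_grams / (widmark_r * body_weight_kg) * 100.0
    absorbed = peak * (1.0 - np.exp(-absorption_rate_per_h * times))
    eliminated = metabolism_mg_per_100ml_h * times
    bac = np.maximum(absorbed - eliminated, 0.0)
    return list(zip(times.tolist(), bac.tolist()))
```

The resulting curve rises while absorption dominates and then declines roughly linearly, matching the qualitative shape shown in fig. 4; the predicted sobering-up time is where the declining branch crosses the threshold.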
In one possible implementation, the electronic device 100 may receive a drinking start time entered by the user and obtain a predicted sobering-up time based on the physiological information parameters, the alcohol intake parameters, and the drinking start time. For example, the electronic device 100 may obtain the correspondence between blood alcohol concentration and time through the alcohol prediction model based on the physiological information parameters and the alcohol intake parameters, and obtain the predicted sobering-up time from this correspondence and the drinking start time. For another example, the electronic device 100 may use the physiological information parameters, the alcohol intake parameters, and the drinking start time directly as inputs to the alcohol prediction model to obtain the predicted sobering-up time. In this way, the electronic device 100 can obtain a predicted sobering-up time even without an alcohol sensor.
In one possible implementation, the electronic device 100 may also derive the volume of wine the user can ingest based on a desired sobering-up time input by the user. Specifically, after receiving the desired sobering-up time, the electronic device 100 may acquire the alcohol content parameter and the physiological information parameters, input them into the alcohol prediction model, and obtain the predicted metabolic rate and the predicted absorption rate. The electronic device 100 then obtains an ingestible wine volume based on the predicted metabolic rate, the predicted absorption rate, the desired sobering-up time, and the alcohol content parameter. In this way, the electronic device 100 may prompt the user to drink no more than the ingestible wine volume, so that the user can sober up by the desired sobering-up time without affecting the user's subsequent travel.
In one possible implementation, the electronic device 100 may obtain the predicted sobering-up time based on one or more of the above intake wine parameters, physiological information parameters, blood alcohol concentration parameter, and acquisition time parameter. For example, the electronic device 100 may determine the predicted sobering-up time based on the user's physiological information parameters, the blood alcohol concentration parameter, and the acquisition time parameter of that blood alcohol concentration parameter. The electronic device 100 may likewise derive the user's ingestible wine volume based on the desired sobering-up time and one or more of the alcohol content parameter and the physiological information parameters. Thus, the electronic device 100 can obtain the predicted sobering-up time or the ingestible wine volume even when only some of the above parameters are available.
A communication system 10 provided in an embodiment of the present application is described next.
As shown in fig. 2, the communication system 10 may include an electronic device 100 and an electronic device 200. The electronic device 100 may establish a wireless connection with the electronic device 200 via a wireless communication means (e.g., wireless fidelity (Wi-Fi), Bluetooth, etc.). The electronic device 100 may receive data transmitted by the electronic device 200, or may send an operation instruction input by a user to the electronic device 200; after receiving the operation instruction, the electronic device 200 may perform the operation indicated by it.
In the communication system 10, the electronic device 100 may be used to store the data (e.g., physiological information parameters, wine volume parameter, alcohol content parameter, blood alcohol concentration parameter, acquisition time parameter, predicted alcohol metabolism rate, predicted alcohol absorption rate, C-T curve, predicted sobering-up time, etc.) required to train the alcohol prediction model. The electronic device 100 may train the alcohol prediction model based on these data, resulting in an alcohol prediction model with an accuracy greater than a first threshold (e.g., 90%).
The electronic device 100 may also derive a prediction result based on the alcohol prediction model and data related to the user's current drinking (i.e., the physiological information parameters and intake wine parameters). The prediction result may include the predicted metabolic rate, the predicted absorption rate, and the blood alcohol concentration-time curve. The electronic device 100 may further obtain a correction result based on the prediction result, the blood alcohol concentration parameter, and the acquisition time parameter. The electronic device 100 may obtain the blood alcohol concentration parameter and the acquisition time parameter through the electronic device 200. The blood alcohol concentration parameter may be used to indicate the concentration of ethanol in the user's blood, in units of mg/100 ml. The acquisition time parameter may be used to indicate the time point at which the electronic device 200 acquired the blood alcohol concentration parameter.
The electronic device 200 may be any electronic device including an alcohol sensor. By way of example, the electronic device 200 may be a wearable electronic device (e.g., smart glasses, a smart watch, a Bluetooth headset, etc.) or an electronic device integrated with an alcohol sensor (e.g., the electronic device 100 carrying an alcohol sensor, an accessory carrying an alcohol sensor, etc.). The alcohol sensor can detect the gas exhaled by the user to obtain the blood alcohol concentration of the user. After detecting the blood alcohol concentration of the user, the electronic device 200 may send it to the electronic device 100.
When the blood alcohol concentration reaches 20 mg/100 ml but is lower than 80 mg/100 ml, it constitutes drink driving; when the blood alcohol concentration reaches 80 mg/100 ml, it constitutes drunk driving. Therefore, in the embodiments of the present application, a blood alcohol concentration below 20 mg/100 ml is used as the criterion that the user has sobered up. It will be appreciated that this criterion is only an example, and the sobering-up criterion may be any blood alcohol concentration of 20 mg/100 ml or less, which is not limited in this application. It will also be appreciated that in some possible application scenarios, the electronic device 100 may itself include an alcohol sensor, in which case the electronic device 100 may directly obtain the blood alcohol concentration of the user.
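For illustration only, the thresholds above can be expressed as a small helper function; the function and label names below are our own and do not appear in this application:

```python
def classify_bac(bac_mg_per_100ml: float) -> str:
    """Classify a blood alcohol concentration against the thresholds above:
    >= 80 mg/100 ml is drunk driving, >= 20 mg/100 ml is drink driving, and
    below 20 mg/100 ml is treated as sobered-up in these embodiments."""
    if bac_mg_per_100ml >= 80:
        return "drunk driving"
    if bac_mg_per_100ml >= 20:
        return "drink driving"
    return "sobered up"
```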
Optionally, the electronic device 200 further includes a body movement recorder. The electronic device 200 may detect short-term memory-type physiological data of the user, such as the user's sleep quality and sleep duration, through the body movement recorder. Optionally, the electronic device 200 further comprises an acceleration sensor, which may be used to detect short-term memory-type physiological data of the user, e.g., the user's movement. The electronic device 200 may send this short-term memory-type physiological data to the electronic device 100.
The electronic device 100 may be configured to obtain the predicted sobering-up time based on the intake wine parameters, the physiological information parameters, the blood alcohol concentration parameter, and the acquisition time parameter. The electronic device 100 may also be configured to derive the ingestible wine volume based on the alcohol content parameter, the physiological information parameters, and the desired sobering-up time.
In one possible implementation, the communication system 10 may also include a server 300 (not shown in the figures). The server 300 may be a cloud server. A communication connection is established between the server 300 and the electronic device 100. The server 300 may be configured to store the above-described parameters (e.g., the physiological information parameters), to perform model training based on these parameters to obtain an alcohol prediction model with an accuracy greater than the first threshold (e.g., 90%), and to obtain a prediction result (e.g., an ingestible wine volume, a predicted sobering-up time, etc.) based on the alcohol prediction model and the parameters related to the user's current drinking.
In one possible implementation, the server 300 may store intake wine parameters, physiological information parameters, blood alcohol concentration parameters, acquisition time parameters, predicted absorption rates, predicted metabolic rates, C-T curves, predicted sobering-up times, and the like for a plurality of users. The server 300 may train the alcohol prediction model based on these data.
In one possible implementation, the electronic device 100 may send the predicted sobering-up time to the electronic device 200, and the electronic device 200 may display it. Furthermore, the steps performed by the electronic device 100 may instead be performed by the electronic device 200, which is not limited in this application.
A schematic block diagram of the electronic device 100 according to an embodiment of the present application is described below.
As shown in fig. 3, the schematic block diagram provided in the embodiments of the present application includes, but is not limited to, a perception module 310, a storage module 320, a training module 330, a prediction module 340, a correction module 360, and a display module 350. The operations performed by these modules can be divided into a model training process and a model prediction process. The model training process is shown by dashed arrows in fig. 3: the electronic device 100 may train the alcohol prediction model with the historical parameters at preset intervals (for example, one week), or may train it after each prediction result is obtained, so as to obtain an alcohol prediction model whose accuracy reaches the first threshold. The model prediction process is shown by solid arrows in fig. 3: the electronic device 100 may obtain a predicted sobering-up time or an ingestible wine volume based on the trained alcohol prediction model.
The perception module 310 may be used to obtain the parameters required for model training/model prediction. The perception module 310 may obtain these parameters through a camera, related sensors, etc. of the electronic device 100; through another electronic device (e.g., the electronic device 200) that has established a communication connection with the electronic device 100; or from user input.
The perception module 310 may be used to obtain the intake wine parameters. For example, the perception module 310 may obtain the alcohol content (i.e., the alcohol content parameter) and the wine volume (i.e., the wine volume parameter) ingested by the user via a camera. Specifically, the perception module 310 may capture an image of the container holding the ingested wine through the camera and obtain the intake wine parameters from the container image through an image recognition algorithm. As another example, the perception module 310 may obtain the intake wine parameters input by the user.
The perception module 310 may also be used to obtain the physiological information parameters (including long-term memory-type physiological parameters and short-term memory-type physiological parameters). For example, the perception module 310 may detect the user's short-term memory-type physiological parameters (e.g., sleep quality, sleep time, etc.) via the body movement recorder. For another example, the perception module 310 may detect the user's short-term memory-type physiological parameters (e.g., movement, etc.) via an acceleration sensor, an inertial measurement unit, or the like. For another example, the perception module 310 may obtain some of the physiological information parameters from user input, including some long-term memory-type parameters (e.g., gender) and some short-term memory-type parameters (e.g., weight, height, age).
The perception module 310 may also be configured to obtain the user's blood alcohol concentration parameter and the time at which that parameter was obtained (also referred to as the acquisition time parameter). For example, the perception module 310 may obtain the user's blood alcohol concentration parameter via an alcohol sensor. The perception module 310 may send the blood alcohol concentration parameter and the acquisition time parameter to the correction module 360 for correcting the prediction result.
Optionally, the perception module 310 may obtain some of the user's short-term memory-type physiological parameters (e.g., body weight, body mass index, etc.) via a body fat scale or similar device connected to the electronic device 100. Alternatively, data normally collected by a sensor may be input manually by the user.
The perception module 310 may also send all acquired parameters to the storage module 320 for model training/model prediction.
Optionally, in the model prediction process, the perception module 310 may directly send the acquired parameters to the prediction module 340, and the prediction module 340 may predict the sobering-up time or the ingestible wine volume based on the parameters sent by the perception module 310.
The storage module 320 may be used to store the parameters used for model training/model prediction. The storage module 320 may be configured to receive the parameters obtained by the perception module 310 and store them in a memory (e.g., the internal memory 121). The storage module 320 may also receive and store the prediction results sent by the prediction module 340, as well as the corrected prediction results (also referred to as correction results) sent by the correction module 360. In the model training process, the storage module 320 may send all of the stored parameters (which may be referred to as historical parameters) to the training module 330. The historical parameters may include, but are not limited to, the stored physiological information parameters, intake wine parameters, blood alcohol concentration parameters, acquisition time parameters, prediction results, correction results, and the like. In the model prediction process, the storage module 320 may send the parameters provided by the perception module 310 (e.g., the most recently acquired physiological information parameters, alcohol content parameter, wine volume parameter, etc.) to the prediction module 340 for predicting the user's sobering-up time.
The training module 330 may train the model using a neural network algorithm (e.g., a convolutional neural network algorithm, a recurrent neural network algorithm, etc.), taking one part of the historical parameters sent by the storage module 320 (e.g., the physiological information parameters, intake wine parameters, and prediction results) as input values of the model and another part (e.g., the blood alcohol concentration parameters, acquisition time parameters, correction results, etc.) as output values of the model, to obtain a trained alcohol prediction model. That is, the training module 330 may train an alcohol prediction model with an accuracy greater than the first threshold based on the user's historical parameters. The training module 330 may run in the processor 110 of the electronic device 100. Here, the processor 110 may also be an artificial intelligence (AI) chip.
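As a hedged sketch of how such a training module might look, the following fits a small fully connected network that maps a feature vector of physiological and intake parameters to the two target rates. The feature layout, network size, units, and the synthetic batch are assumptions for demonstration; the application does not fix a concrete architecture:

```python
import torch
import torch.nn as nn

# Assumed feature layout: [weight_kg, height_cm, sleep_hours, sleep_quality, abv]
# Assumed targets: [v_m (metabolic rate), k_a (absorption rate)]
model = nn.Sequential(
    nn.Linear(5, 16), nn.ReLU(),
    nn.Linear(16, 16), nn.ReLU(),
    nn.Linear(16, 2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(features: torch.Tensor, targets: torch.Tensor) -> float:
    """One supervised step over a batch of historical parameters."""
    optimizer.zero_grad()
    loss = loss_fn(model(features), targets)
    loss.backward()
    optimizer.step()
    return loss.item()

# One synthetic historical record: inputs and the rates observed for them.
x = torch.tensor([[83.0, 178.0, 7.0, 0.8, 0.20]])
y = torch.tensor([[12.0, 3.0]])   # e.g. v_m in mg/100ml/h, k_a in 1/h
print(train_step(x, y))
```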
The initial alcohol prediction model may be an alcohol prediction model trained in advance on the data of other, similar users. A similar user is one for whom one or more of the physiological information parameters and intake wine parameters are the same as or close to those of the current user. For example, suppose the user is male, 178 cm tall, weighs 83 kg, ingests 340 ml of wine with an alcohol content of 20% (i.e., 68 ml of pure alcohol), and slept 7 hours the previous day. A similar user might then be a male user 175 cm-185 cm tall, weighing 80 kg-85 kg, who ingested 50 ml-80 ml of pure alcohol and slept 6-8 hours the previous day. It should be noted that these value ranges of the physiological information parameters and intake wine parameters are only examples; the ranges may be larger or smaller, which is not limited in this application.
In one possible implementation, the electronic device 100 may determine similar users based on more or fewer parameters. For example, the electronic device 100 may also determine similar users based on a sleep quality parameter. For instance, the electronic device 100 may classify sleep quality as excellent, good, poor, or very poor based on the durations of the user's deep sleep, light sleep, and rapid eye movement (REM) sleep. When the sleep quality classes are the same, the electronic device 100 may determine that the users are similar.
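A rule-based matcher in the spirit of the example above might look as follows; the field names and tolerances are illustrative (the application notes the ranges may be wider or narrower):

```python
def is_similar_user(candidate: dict, current: dict) -> bool:
    """Whether `candidate` falls within example ranges around `current`."""
    return (
        candidate["gender"] == current["gender"]
        and abs(candidate["height_cm"] - current["height_cm"]) <= 5
        and abs(candidate["weight_kg"] - current["weight_kg"]) <= 3
        and abs(candidate["alcohol_ml"] - current["alcohol_ml"]) <= 15  # pure alcohol
        and abs(candidate["sleep_hours"] - current["sleep_hours"]) <= 1
        and candidate.get("sleep_quality") == current.get("sleep_quality")
    )
```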
The training module 330 may send the trained alcohol prediction model to the prediction module 340. The alcohol prediction model may be used to derive the user's alcohol metabolism rate (i.e., the predicted metabolic rate) and alcohol absorption rate (i.e., the predicted absorption rate).
The prediction module 340 may be configured to calculate the prediction result. Specifically, the prediction module 340 may input the physiological information parameters and the alcohol content parameter sent by the storage module 320 into the alcohol prediction model to obtain the predicted metabolic rate and the predicted absorption rate. The prediction module 340 may then derive the blood alcohol concentration-time curve based on the predicted metabolic rate, the predicted absorption rate, the wine volume parameter, the alcohol content parameter, and the user weight parameter from the physiological information parameters. The prediction module 340 may run in the processor 110 of the electronic device 100.
The blood alcohol concentration-time curve can be obtained based on the predicted metabolic rate, the predicted absorption rate, the wine volume parameter, the alcohol content parameter, and the user weight parameter. Modeling absorption as a first-order process and metabolism by Michaelis-Menten kinetics, the curve satisfies:

$$\frac{dc}{dt} = k_a\,c_0\,e^{-k_a t} - \frac{v_m\,c}{k_m + c} \tag{1}$$

wherein $c$ is the blood alcohol concentration of the user and $t$ is the time corresponding to that concentration; $k_a$ is the predicted absorption rate, $v_m$ is the predicted metabolic rate, and $k_m$ is the Michaelis constant, a known fixed value. $c_0$ is the maximum blood alcohol concentration, which can be obtained from the following formula:

$$c_0 = \frac{0.8\,B_a\,V_a}{r\,m} \times 100 \tag{2}$$

wherein $B_a$ is the alcohol content of the ingested wine, $V_a$ is the ingested wine volume in ml, $m$ is the weight of the user in kg, the factor 0.8 g/ml approximates the density of ethanol, and $r$ is a fixed coefficient, which can be taken as 0.75; $c_0$ is then expressed in mg/100 ml.
Illustratively, when the user ingests 340 ml of wine with an alcohol content of 20% and weighs 83 kg, the user's maximum blood alcohol concentration is about 87.3 mg/100 ml. The prediction module 340 may substitute the maximum blood alcohol concentration, the predicted metabolic rate, and the predicted absorption rate into equation 1 to obtain a C-T curve, which may be as shown in fig. 4. The time corresponding to the maximum blood alcohol concentration C0 is T0. When the user's blood alcohol concentration falls below the threshold blood alcohol concentration C1, the user is determined to have sobered up. The time corresponding to the threshold blood alcohol concentration is T1, and T1 may be used to indicate the predicted sobering-up time. In the present embodiments the threshold blood alcohol concentration is written as 20 mg/100 ml, but it may take other values below 20 mg/100 ml, and the present application is not limited thereto.
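The computation can be sketched as follows. The helper `max_bac` evaluates equation (2) and yields roughly 87.4 mg/100 ml, consistent with the example; the curve is then obtained by forward-Euler integration of equation (1). The rate values `v_m`, `k_a` and the Michaelis constant `k_m` are assumed for illustration only; in the described system they come from the alcohol prediction model:

```python
import math

def max_bac(abv: float, volume_ml: float, weight_kg: float, r: float = 0.75) -> float:
    """Equation (2): maximum blood alcohol concentration in mg/100 ml
    (0.8 g/ml approximates the density of ethanol)."""
    return 0.8 * abv * volume_ml / (r * weight_kg) * 100.0

def ct_curve(c0, v_m, k_a, k_m, t_end_h=24.0, dt=0.01):
    """Equation (1) by forward-Euler integration:
    dc/dt = k_a*c0*exp(-k_a*t) - v_m*c/(k_m + c)."""
    c, t, points = 0.0, 0.0, []
    while t <= t_end_h:
        points.append((t, c))
        dc = k_a * c0 * math.exp(-k_a * t) - v_m * c / (k_m + c)
        c = max(c + dc * dt, 0.0)
        t += dt
    return points

c0 = max_bac(abv=0.20, volume_ml=340.0, weight_kg=83.0)   # ~87.4 mg/100 ml
curve = ct_curve(c0, v_m=12.0, k_a=3.0, k_m=5.0)          # assumed rates
t0, peak = max(curve, key=lambda p: p[1])                 # T0: time of peak BAC
t1 = next(t for t, c in curve if t > t0 and c < 20.0)     # T1: first time below C1
print(f"peak {peak:.1f} mg/100 ml at T0={t0:.2f} h; sober at T1={t1:.2f} h")
```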
The prediction module 340 may also obtain the predicted sobering-up time based on the blood alcohol concentration-time curve, the blood alcohol concentration parameter, and the acquisition time parameter. Specifically, the prediction module 340 may determine, from the blood alcohol concentration parameter and the C-T curve, the position of the indicated blood alcohol concentration on the C-T curve, that is, the time point on the C-T curve corresponding to the blood alcohol concentration parameter. The prediction module 340 may then obtain the time difference between the time point corresponding to the threshold blood alcohol concentration and the time point corresponding to the measured blood alcohol concentration. The prediction module 340 may add this time difference to the time point indicated by the acquisition time parameter to obtain the predicted sobering-up time.
It can be appreciated that when the electronic device 100 has obtained only one set of blood alcohol concentration and acquisition time parameters, the prediction module 340 may find two time points on the C-T curve corresponding to that blood alcohol concentration. That is, the prediction module 340 may derive two predicted sobering-up times and send both to the display module 350. The display module 350 may display the two predicted sobering-up times as a time range. For example, if the display module 350 receives a predicted sobering-up time A and a predicted sobering-up time B, where A is earlier than B, the display module 350 may display the predicted sobering-up time as "A to B". Alternatively, the display module 350 may display only the latest predicted sobering-up time, e.g., only the predicted sobering-up time B.
It will also be appreciated that when the prediction module 340 has obtained two or more sets of blood alcohol concentration and acquisition time parameters, it may determine the unique position of the blood alcohol concentration parameter on the C-T curve based on the two most recently obtained sets, and thereby obtain a single predicted sobering-up time.
Illustratively, according to the blood alcohol concentration-time curve shown in fig. 4, when the perception module 310 detects that the user's blood alcohol concentration is C0, it determines that the user will sober up after (T1-T0) hours, i.e., 6.6 hours. If T0 corresponds to the local time 19:42, the predicted sobering-up time is 02:18 the next day. For another example, when the perception module 310 detects a user blood alcohol concentration of 78 mg/100 ml, which corresponds to two time points on the curve, the perception module 310 may detect the user's blood alcohol concentration again after a preset time (e.g., 7 minutes). If the second reading is 76 mg/100 ml, the prediction module 340 may determine that the current time point is after T0 and that the user will sober up after 5.1 hours. If the current time point corresponds to the local time 19:42, the predicted sobering-up time is 00:45 the next day.
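A sketch of this disambiguation, building on the `ct_curve` helper above; the sample values are sized to the illustrative curve rather than copied from the example, since the curve shape depends on the assumed rates:

```python
from datetime import datetime, timedelta

def match_times(curve, bac):
    """All curve times at which the C-T curve passes through `bac`
    (typically one on the rising branch, one on the falling branch)."""
    return [t2 for (t1, c1), (t2, c2) in zip(curve, curve[1:])
            if (c1 - bac) * (c2 - bac) <= 0]

def sober_time(curve, samples, threshold=20.0):
    """Disambiguate the curve position using the two latest measurements.

    `samples` are (acquisition_datetime, bac) pairs. A falling BAC means the
    user is past the peak, so the later curve match applies; the sobering-up
    time is the acquisition time plus the remaining curve time."""
    (t_a, bac_a), (t_b, bac_b) = samples[-2:]
    candidates = match_times(curve, bac_b)
    curve_t = max(candidates) if bac_b < bac_a else min(candidates)
    t_sober = next(t for t, c in curve if t > curve_t and c < threshold)
    return t_b + timedelta(hours=t_sober - curve_t)

# Two measurements taken 7 minutes apart, analogous to the example above.
samples = [(datetime(2022, 1, 1, 19, 35), 60.0),
           (datetime(2022, 1, 1, 19, 42), 58.0)]
print(sober_time(curve, samples))   # `curve` from the previous sketch
```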
In one possible implementation, the prediction module 340 may derive the blood alcohol concentration-time curve from the blood alcohol concentration parameters and acquisition time parameters. For example, the training module 330 may use a neural network algorithm (e.g., a convolutional neural network algorithm, a recurrent neural network algorithm, etc.), take the blood alcohol concentration parameters and acquisition time parameters stored by the storage module 320 as input values of the model and the blood alcohol concentration-time curve as its output, and train the model to obtain a trained alcohol prediction model. The prediction module 340 then uses the blood alcohol concentration parameter and acquisition time parameter most recently obtained by the perception module 310 as inputs to the alcohol prediction model to obtain the blood alcohol concentration-time curve.
In another possible implementation, the prediction module 340 may fit a blood alcohol concentration-time curve to the blood alcohol concentration parameters and their acquisition time parameters.
In another possible implementation, the prediction module 340 may derive the blood alcohol concentration-time curve based on one or more of the physiological information parameters, intake wine parameters, blood alcohol concentration parameters, and acquisition time parameters.
In another possible implementation, the prediction module 340 may directly derive the predicted sobering-up time based on one or more of the physiological information parameters, intake wine parameters, blood alcohol concentration parameters, and acquisition time parameters. For example, the training module 330 may train an alcohol prediction model that takes one or more of these parameters as inputs and the predicted sobering-up time as output. The prediction module 340 may then obtain the predicted sobering-up time from the alcohol prediction model based on the required parameters most recently acquired by the perception module 310.
The prediction module 340 may send the predicted sobering-up time to the display module 350 after obtaining the predicted sobering-up time. The display module 350 may display the predicted sobering-up time.
The prediction module 340 may also send the prediction result to the correction module 360. The prediction result may include, but is not limited to, the predicted metabolic rate, the predicted absorption rate, and the blood alcohol concentration-time curve. The prediction module 340 may also send the prediction result to the storage module 320. It will be appreciated that when the prediction module 340 is configured only to predict the blood alcohol concentration-time curve, the prediction result may include only that curve.
The correction module 360 may be configured to adjust the prediction result based on the blood alcohol concentration parameter and acquisition time parameter obtained by the perception module 310. After the correction module 360 obtains a post-drinking blood alcohol concentration parameter and its corresponding acquisition time parameter, it may adjust the prediction result based on them, obtaining an adjusted predicted metabolic rate, an adjusted predicted absorption rate, and an adjusted C-T curve.
For example, the correction module 360 may obtain the user's actual blood alcohol concentration-time curve from multiple sets of blood alcohol concentration parameters and their corresponding acquisition time parameters. The correction module 360 then adjusts the prediction result based on the difference between the actual C-T curve and the C-T curve obtained by the prediction module 340, yielding the adjusted predicted metabolic rate, adjusted predicted absorption rate, and adjusted C-T curve.
For another example, the correction module 360 may obtain, from multiple sets of blood alcohol concentration parameters and their corresponding acquisition time parameters, an error value between the blood alcohol concentration on the predicted blood alcohol concentration-time curve and the actual blood alcohol concentration. The correction module 360 may add this error value to all blood alcohol concentration values on the curve to obtain a corrected blood alcohol concentration-time curve.
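One hedged reading of this constant-offset correction, continuing the sketches above (`curve` is the predicted C-T curve; measurement times are hours after drinking started):

```python
def corrected_curve(curve, measurements):
    """Shift the predicted C-T curve by the mean measurement error."""
    def predicted_at(t_q):
        # predicted BAC at the curve sample closest to the query time
        return min(curve, key=lambda p: abs(p[0] - t_q))[1]
    error = sum(bac - predicted_at(t) for t, bac in measurements) / len(measurements)
    return [(t, c + error) for t, c in curve]

# e.g. two post-drinking measurements: (hours_since_start, measured_bac)
adjusted = corrected_curve(curve, [(1.5, 70.0), (2.0, 62.0)])
```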
The correction module 360 may derive the predicted sobering-up time based on the adjusted blood alcohol concentration-time curve.
The correction module 360 may send the correction result to the display module 350. The correction result may include, but is not limited to, the adjusted predicted metabolic rate, the adjusted predicted absorption rate, the predicted sobering-up time, and the adjusted blood alcohol concentration-time curve. The correction module 360 may also send the correction result to the storage module 320.
In one possible implementation, the correction module 360 may derive the predicted sobering-up time directly based on the prediction result, the blood alcohol concentration parameter, and the acquisition time parameter. In that case, the correction result includes only the predicted sobering-up time.
In another possible implementation, the correction module 360 is not included in the electronic device 100. In that case, the prediction module 340 of the electronic device 100 may obtain the predicted sobering-up time directly based on the prediction result, the blood alcohol concentration parameter, and the acquisition time parameter.
The display module 350 may be used to display the predicted sobering-up time on the display screen 194 of the electronic device 100. Optionally, the display module 350 may also display the predicted metabolic rate and predicted absorption rate, as well as the user's historical metabolic rate and historical absorption rate. The display module 350 may further display prompt information indicating the difference between the current and historical metabolic rates, and between the current and historical absorption rates.
In one possible implementation, when the electronic device 100 is used to predict the user's ingestible wine volume, the perception module 310 may be used to obtain the desired sobering-up time input by the user, the physiological information parameters, and the alcohol content parameter. For how the perception module 310 obtains the physiological information parameters and the alcohol content parameter, reference may be made to the embodiments of the perception module 310 described above, which are not repeated here. The perception module 310 may send the acquired desired sobering-up time, physiological information parameters, and alcohol content parameter to the prediction module 340. The prediction module 340 may use the alcohol content parameter and the physiological information parameters as inputs to the alcohol prediction model to derive the predicted metabolic rate and the predicted absorption rate. The prediction module 340 may then obtain the maximum blood alcohol concentration from the predicted metabolic rate, the predicted absorption rate, the desired sobering-up time, the threshold blood alcohol concentration, and equation 1. It will be appreciated that once the maximum blood alcohol concentration is obtained, the prediction module 340 can also derive the C-T curve from the maximum blood alcohol concentration, the predicted metabolic rate, and the predicted absorption rate. The prediction module 340 then obtains the ingestible wine volume from the maximum blood alcohol concentration, the user's weight in the physiological information parameters, and equation 2. The prediction module 340 may send the ingestible wine volume to the display module 350, which may display it.
In another possible implementation, the prediction module 340 may directly derive the ingestible wine volume based on one or more of the physiological information parameters and the alcohol content parameter, together with the desired sobering-up time. For example, the training module 330 may train an ingestible-wine-volume prediction model that takes one or more of the physiological information parameters, the alcohol content parameter, and the desired sobering-up time as inputs and the ingestible wine volume as output. The prediction module 340 may then obtain the ingestible wine volume from this model based on the required input parameters most recently acquired by the perception module 310.
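As an illustrative inversion of equations (1) and (2): the application derives the maximum blood alcohol concentration from equation 1 directly, and the bisection below is merely one concrete way to realize that step. It reuses the `ct_curve` helper and assumed rates from the earlier sketch:

```python
def ingestible_volume_ml(desired_sober_h, abv, weight_kg, v_m, k_a,
                         k_m=5.0, threshold=20.0, r=0.75):
    """Find the intake volume whose C-T curve falls below `threshold`
    no later than `desired_sober_h` hours after drinking starts.

    Bisects on the maximum BAC c0 (the threshold-crossing time grows with
    c0), then maps c0 back to a volume by inverting equation (2)."""
    def sober_h(c0):
        above = [t for t, c in ct_curve(c0, v_m, k_a, k_m) if c >= threshold]
        return above[-1] if above else 0.0   # last moment at/above threshold
    lo, hi = 0.0, 500.0                      # c0 search interval, mg/100 ml
    for _ in range(40):
        mid = (lo + hi) / 2.0
        if sober_h(mid) <= desired_sober_h:
            lo = mid                         # sober in time: can drink more
        else:
            hi = mid
    return lo * r * weight_kg / (0.8 * abv * 100.0)

print(ingestible_volume_ml(3.0, abv=0.20, weight_kg=83.0, v_m=12.0, k_a=3.0))
```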
Next, an interface schematic diagram of a detection method provided in an embodiment of the present application is described.
For example, as shown in fig. 5A, the electronic device 100 may display a desktop 501. The desktop 501 may include a plurality of application icons, such as an alcohol detection application icon 502. The alcohol detection application icon 502 may be used to trigger display of an interface of the alcohol detection application (e.g., the alcohol detection interface 510 shown in fig. 5B). The alcohol detection application may be used to derive the predicted sobering-up time or the ingestible wine volume. A status bar may also be displayed above the desktop 501, in which a Bluetooth icon may be shown. The Bluetooth icon indicates that the electronic device 100 has established a communication connection with the electronic device 200.
The electronic device 100 receives input (e.g., a click) from the user on the alcohol detection application icon 502, in response to which the electronic device 100 may display an alcohol detection interface 510 as shown in fig. 5B.
As shown in fig. 5B, the alcohol detection interface 510 may include a user parameter field 511, which contains information such as the user's gender, height, weight, and sleep duration. The sleep duration in the user parameter field 511 may be obtained by the electronic device 100 from the electronic device 200; parameters such as gender, height, and weight may be pre-stored in the electronic device 100 or entered by the user. The electronic device 100 may receive user input to modify the parameters in the user parameter field 511. The alcohol detection interface 510 may also include a time prediction control 512 and a volume prediction control 513: the time prediction control 512 may be used to predict the user's sobering-up time, and the volume prediction control 513 may be used to predict the volume of wine the user can ingest.
Upon receiving a user input on the time prediction control 512, the electronic device 100 may, in response, display a detection prompt interface 530 as shown in fig. 5C.
As shown in fig. 5C, the detection prompt interface 530 includes a prompt box 531. The prompt box 531 displays prompt information that can be used to prompt the user to exhale toward the alcohol sensor, so that the electronic device 100 can acquire the user's blood alcohol concentration through the alcohol sensor. The prompt information may include, but is not limited to, text, animation, pictures, voice, and so on. For example, the prompt information may include a picture-type prompt, as shown in fig. 5C, indicating the location of the alcohol sensor, together with a text prompt such as: "Detecting blood alcohol concentration. Please exhale toward the alcohol sensor indicated by the arrow."
It will be appreciated that the prompt box 531 shown in fig. 5C is displayed when the electronic device 100 has established a communication connection with an electronic device 200 that includes an alcohol sensor. In some embodiments, the electronic device 100 itself carries an alcohol sensor, and may prompt the user to exhale toward its own alcohol sensor to obtain the blood alcohol concentration. In other embodiments, where the electronic device 100 is not communicatively connected to any electronic device including an alcohol sensor, the electronic device 100 may prompt the user to measure the blood alcohol concentration and input it into the electronic device 100 manually.
When the electronic device 200 detects the user's blood alcohol concentration, it may transmit the blood alcohol concentration and its acquisition time to the electronic device 100. After receiving the blood alcohol concentration and acquisition time from the electronic device 200, the electronic device 100 may display a time prediction interface 540 as shown in fig. 5D.
As shown in fig. 5D, a wine parameter column 541 may be displayed in the time prediction interface 540. The wine parameter column 541 may be used to display the volume and alcohol content of the wine the user drinks. The wine parameter column 541 may include a wine parameter entry 542, which includes a photo recognition icon 542A. The photo recognition icon 542A may be used to trigger the electronic device 100 to activate the camera and recognize the picture captured by the camera, so as to obtain the volume and alcohol content of the wine drunk by the user. It should be noted that the electronic device 100 may also receive user input and display the wine volume and alcohol content entered by the user in the wine parameter entry 542. An add key may also be included in the wine parameter column 541; it may be used to trigger the electronic device 100 to display another wine parameter entry above or below the wine parameter entry 542. In this way, the electronic device 100 can collect parameters for several types of drinks. The time prediction interface 540 may also include a detection record column 544, a re-input key 545, and a start prediction key 546.
The detection record column 544 may be used to display the user's blood alcohol concentration, which may be sent by the electronic device 200 to the electronic device 100 or entered manually by the user. The detection record column 544 may display one or more detection records, including a detection record 544A that contains a blood alcohol concentration and its acquisition time. Optionally, the electronic device 100 may receive user input to alter the values in a detection record. The re-input key 545 may be used to trigger the electronic device 100 to notify the electronic device 200 to re-detect the user's blood alcohol concentration. It will be appreciated that when the electronic device 100 does not include an alcohol sensor and is not in communication with an electronic device 200 that includes one, the re-input key 545 may instead add a new detection record to the detection record column 544, in which the user may enter a blood alcohol concentration and the corresponding acquisition time. The start prediction key 546 may be used to trigger the electronic device 100 to obtain the predicted sobering-up time based on the obtained parameters.
The electronic device 100 may receive a user input on the photo recognition icon 542A shown in fig. 5D and, in response, display a photo recognition interface 550 as shown in fig. 5E.
As shown in fig. 5E, the photo recognition interface 550 displays the picture captured by the camera of the electronic device 100. The photo recognition interface 550 may also include the recognized alcohol content; for example, in fig. 5E the alcohol content of 20% is shown as text alongside the wine bottle. When the alcohol content is not identified on the bottle's packaging, the electronic device 100 may recognize the packaging information of the wine (e.g., brand, name, etc.) and obtain the alcohol content of that brand based on the packaging information. The photo recognition interface 550 may also include the recognized volume of the container holding the wine; for example, the electronic device 100 recognizes that the volume of the wine bottle is 220 ml. Optionally, the electronic device 100 may also display the number of containers near the volume information and may receive user input to modify that number. In this way, the total volume of wine ingested by the user can be obtained. It will be appreciated that when the volume is not identified on the bottle's packaging, the electronic device 100 may likewise recognize the packaging information (e.g., brand, name, etc.) and obtain the bottle volume of that brand of wine based on it.
The photo recognition interface 550 may also include a re-recognition key 551 and a confirm key 552. The re-recognition key 551 may be used to trigger the electronic device 100 to re-recognize the relevant information in the image currently displayed on the photo recognition interface 550. The confirm key 552 may be used to confirm the recognition result.
The electronic device 100 may receive a user input on the confirm key 552 shown in fig. 5E and, in response, display the time prediction interface 540 shown in fig. 5F. The wine parameter entry 542 of the time prediction interface 540 now also displays the value of the wine volume and the value of the alcohol content. The electronic device 100 may also receive a user input on the re-input key 545 shown in fig. 5F, in response to which the electronic device 100 may notify the electronic device 200 to re-acquire the user's blood alcohol concentration. It will be appreciated that the electronic device 100 may also display prompt information; for its role and content, reference may be made to the prompt information shown in fig. 5C, which is not repeated here. After receiving the newly acquired blood alcohol concentration and its acquisition time from the electronic device 200, the electronic device 100 may display the detection record 544B shown in fig. 5G. After receiving a user input on the start prediction key 546 shown in fig. 5G, the electronic device 100 may, in response, calculate the predicted sobering-up time. The electronic device 100 may obtain the C-T curve based on the physiological information parameters, the intake wine parameters, and the stored alcohol prediction model, and may then obtain the predicted sobering-up time based on the C-T curve, the user's blood alcohol concentration, and the acquisition time; for details, reference may be made to the embodiment shown in fig. 3, which is not repeated here. Here, the predicted sobering-up time obtained by the electronic device 100 is 00:45 the next day. After obtaining the predicted sobering-up time, the electronic device 100 may display a prediction result interface 570 as shown in fig. 5H.
As shown in fig. 5H, the prediction result interface 570 may include result information 572. The result information 572 includes the predicted sobering-up time and may be one or more of text, picture, voice, and so on. For example, the result information 572 may be the text: "After 5.1 h your blood alcohol concentration will have fallen to 20 mg/100 ml; the sobering-up time is 00:45 a.m." Optionally, the prediction result interface 570 may further include a blood alcohol concentration-time graph 571, which may be used to show the user's current blood alcohol concentration, the current time, and the predicted sobering-up time. In this way, the electronic device 100 can display the change in the user's blood alcohol concentration more intuitively through the graph 571, thereby reflecting the user's sobering-up time.
Optionally, the prediction result interface 570 may further include the user's alcohol absorption rate and alcohol metabolism rate over a preset period (e.g., one month), together with their change curves. In this way, the user can review how his or her alcohol absorption rate and alcohol metabolism rate have changed and adjust daily routine, drinking habits, and the like accordingly.
It is understood that the electronic device 100 may obtain the prediction result based on only one detection record. It will also be appreciated that, because every blood alcohol concentration on the curve other than the maximum corresponds to two acquisition times, a prediction result based on a single detection record carries a degree of time ambiguity.
In some embodiments, the electronic device 100 may derive the predicted metabolic rate and the predicted absorption rate from the alcohol prediction model based on the physiological information parameters and the alcohol content parameter. The electronic device 100 may then derive the ingestible wine volume based on the desired sobering-up time, the predicted metabolic rate, and the predicted absorption rate.
For example, upon receiving a user input on the volume prediction control 513 shown in fig. 5B, the electronic device 100 may, in response, display a time prediction interface 601 as shown in fig. 6A. The time prediction interface 601 may include an alcohol content field 602, which may be used to display the alcohol content of the wine to be ingested. The alcohol content field 602 includes a photo recognition icon 602A, which may be used to trigger the electronic device 100 to start the camera and recognize the image captured by it, so as to obtain the alcohol content of the wine the user will drink. For details of recognizing the alcohol content, reference may be made to the embodiment shown in fig. 5E, which is not repeated here. The electronic device 100 may also receive user input and display the entered alcohol content in the alcohol content field 602. Here, the alcohol content field 602 shows an alcohol content value of 20%.
The time prediction interface 601 may also include a desired time column 603, which may be used to display the desired sobering-up time. The desired time column 603 may include a time wheel that receives user input: adjusting the numbers on the time wheel yields the desired sobering-up time. The desired time column 603 may also display the specific value of the desired sobering-up time. It should be noted that the desired time column is not limited to this form; in practice it may take other forms, for example an input box into which the user types the desired sobering-up time. The embodiments of the present application are not limited in this regard. Here, the desired time column 603 displays the desired sobering-up time "Beijing time 19:35".
In one possible implementation, the electronic device 100 may obtain the time of the user's trip or work by querying the user's calendar or memo, and take the time as the desired sobering-up time.
The time prediction interface 601 may also include a start prediction key 604. The start prediction key 604 may be used to trigger the electronic device 100 to predict the ingestible wine volume. Upon receiving a user input on the start prediction key 604, the electronic device 100 may derive the maximum blood alcohol concentration based on the desired sobering-up time, the predicted metabolic rate, the predicted absorption rate, and the threshold blood alcohol concentration, and then obtain the ingestible wine volume through equation 2 of fig. 3 based on the alcohol content parameter, the user weight parameter, and the maximum blood alcohol concentration. After obtaining the ingestible wine volume, the electronic device 100 may display a prediction result interface 610 as shown in fig. 6B.
As shown in fig. 6B, the prediction result interface 610 may include result information 611. The result information 611 includes the ingestible wine volume and may be one or more of text, picture, voice, and so on. For example, the result information 611 may be the text: "To sober up after 3 hours, the drinkable wine volume is about 82 ml." Optionally, the prediction result interface 610 may also include a blood alcohol concentration-time graph, which may be used to illustrate the predicted change of blood alcohol concentration over time.
The following describes a flow chart of a detection method provided in the embodiment of the present application.
Based on the detection method provided in the embodiments of the present application, the electronic device 100 may obtain the predicted sobering-up time based on the physiological information parameters, the alcohol content parameter, the wine volume parameter, the blood alcohol concentration parameter, and the acquisition time parameter of that blood alcohol concentration parameter. The electronic device 100 may also derive the ingestible wine volume based on the desired sobering-up time, the physiological information parameters, and the alcohol content parameter. Because the electronic device 100 uses the user's own physical parameters, the predicted sobering-up time and the ingestible wine volume are more accurate. With the predicted sobering-up time, the user can arrange work or travel after sobering up; with the ingestible wine volume, the user can drink in moderation without affecting the journey.
Illustratively, as shown in fig. 7, the method includes:
S701, the electronic device 100 acquires the physiological information parameters, intake wine parameters, blood alcohol concentration parameter, and acquisition time parameter.
The physiological information parameters may include long-term memory-type parameters (e.g., gender) and short-term memory-type parameters (e.g., height, weight, sleep time). The physiological information parameters may be entered by the user. Alternatively, the electronic device 100 may establish a connection with an electronic device carrying a body movement recorder (e.g., the electronic device 200) and obtain the user's sleep time through it. Optionally, the electronic device 100 may also be connected to a body fat scale and obtain the user's weight through it.
The intake wine parameters may include the wine volume parameter and the alcohol content parameter. The intake wine parameters may be entered by the user. Alternatively, the electronic device 100 may identify the alcohol content and volume of the ingested wine by photographing: the electronic device 100 may capture an image of the container holding the wine through the camera and obtain the intake wine parameters from the container image through an image recognition algorithm. For the specific steps by which the electronic device 100 obtains the intake wine parameters, reference may be made to the embodiment shown in fig. 5E, which is not repeated here.
The blood alcohol concentration parameters and the acquisition time parameters are in one-to-one correspondence, and both can be input by the user. Alternatively, the electronic device 100 may establish a connection with an electronic device carrying an alcohol sensor (e.g., the electronic device 200) and obtain the blood alcohol concentration parameter and the acquisition time parameter through it; reference may be made to the embodiment shown in fig. 5C, which is not repeated here.
In one possible implementation, after receiving a first input of the user, the electronic device 100 performs step S701 in response to it. The first input may include, but is not limited to, a single click, a double click, a long press, etc. For example, the first input may be an input on the alcohol detection application icon 502 shown in fig. 5A.
In one possible implementation, after receiving a second input of the user, the electronic device 100, in response, determines the predicted sobering-up time based on the physiological information parameters, the intake wine parameters, the blood alcohol concentration parameter, and the acquisition time parameter. In some embodiments, the electronic device 100 may perform steps S702 and S703 in response to the second input. The second input may include, but is not limited to, a single click, a double click, a long press, etc. For example, the second input may be an input on the start prediction key 546 shown in fig. 5G.
S702, the electronic device 100 may obtain the blood alcohol concentration-time curve based on the physiological information parameters, the intake wine parameters, and the alcohol prediction model.
Specifically, the electronic device 100 may use the physiological information parameters and the alcohol content parameter as inputs of the alcohol prediction model to obtain the predicted absorption rate and the predicted metabolic rate. The electronic device 100 may also obtain the maximum blood alcohol concentration through equation 2 of fig. 3 above, based on the alcohol content parameter, the wine volume parameter, and the user weight parameter. The electronic device 100 then obtains the blood alcohol concentration-time curve through equation 1 of fig. 3, based on the maximum blood alcohol concentration, the predicted absorption rate, and the predicted metabolic rate.
S703, the electronic device 100 may obtain the predicted sobering-up time based on the blood alcohol concentration-time curve, the blood alcohol concentration parameter, and the acquisition time parameter.
The electronic device 100 may determine, from the blood alcohol concentration parameter and the C-T curve, the position of the indicated blood alcohol concentration on the C-T curve, that is, the time point on the C-T curve corresponding to the blood alcohol concentration parameter. The electronic device 100 may then obtain the time difference between the time point corresponding to the threshold blood alcohol concentration and the time point corresponding to the measured blood alcohol concentration, and add this time difference to the time point indicated by the acquisition time parameter to obtain the predicted sobering-up time.
In one possible implementation, the electronic device 100 may obtain, from multiple sets of blood alcohol concentration parameters and their corresponding acquisition time parameters, an error value between the predicted blood alcohol concentration on the blood alcohol concentration-time curve and the actual blood alcohol concentration. The electronic device 100 may add this error value to all blood alcohol concentration values on the curve to obtain a corrected blood alcohol concentration-time curve, and then obtain the predicted sobering-up time based on the corrected curve. In this way, the electronic device 100 can obtain a more accurate predicted sobering-up time.
It is understood that the electronic device 100 may also derive a corrected predicted metabolic rate and a corrected predicted absorption rate based on the corrected blood alcohol concentration-time curve.
It should be noted that the electronic device 100 may store the physiological information parameters, the intake wine parameters, the predicted absorption rate, the predicted metabolic rate, the blood alcohol concentration-time curve, the corrected predicted absorption rate, the corrected predicted metabolic rate, and the corrected blood alcohol concentration-time curve, and train the alcohol prediction model based on the stored data. That is, the electronic device 100 may adjust the model parameters of the alcohol prediction model based on the error between the corrected predicted metabolic rate and the predicted metabolic rate, and the error between the corrected predicted absorption rate and the predicted absorption rate. The electronic device 100 may also calculate the accuracy of the alcohol prediction model with the adjusted model parameters, and may store the model after determining that its accuracy reaches a preset threshold.
In some embodiments, the electronic device 100 may re-train the model at preset intervals (e.g., 1 month), or the electronic device 100 may train the model after each predicted sobering-up time or ingestible volume of wine is obtained.
S704, the electronic device 100 displays the predicted sobering-up time.
The electronic device 100 may display the predicted sobering-up time. Optionally, the electronic device 100 may also display the predicted absorption rate and the predicted metabolic rate. Specifically, reference may be made to the embodiments shown in fig. 5A to 5H, which are not described herein.
Alternatively, the electronic device 100 may display a prompt message when detecting a driving operation of the user, where the prompt message may be used to prompt the user that they are in a drunk state and should not drive. For example, the electronic device 100 may determine the user's driving time by querying the user's calendar or memo.
It will be appreciated that when the electronic device 100 predicts an ingestible wine volume, the electronic device 100 may, after obtaining the predicted metabolic rate and the predicted absorption rate, obtain the maximum blood alcohol concentration directly based on the desired sobering-up time, the predicted metabolic rate, and the predicted absorption rate, and then obtain the ingestible wine volume based on the maximum blood alcohol concentration, the alcohol degree parameter, and the user weight parameter. The desired sobering-up time may be input by the user. In one possible implementation, the electronic device 100 may obtain the time of the user's trip or work by querying the user's calendar or memo, and take that time as the desired sobering-up time.
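The patent does not give the conversion formulas; the sketch below fills them in with the standard Widmark relation purely as an assumption, to show how a maximum blood alcohol concentration could be turned into an ingestible wine volume.

```python
def ingestible_volume_ml(desired_hours, metabolic_rate, alcohol_degree,
                         weight_kg, r=0.68, threshold_bac=20.0):
    """Hypothetical volume calculation (all constants are assumptions).

    desired_hours   : desired sobering-up time minus the current time (h)
    metabolic_rate  : predicted BAC decline per hour (mg/100 ml per h)
    alcohol_degree  : alcohol degree parameter, e.g. 0.12 for 12% vol
    weight_kg       : user weight parameter
    r               : Widmark distribution ratio (assumed constant)
    threshold_bac   : BAC regarded as sober (mg/100 ml)
    """
    # Highest peak concentration that still decays below the threshold in time.
    max_bac = threshold_bac + metabolic_rate * desired_hours
    # Widmark: BAC (mg/100 ml) ~= 100 * grams_of_ethanol / (weight_kg * r)
    alcohol_grams = max_bac * weight_kg * r / 100.0
    # Ethanol density ~0.789 g/ml; divide by the volume fraction of alcohol.
    return alcohol_grams / (0.789 * alcohol_degree)
```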
The electronic device 100 may also display the ingestible wine volume; specifically, reference may be made to the embodiments shown in fig. 6A-6B, which are not described herein.
Alternatively, the electronic device 100 may send the physiological information parameters and the intake wine parameters to the server 300, and the server 300 may perform the calculation of the predicted sobering-up time or the ingestible wine volume, and the training of the alcohol prediction model. The server 300 may also be used to store the parameters described above. In this way, computing and storage resources of the electronic device 100 may be conserved.
In one possible implementation, the electronic device 100 may obtain the predicted sobering-up time based on one or more of the intake wine parameters, physiological information parameters, blood alcohol concentration parameters, and acquisition time parameters described above. The electronic device 100 may also obtain the user's ingestible wine volume based on the desired sobering-up time and one or more of the alcohol degree parameter and the physiological information parameters. Thus, the electronic device 100 may obtain the predicted sobering-up time or the ingestible wine volume even when only one or more of the above parameters are acquired.
In some possible application scenarios, fatigue driving has become a major cause of traffic accidents: drivers who take to the road in a fatigued state cause unnecessary casualties and economic losses. At present, detecting whether a driver is fatigued has become a problem to be solved urgently.
Accordingly, embodiments of the present application provide a detection method. The electronic device 100 may obtain behavior data of the user before the user drives out. The electronic device 100 may derive the user's pre-driving fatigue degree based on the behavior data. The electronic device 100 may also obtain on-vehicle travel data and physical state data, and may obtain the user's in-driving fatigue degree based on the physical state data and the on-vehicle travel data. The electronic device 100 may obtain the user's current fatigue degree (also referred to as the final fatigue degree) based on the pre-driving fatigue degree and the in-driving fatigue degree. The electronic device 100 may also obtain and display driving advice based on the final fatigue degree. The driving advice may include, but is not limited to, a recommended driving duration, which indicates the total duration the user can drive before reaching a preset fatigue degree. In this way, the electronic device 100 can combine data from before and during driving to obtain the user's fatigue degree and give corresponding driving advice, so as to reduce the time the user spends driving while fatigued, reduce the probability of driving accidents, and mitigate the fatigue driving problem.
For example, the user's fatigue degree may be classified into light fatigue, moderate fatigue, and heavy fatigue. When the electronic device 100 determines, based on the final fatigue degree, that the user is lightly fatigued, the driving advice may include a recommended driving duration obtained by the electronic device 100 in combination with the user's previous vehicle driving data; the recommended driving duration is the driving duration before the user reaches heavy fatigue. When the electronic device 100 determines that the user is moderately fatigued, the driving advice may include the recommended driving duration and awake prompt information, which may be used to remind the user to lower the temperature in the vehicle, drink a refreshing beverage, play refreshing music, and so on. When the electronic device 100 determines that the user is heavily fatigued, the driving advice may include parking prompt information, which may be used to prompt the user to stop for a break as soon as possible; the driving advice may further include a recommended driving duration whose value is zero at this time. Optionally, the electronic device 100 may also plan the nearest parking place and display navigation information to that parking place.
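A compact Python sketch of the mapping from final fatigue degree to driving advice described above; the level names and message strings are illustrative, not the patent's wording.

```python
def driving_advice(final_fatigue_level, recommended_minutes):
    """final_fatigue_level: 'light', 'moderate' or 'heavy' (illustrative)."""
    if final_fatigue_level == "light":
        return {"recommended_driving_minutes": recommended_minutes}
    if final_fatigue_level == "moderate":
        return {"recommended_driving_minutes": recommended_minutes,
                "awake_prompt": ("Lower the in-vehicle temperature, drink a "
                                 "refreshing beverage, or play refreshing music.")}
    # Heavy fatigue: the recommended driving duration drops to zero.
    return {"recommended_driving_minutes": 0,
            "parking_prompt": "Stop for a break as soon as possible.",
            "plan_nearest_parking": True}
```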
It should be noted that, before the user drives the vehicle, the electronic device 100 has not obtained on-vehicle travel data, and the electronic device 100 may obtain the pre-driving fatigue degree based on the behavior data alone. The electronic device 100 may then combine the stored historical data of the user's previous driving (e.g., final fatigue degree, driving duration, etc.) to obtain driving advice. The driving advice may include a recommended driving duration, which indicates how long the user can drive before heavy fatigue may occur. In this way, the electronic device 100 may recommend, before the user drives, the duration the user can drive, so as to reduce the probability of traffic accidents.
A communication system 20 provided in an embodiment of the present application is described next.
As shown in fig. 8, the communication system 20 may include, but is not limited to, an electronic device 100, an electronic device 500, an electronic device 600, and an electronic device 700. Wherein the electronic device 100 may establish a communication connection (e.g., a bluetooth connection, etc.) with the electronic device 500. The electronic device 100 may also establish a communication connection with the electronic device 600. Electronic device 600 may establish a communication connection with electronic device 700.
Specifically, in the communication system 20, the electronic device 700 is an electronic device including a camera (e.g., an in-vehicle camera, a driving recorder, etc.), and may be used to acquire facial image data of a user. The electronic device 700 may also transmit the facial image data to the electronic device 600.
The electronic device 600 may be used to obtain driving data. For example, the electronic device 600 may be an in-vehicle head unit, an in-vehicle tablet, or the like. The driving data may be used to represent the environmental conditions in the vehicle, the road conditions, the user's driving state, and the like, while the user is driving. The driving data may include, but is not limited to, light, noise, and temperature within the vehicle, the vehicle's speed, acceleration, variance of speed, and variance of acceleration, the frequency of lane departure, the following distance, road conditions, the user's facial image data, the time at which the user drives the vehicle, and the driving duration, etc. The electronic device 600 may send the driving data to the electronic device 100.
Optionally, the electronic device 600 may also be configured to receive the facial image data transmitted by the electronic device 700, and to obtain user face data via image recognition based on the user's facial image data. The user face data may include, but is not limited to, the focus condition of the user's eyes, head movement (e.g., head-lowering frequency), blink frequency, number of yawns, and the like. The electronic device 600 may send the user face data to the electronic device 100.
Alternatively, the electronic device 700 may obtain the user face data based on the face image data after acquiring the face image data of the user, and transmit the user face data to the electronic device 600.
The electronic device 500 may be configured to detect the user's physical condition in real time and obtain user data. The user data may be used to characterize the user's physical condition and behavior. The electronic device 500 may be a wearable device (e.g., a smart watch or a smart bracelet), or the like. The user data may include stable user data and fluctuating user data. The stable user data may be used to indicate physical characteristic data that does not fluctuate over a short period (e.g., height, gender, age, weight, etc.). The fluctuating user data may be used to indicate physical condition data that fluctuates over a short period. That is, the electronic device 500 may be used to obtain the fluctuating user data, which may include, but is not limited to, the user's heart rate, body temperature, blood glucose, sleep quality (e.g., identified by sleep duration), exercise conditions (including exercise duration, exercise intensity, etc.), blood oxygen saturation, and so forth. The electronic device 500 may transmit the acquired fluctuating user data to the electronic device 100.
The electronic device 500 may also be used to obtain user data related to user behaviors, which may include, but are not limited to, sleeping, sitting still, walking, running, and the like. In the following embodiments, only the four user behaviors of sleeping, sitting still, walking, and running are described. It will be appreciated that in practical applications other user behaviors (e.g., lying down) may also be included, or the above behaviors may be subdivided (e.g., walking may be divided into strolling, brisk walking, etc.); this is not limited in the embodiments of the present application. It is appreciated that the electronic device 500 may identify user behavior by detecting user data such as the user's heart rate, body temperature, and movement. The electronic device 500 may send these user data to the electronic device 100.
It should be noted that the fluctuating user data may also include the user's weight, body fat, etc., which the electronic device 100 may acquire through a body fat scale or through user input. The stable user data acquired by the electronic device 100 may also be obtained through user input. Optionally, the electronic device 100 may predict one or more of the user's gender, age, height, and weight from the facial image data acquired by the electronic device 700, using an image recognition algorithm.
The electronic device 100 may obtain the behavior data and part of the physical state data from the user data obtained by the electronic device 500, and may itself acquire another part of the user's physical state data. The electronic device 100 may also obtain the driving data through the electronic device 600 and derive the on-vehicle travel data from it. The user data and the driving data acquired by the electronic device 100 may also be obtained through user input.
The electronic device 100 may also be used to obtain the pre-driving fatigue degree. Specifically, the electronic device 100 may use the behavior data in sequence form (also referred to as a behavior sequence) as the input of the first fatigue model to obtain the pre-driving fatigue degree. The output of the first fatigue model is determined by both the number of user behaviors and the order of user behaviors in the input behavior sequence. If the order of the user behaviors in the behavior sequence differs, the pre-driving fatigue degrees obtained by the first fatigue model differ; for example, the behavior sequence <running, sleeping> and the behavior sequence <sleeping, running> yield different pre-driving fatigue degrees. If the number of user behaviors in the behavior sequence differs, the pre-driving fatigue degrees obtained also differ; for example, the behavior sequence <sitting still, running, sleeping> and the behavior sequence <sitting still, running> yield different pre-driving fatigue degrees. The first fatigue model may be, for example, a recurrent neural network (RNN) model, which is suited to processing data with a time-series relationship.
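A minimal PyTorch sketch of what such a first fatigue model could look like; the behavior vocabulary, layer sizes, and the plain RNN cell are assumptions, not the patent's architecture.

```python
import torch
import torch.nn as nn

BEHAVIORS = {"sleeping": 0, "sitting_still": 1, "walking": 2, "running": 3}

class PreDrivingFatigueRNN(nn.Module):
    """Order-sensitive regression over a behavior sequence."""
    def __init__(self, n_behaviors=4, embed_dim=8, hidden_dim=16):
        super().__init__()
        self.embed = nn.Embedding(n_behaviors, embed_dim)
        self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)   # scalar fatigue degree

    def forward(self, behavior_ids):           # (batch, seq_len) int tensor
        _, h_last = self.rnn(self.embed(behavior_ids))
        return self.head(h_last[-1]).squeeze(-1)

model = PreDrivingFatigueRNN()
# <running, sleeping> and <sleeping, running> give different outputs,
# because the hidden state depends on the order of the inputs.
a = model(torch.tensor([[BEHAVIORS["running"], BEHAVIORS["sleeping"]]]))
b = model(torch.tensor([[BEHAVIORS["sleeping"], BEHAVIORS["running"]]]))
```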
The electronic device 100 may also be used to obtain the in-driving fatigue degree. Specifically, the electronic device 100 may use the on-vehicle travel data and the physical state data as the input of the second fatigue model to obtain the in-driving fatigue degree. The second fatigue model can process input data without a time-series relationship to obtain an output result. For example, the second fatigue model may be a support vector regression (SVR) model. Because the SVR model tolerates deviations caused by outliers well, small errors between the predicted result and the real result can be ignored, which makes it well suited to detecting the fatigue degree.
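A corresponding sketch of the second fatigue model using scikit-learn's SVR; the feature layout and sample values are invented for illustration.

```python
import numpy as np
from sklearn.svm import SVR

# One row per observation: fluctuating physical state data plus on-vehicle
# travel data, with no time-series structure.
# [heart_rate, body_temp, in_car_temp, noise_db, speed_variance, yawns]
X_train = np.array([[72, 36.5, 24.0, 55, 1.2, 0],
                    [88, 36.8, 27.0, 63, 2.5, 2],
                    [95, 37.1, 29.0, 70, 3.8, 5]])
y_train = np.array([0.2, 0.5, 0.9])   # labeled in-driving fatigue degrees

# The epsilon-insensitive loss ignores small deviations between prediction
# and label, which matches the outlier tolerance noted above.
svr = SVR(kernel="rbf", epsilon=0.05).fit(X_train, y_train)
in_driving_fatigue = svr.predict([[80, 36.7, 26.0, 60, 2.0, 1]])
```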
The electronic device 100 may also be used to obtain the final fatigue degree. The electronic device 100 may perform a weighted summation based on the pre-driving fatigue degree and the in-driving fatigue degree to obtain the final fatigue degree. Thereafter, the electronic device 100 may obtain driving advice based on the final fatigue degree. That is, the electronic device 100 may determine, based on the final fatigue degree, whether the user is in light fatigue, moderate fatigue, or heavy fatigue, and give corresponding driving advice according to the user's fatigue state. The driving advice may include a recommended driving duration, which is the driving duration before the user reaches heavy fatigue. Alternatively, the electronic device 100 may send the driving advice to the electronic device 600, and the electronic device 600 may display it.
The electronic device 100 may be used to store the behavior data, the physical state data, and the on-vehicle travel data, and to use these data as parameters for model training. Specifically, the electronic device 100 may obtain, from the stored on-vehicle travel data and physical state data, the fatigue degrees of the user during driving and the time intervals between different fatigue degrees. For example, the electronic device 100 may mark the user as lightly fatigued when the number of yawns within a preset time is 1-2, moderately fatigued when it is 3-5, and heavily fatigued when it is higher than 5. Then, during training, the electronic device 100 may input the behavior data, the physical state data, and the on-vehicle travel data into the corresponding models to obtain the final fatigue degree. The electronic device 100 may take the marked fatigue degree as the real result and obtain an error value between the final fatigue degree and the real result. The electronic device 100 may adjust the parameters of the models based on the error value until the error value is less than a preset threshold, at which point model training is complete. The electronic device 100 may use a model whose error value is less than the preset threshold for detecting the user's fatigue degree. It should be noted that marking the user's fatigue degree by the number of yawns is only an example; the electronic device 100 may mark the fatigue degree using other data (e.g., the user's head-lowering count), which is not limited in this application.
The electronic device 100 may also store the pre-driving fatigue degree, the in-driving fatigue degree, and the final fatigue degree obtained based on the behavior data, the physical state data, and the on-vehicle travel data. It should be noted that, since the electronic device 100 may collect the on-vehicle travel data and the physical state data in real time while the user drives, the electronic device 100 may, at preset intervals (e.g., every 15 minutes), recalculate the in-driving fatigue degree and the final fatigue degree based on the on-vehicle travel data and the physical state data acquired during the preceding interval. The electronic device 100 may store the physical state data, on-vehicle travel data, in-driving fatigue degree, and final fatigue degree in association with the driving duration the user has driven. This makes it easy for the electronic device 100 to obtain the relationship between driving duration and the user's fatigue degree.
In some embodiments, when the electronic device 100 has not yet acquired on-vehicle travel data (i.e., when the user is not yet driving), the electronic device 100 may obtain the pre-driving fatigue degree through the first fatigue model based on the user's current behavior data. Based on this pre-driving fatigue degree and the stored relationship between pre-driving fatigue degree and driving duration, the electronic device 100 may determine the recommended driving duration before the user reaches heavy fatigue. Optionally, when the electronic device 100 can determine the total duration of the user's trip, it may also determine, from the total duration and the recommended driving duration, whether the user is likely to drive while fatigued.
For example, the electronic device 100 may determine, from a plurality of stored pre-driving fatigue degrees (also referred to as historical pre-driving fatigue degrees), the historical pre-driving fatigue degree closest to the current pre-driving fatigue degree. The closest historical pre-driving fatigue degree may be the one whose difference from the current pre-driving fatigue degree has the smallest absolute value. Then, among the plurality of historical final fatigue degrees corresponding to the closest historical pre-driving fatigue degree, the electronic device 100 may determine the historical final fatigue degree that reached heavy fatigue earliest, and take the driving duration corresponding to that historical final fatigue degree as the recommended driving duration. It is to be appreciated that the electronic device 100 is not limited to determining the closest historical pre-driving fatigue degree from the pre-driving fatigue degree as in the example above; the electronic device 100 may also determine it from one or more of the physical state data, driving time, weather, and so on.
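The lookup just described can be sketched as follows; the record layout and the numeric fatigue scale are assumptions of this sketch.

```python
HEAVY_FATIGUE = 0.8   # illustrative threshold on the fatigue scale

def recommend_driving_minutes(pre_fatigue, history):
    """history: one record per past trip, e.g.
    {"pre_fatigue": 0.3, "samples": [(30, 0.4), (60, 0.7), (90, 0.85)]}
    where each sample pairs a driving duration (minutes) with the final
    fatigue degree logged at that time.
    """
    # Closest historical pre-driving fatigue degree by absolute difference.
    record = min(history, key=lambda r: abs(r["pre_fatigue"] - pre_fatigue))
    # Earliest sample of that trip that reached heavy fatigue.
    heavy = [minutes for minutes, fatigue in record["samples"]
             if fatigue >= HEAVY_FATIGUE]
    return min(heavy) if heavy else None   # recommended driving duration
```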
In one possible implementation, the communication system 20 also includes a server 300. The server 300 may be a cloud server, and a communication connection is established between the server 300 and the electronic device 100. The server 300 may be configured to obtain and store the above data (including physiological information parameters, behavior data, physical state data, pre-driving fatigue degree, in-driving fatigue degree, final fatigue degree, driving duration, etc.) from the electronic device 100. The server 300 may also perform model training based on the above parameters and calculate the fatigue degrees and the recommended driving duration. Alternatively, a communication connection may be established between the server 300 and the electronic device 600, and the server 300 may obtain the on-vehicle travel data from the electronic device 600. The server 300 may also transmit the driving advice and the recommended driving duration to the electronic device 600 for display.
The communication system 20 may include other communication connections, not limited to the above-described communication connection method. For example, the electronic device 100 may also establish a communication connection with the electronic device 700 and obtain user face data from the electronic device 700, which is not limited in this embodiment.
Next, a schematic block diagram of the electronic device 100 provided in the embodiment of the present application is described.
As shown in fig. 9, the electronic device 100 may include, but is not limited to, a user data acquisition module 910, an on-board data acquisition module 930, a data preprocessing module 920, a model calculation module 940, and a driving advice determination module 950.
The user data acquisition module 910 may be used to acquire user data related to the user's physical condition and behavior habits. The user data acquisition module 910 may run on a processor of the electronic device 100, or partly on a sensor (e.g., an acceleration sensor) of the electronic device 100. The user data acquisition module 910 may obtain parameters through other electronic devices (e.g., the electronic device 500, the electronic device 600, etc.) that have established a communication connection with the electronic device 100, or may obtain the relevant parameters from user input. Alternatively, the user data acquisition module 910 and the perception module 310 may be the same module.
User data related to the user's physical condition and behavior may include, but is not limited to, the user's age, gender, height, weight, body fat, heart rate, body temperature, blood glucose concentration, blood oxygen saturation, sleep quality, sleep duration, exercise intensity, and the like. Alternatively, the user data acquisition module 910 may be configured to receive data input by the user and obtain user data from it. Alternatively, the user data acquisition module 910 may acquire user data via corresponding sensors; for example, the user's movement may be acquired by an acceleration sensor, and the user's heart rate by an optical sensor. The user data acquisition module 910 may also send the user data to the data preprocessing module 920.
The on-board data acquisition module 930 may be used to acquire the user's driving data during driving. The on-board data acquisition module 930 may acquire the driving data through an electronic device (e.g., the electronic device 600) that has established a communication connection with the electronic device 100. The driving data may be used to represent the environmental conditions in the vehicle, the road conditions, the user's driving state, and the like, while the user is driving. The driving data may include, but is not limited to, light, noise, and temperature within the vehicle, the vehicle's speed, acceleration, variance of speed, and variance of acceleration, the frequency of lane departure, the following distance, road conditions, weather conditions, the user's facial image data, the time at which the user drives the vehicle, and the driving duration, among others. The on-board data acquisition module 930 may acquire the driving data through corresponding software or hardware. For example, the user's eye movement may be obtained through a camera of the electronic device 700. As another example, road conditions along the route (e.g., tidal lanes, rockfall sections) may be obtained through map resource packages and the user's real-time positioning, and the weather conditions while driving (e.g., sunny, rainy, snowy) may be obtained through a weather server. As another example, the acceleration of the vehicle may be acquired by an acceleration sensor. The on-board data acquisition module 930 may also send the driving data to the data preprocessing module 920.
The data preprocessing module 920 may be configured to receive the data collected by the user data acquisition module 910 and the on-board data acquisition module 930, and perform preprocessing operations on the received data to obtain feature data. Preprocessing operations may include, but are not limited to, outlier removal, missing value padding, data normalization, data classification, and so forth. The data preprocessing module 920 may send the feature data obtained from the received data to the model calculation module 940. The feature data may include the behavior data, the physical state data, and the on-vehicle travel data. The functional code of the data preprocessing module 920 may run on the electronic device 100, for example on a processor of the electronic device 100.
Specifically, the data preprocessing module 920 may obtain the behavior data based on the user data. The behavior data indicate the behaviors of the user, in chronological order, within a preset period (e.g., one hour) before the driving trip. For example, if the user performs running, walking, and sleeping in sequence within the preset period, the data preprocessing module 920 may obtain the user's behavior data based on the heart rate, body temperature, position, etc. in the user data, which may be expressed as <running, walking, sleeping>.
The data preprocessing module 920 may also derive the physical state data based on the user data. The physical state data are used to characterize the user's physical condition and can be classified into stable data and fluctuating data. The stable data may be used to characterize data that does not change significantly over a period of time, e.g., age, gender, height, weight, body fat, etc. The fluctuating data may be used to characterize data that fluctuates as the user's behavior and environment change, such as heart rate, body temperature, blood glucose, blood oxygen saturation, sleep quality, exercise duration, exercise intensity, and the like.
The data preprocessing module 920 may also obtain the on-vehicle travel data based on the driving data. The on-vehicle travel data may be used to characterize the surrounding conditions while the user drives the vehicle and the user's real-time driving state. The on-vehicle travel data may include surrounding environment data and user face data. The surrounding environment data characterize the in-vehicle environment (e.g., temperature, light intensity, etc.) and the vehicle's driving conditions (e.g., vehicle speed, acceleration, following distance, driving duration, etc.). The user face data may be used to characterize the user's driving state, e.g., the user's yawning frequency and nodding frequency.
In some embodiments, the driving state of the user may also be affected by the time at which the user drives the vehicle (e.g., a user driving in the early morning or at noon may be more susceptible to fatigue). The data preprocessing module 920 may therefore also record the time at which the feature data were obtained.
The model calculation module 940 may be used to calculate the user's fatigue degree. The model calculation module 940 may run on a processor of the electronic device 100; for example, the processor may be the processor 110 or an AI chip described above. The model calculation module 940 may also be configured to send the fatigue degree results to the driving advice determination module 950. Specifically, the model calculation module 940 may calculate the pre-driving fatigue degree using the behavior data as the input of the first fatigue model. The model calculation module 940 may determine the second fatigue model based on the stable data in the physical state data, and use the fluctuating data in the physical state data together with the on-vehicle travel data as the input of the second fatigue model to obtain the in-driving fatigue degree. The model calculation module 940 may weight and sum the pre-driving fatigue degree and the in-driving fatigue degree to obtain the final fatigue degree, and send the final fatigue degree to the driving advice determination module 950.
Optionally, the model calculation module 940 may determine the weights of the pre-driving fatigue degree and the in-driving fatigue degree in the final fatigue degree based on the user's driving duration. Illustratively, the weight of the in-driving fatigue degree increases with driving duration, while the weight of the pre-driving fatigue degree decreases correspondingly. For example, if the weights of the pre-driving and in-driving fatigue degrees both start at 0.5, the model calculation module 940 may, for every 30 minutes of additional driving time, increase the weight of the in-driving fatigue degree by 0.05 and decrease the weight of the pre-driving fatigue degree by 0.05. It should be noted that the model calculation module 940 is not limited to the adjustment method in this example; for instance, it may set the weight of the pre-driving fatigue degree to 0.4 and that of the in-driving fatigue degree to 0.6 when the driving duration reaches 2 hours, and adjust them to 0.2 and 0.8 when the driving duration reaches 5 hours, which is not limited in the embodiments of the present application.
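The weighted summation and the example weight schedule above can be written out as a short sketch; the weight floor is an added assumption to keep both weights above zero.

```python
def fatigue_weights(driving_minutes, w0=0.5, step=0.05, interval=30, floor=0.1):
    """Every `interval` minutes shifts `step` of weight from the pre-driving
    fatigue degree to the in-driving fatigue degree."""
    shift = min(step * (driving_minutes // interval), w0 - floor)
    w_pre = w0 - shift
    return w_pre, 1.0 - w_pre          # the two weights always sum to 1

def final_fatigue(pre, during, driving_minutes):
    w_pre, w_during = fatigue_weights(driving_minutes)
    return w_pre * pre + w_during * during
```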
It should be noted that, when the user is not driving, the model calculation module 940 can obtain only the pre-driving fatigue degree, and may send only the pre-driving fatigue degree to the driving advice determination module 950.
The driving advice determination module 950 may be used to obtain the user's travel information. After acquiring the travel information, the driving advice determination module 950 may, at the trigger time, notify the user data acquisition module 910 to send the user data to the data preprocessing module 920.
For example, the driving advice determination module 950 may obtain the user's destination and arrival time through the user's schedule, ticketing information (e.g., train tickets, airplane tickets, performance tickets, movie tickets, etc.), or the like. The destination is the place recorded in the calendar or the place of use of the ticket; the arrival time is the time recorded in the calendar, or the departure time or performance start time indicated by the ticket. For example, based on an airplane ticket, the driving advice determination module 950 may take the departure airport as the destination and the flight's departure time as the arrival time. Alternatively, the arrival time may be set a preset time (e.g., 30 minutes) earlier than the time recorded by the ticket or calendar, so that the user does not miss the trip.
The driving advice determination module 950 may, starting M hours before the arrival time (where M is 0 or more, e.g., M may be 5), determine whether the distance between the user's real-time location and the destination exceeds a distance threshold (e.g., 1 km). When the driving advice determination module 950 determines that the distance between the user's current location and the destination exceeds the distance threshold, it may determine the user's departure time based on the arrival time and the driving duration from the current location to the destination. The driving advice determination module 950 may take the departure time as the trigger time, at which it notifies the user data acquisition module 910 to send the user data to the data preprocessing module 920. In some embodiments, the driving advice determination module 950 may obtain the user's departure time directly from a calendar or a set alarm clock. As another example, the driving advice determination module 950 may obtain the user's navigation information and determine the departure time based on it.
Alternatively, the driving advice determination module 950 may take N hours before the departure time as the trigger time, where N is 0 or more (e.g., N may be 1), the trigger time falls within the M hours before the arrival time, and the trigger time is later than the current time.
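A simplified reading of the departure- and trigger-time logic as a sketch; the distance computation is omitted, and the function name and early-exit behavior are assumptions.

```python
from datetime import datetime, timedelta

def compute_trigger_time(arrival, drive_minutes, distance_km,
                         distance_threshold_km=1.0, n_hours=1.0):
    """arrival: datetime by which the user must arrive;
    drive_minutes: estimated driving time from the current location.
    Returns None if the user is already near the destination.
    """
    if distance_km <= distance_threshold_km:
        return None
    # Departure time = arrival time minus the estimated driving duration.
    departure = arrival - timedelta(minutes=drive_minutes)
    # Trigger N hours before departure so behavior data can be collected,
    # but never earlier than the current time.
    return max(departure - timedelta(hours=n_hours), datetime.now())
```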
In some embodiments, the driving advice determination module 950 may determine that the user is about to drive by detecting behaviors such as the user fastening the seat belt, closing the driver's door, releasing the hand brake, starting the ignition, stepping on the accelerator, and the like, and may take that moment as the trigger time.
The driving advice determination module 950 may obtain driving advice based on the received pre-driving fatigue degree. The driving advice may include a recommended driving duration, which indicates the total driving duration before the user reaches heavy fatigue. The driving advice determination module 950 may determine the historical pre-driving fatigue degree closest to the currently obtained pre-driving fatigue degree, then determine, among the plurality of historical final fatigue degrees corresponding to that closest historical pre-driving fatigue degree, the historical final fatigue degree that reached heavy fatigue earliest, and take the driving duration corresponding to that historical final fatigue degree as the recommended driving duration.
Optionally, when the predicted driving duration from the departure point to the destination is obtained, the driving advice determined by the driving advice determination module 950 may further include a travel prompt. Specifically, the driving advice determination module 950 may determine, based on the predicted driving duration, the fatigue degree the user may reach during driving, and obtain the travel prompt based on the current time, the departure time, and that fatigue degree. For example, if the driving advice determination module 950 determines that the user may experience light or moderate fatigue during driving and that the time difference between the current time and the departure time exceeds a time threshold (e.g., 30 minutes), the travel prompt may be used to prompt the user to rest for a period of time. If the driving advice determination module 950 determines that the user may experience light or moderate fatigue during driving and that the time difference between the current time and the departure time is less than or equal to the time threshold (e.g., 30 minutes), the travel prompt may be used to prompt the user to prepare a refreshing beverage. If the driving advice determination module 950 determines that the user may become heavily fatigued during driving, the travel prompt may be used to prompt the user to travel by other means (e.g., public transport or a designated driver service).
The driving advice determination module 950 may also be configured to determine whether the user is in a driving state. After receiving the pre-driving fatigue degree, the driving advice determination module 950 may determine, at every preset determination interval (e.g., 5 minutes), whether the user is driving, and may likewise determine at the same interval, while the user is driving, whether the user is still driving the vehicle. When the driving advice determination module 950 determines that the user is driving, it may notify the user data acquisition module 910 to send the user data to the data preprocessing module 920, and notify the on-board data acquisition module 930 to send the driving data to the data preprocessing module 920.
Alternatively, when the driving advice determination module 950 has obtained the departure time of the user through the above-mentioned schedule or ticketing information, the driving advice determination module 950 may determine whether the user is in the driving state in real time within a preset time including the departure time (for example, within 5 minutes before and after the departure time).
Alternatively, the driving advice determination module 950 may gradually decrease the preset determination time as the driving duration increases; the preset determination time cannot be reduced to 0.
The driving advice determination module 950 may obtain driving advice based on the final fatigue degree sent by the model calculation module 940. The driving advice may be used to prompt the user when they are fatigued, and may optionally include a recommended driving duration. For example, when the driving advice determination module 950 determines that the user is lightly or moderately fatigued, the driving advice may include the recommended driving duration, and may further include awake prompt information used to remind the user to lower the temperature in the vehicle, drink a refreshing beverage, play refreshing music, and the like. When the driving advice determination module 950 determines that the user is heavily fatigued, the driving advice may include parking prompt information used to prompt the user to stop for a break as soon as possible.
The driving advice determination module 950 may send the driving advice to the display module 350, and the display module 350 may display the driving advice. The relevant function code of the driving advice determination module 950 may be run on the processor of the electronic device 100.
It should be noted that the software modules illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer software modules than those described above, or may combine certain modules, split certain modules, or the like.
Next, a flow chart of a detection method provided in an embodiment of the present application is described.
Based on the detection method provided by the embodiments of the present application, the electronic device 100 can acquire the user's behavior data before the user drives, and obtain the pre-driving fatigue degree and driving advice based on the behavior data. The electronic device 100 may also acquire the user's physical state data and on-vehicle travel data while the user drives, and obtain the final fatigue degree and driving advice based on the behavior data, the physical state data, and the on-vehicle travel data. In this way, the electronic device 100 may prompt the user, before driving, whether to drive and how long driving may continue before fatigue sets in. The electronic device 100 may also detect the user's fatigue degree in real time during driving: when the user reaches light or moderate fatigue, it prompts the user to reduce fatigue through actions such as adjusting the in-vehicle temperature; when the user reaches heavy fatigue, it prompts the user to stop for a rest as soon as possible. This greatly reduces the probability of traffic accidents caused by fatigue driving.
Illustratively, as shown in FIG. 10, the method includes:
s1001, the electronic device 100 acquires travel information of the user.
The trip information may include, but is not limited to, departure time, arrival time and trigger time. The departure time is the time when the user starts driving, the arrival time is the time when the user parks, and the trigger time is the time when the electronic device 100 acquires the behavior data.
The electronic device 100 may obtain the trigger time based on the departure time.
In some embodiments, the electronic device 100 may obtain the destination point and arrival time of the user through the user's calendar, ticketing information (e.g., train ticket, airplane ticket, presentation ticket, movie ticket, etc.), and so forth. Alternatively, the arrival time may be a preset time (e.g., 30 minutes) earlier than the time recorded by the ticket or calendar, so that the user may be prevented from missing a trip.
Thereafter, the electronic device 100 may obtain the user's departure time based on the destination, the arrival time, and the user's real-time location. For example, the electronic device 100 may, starting X hours before the arrival time (where X is 0 or more, e.g., X may be 5), determine whether the distance between the user's real-time location and the destination exceeds a distance threshold (e.g., 1 km). When the electronic device 100 determines that the distance between the user's current location and the destination exceeds the distance threshold, the electronic device 100 may determine the user's departure time based on the arrival time and the driving duration from the current location to the destination.
In other embodiments, the electronic device 100 may obtain the departure time of the user directly from a calendar or a set alarm clock.
In other embodiments, the electronic device 100 may obtain the user's navigation information and determine the departure time based on it.
The electronic device 100 may take the departure time as the trigger time, or may take a time Y hours before the departure time as the trigger time, where Y is 0 or more (e.g., Y may be 1). That is, the trigger time is earlier than the departure time, and the departure time and the trigger time differ by a preset time, which may be Y hours.
Alternatively, when detecting the user's start-driving behavior, the electronic device 100 may take that moment as both the departure time and the trigger time. The start-driving behavior may include, but is not limited to, the electronic device 100 establishing a communication connection with the electronic device 600, fastening the seat belt, closing the driver's door, releasing the hand brake, starting the ignition, stepping on the accelerator, and the like.
In some embodiments, the electronic device 100 may determine, through the user's positioning, that the user is already driving the vehicle while the behavior data have not yet been acquired. The electronic device 100 may then take the time at which the user's driving behavior is detected as both the trigger time and the departure time. It is understood that, in this case, the electronic device 100 may perform step S1006 and the subsequent steps directly after performing steps S1002-S1004.
S1002, the electronic device 100 acquires behavior data of the user.
When the electronic device 100 determines that the current time is the trigger time, it may acquire the user's behavior data within a preset time (e.g., 6 hours) before the trigger time.
For example, the electronic device 100 may directly obtain, through the electronic device 500, the behavior data within a preset time before the trigger time. For another example, the electronic device 100 may obtain, through the electronic device 500, user data within a preset time before the trigger time, and obtain behavior data based on the user data.
S1003, the electronic device 100 obtains the degree of fatigue before driving by using the behavior data as an input of the first fatigue model.
Specifically, the electronic device 100 may use the behavior data as input of the first fatigue model in the form of a behavior sequence, to obtain the fatigue degree before driving. The output of the first fatigue model is determined by the number of user actions and the order of user actions in the input sequence of actions. The sequence of the user behaviors in the behavior sequence is different, and the fatigue degrees before driving obtained by the first fatigue model are different. Specifically, reference may be made to the embodiment shown in fig. 8, and details thereof are not repeated here.
Optionally, when the trigger time of the electronic device 100 is before the departure time, the electronic device 100 may take the behavior the user performs most frequently between the trigger time and the departure time as the last behavior of the behavior sequence. For example, suppose the electronic device 100 acquires the user's behavior sequence at the trigger time as <exercise, sitting still>. If the electronic device 100 detects that, in a previous period (e.g., the previous month), the user most often slept between the trigger time and the departure time, the electronic device 100 may obtain the behavior sequence <exercise, sitting still, sleeping>. If the electronic device 100 detects that, in the previous period, the user most often sat still between the trigger time and the departure time, the electronic device 100 may obtain the behavior sequence <exercise, sitting still>. Alternatively, the electronic device 100 may also use the original behavior sequence directly.
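A small sketch of appending the most frequent historical behavior; the history layout and function name are assumptions.

```python
from collections import Counter

def extend_sequence(sequence, historical_behaviors):
    """Append the behavior the user most often performed between the trigger
    time and the departure time in past records."""
    if not historical_behaviors:
        return sequence
    most_common, _ = Counter(historical_behaviors).most_common(1)[0]
    return sequence + [most_common]

# e.g. extend_sequence(["exercise", "sitting_still"],
#                      ["sleeping", "sleeping", "walking"])
# -> ["exercise", "sitting_still", "sleeping"]
```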
S1004, the electronic apparatus 100 obtains and displays driving advice based on the degree of fatigue before driving.
The electronic device 100 may obtain driving advice based on the pre-driving fatigue degree and display it. Specifically, the electronic device 100 may determine, based on the pre-driving fatigue degree, the historical pre-driving fatigue degree closest to the currently obtained one from among the stored one or more historical pre-driving fatigue degrees. Among the one or more historical pre-driving fatigue degrees, the closest one has the smallest absolute difference from the currently obtained pre-driving fatigue degree.
Optionally, the electronic device 100 may also determine the closest historical pre-driving fatigue degree based on the departure time or the trigger time: among the one or more historical pre-driving fatigue degrees, the closest one is the one whose associated departure or trigger time is nearest to the current departure time or trigger time.
The electronic device 100 may obtain the one or more final fatigue degrees corresponding to the closest historical pre-driving fatigue degree, and determine among them the final fatigue degree that reached heavy fatigue earliest. The electronic device 100 may take the driving duration corresponding to that final fatigue degree as the recommended driving duration.
Optionally, the electronic device 100 may obtain the one or more in-driving fatigue degrees corresponding to the closest historical pre-driving fatigue degree, and sequentially calculate one or more final fatigue degrees based on those in-driving fatigue degrees and the currently obtained pre-driving fatigue degree. The electronic device 100 may then determine, from these final fatigue degree values, the one that reaches heavy fatigue earliest, acquire the driving duration of the in-driving fatigue degree corresponding to that final fatigue degree, and take that driving duration as the recommended driving duration.
The electronic device 100 may display driving advice including the recommended driving time period. In this way, the electronic device 100 may prompt the user for the longest driving time that can be continuously driven when the user has not yet started driving or the driving duration does not exceed the preset initial time (e.g., 10 minutes), so as to improve the fatigue driving problem of the user.
Alternatively, the electronic device 100 may obtain the driving advice based on the predicted driving duration, the departure time, and the trigger time. When the electronic device 100 determines, based on the predicted driving duration, that the user may experience light or moderate fatigue during driving: if the trigger time is earlier than the departure time and the time difference between them is greater than a time threshold (e.g., 30 minutes), the driving advice may include a travel prompt used to prompt the user to rest for a period of time; if the trigger time is earlier than the departure time and the time difference between them is less than or equal to the time threshold (e.g., 30 minutes), the driving advice may include a travel prompt used to prompt the user to prepare a refreshing beverage, or the like. When the electronic device 100 determines, based on the predicted driving duration, that the user may become heavily fatigued during driving, the driving advice may include a travel prompt used to prompt the user to travel by other means (e.g., public transport or a designated driver service), in which case the recommended driving duration is zero hours.
In one possible implementation, the electronic device 100 may perform only steps S1001 to S1004. In this way, the electronic device 100 may obtain the recommended driving time before the user drives the vehicle to travel, so as to avoid fatigue driving.
S1005, the electronic apparatus 100 determines whether the user is in a driving state.
The electronic device 100 may determine whether the user is in a driving state after acquiring the behavior data. Specifically, the electronic device 100 may determine whether the user is driving at intervals of a preset determination time. For example, the electronic device 100 may determine whether the user is driving by acquiring the speed, acceleration, rotation of the steering wheel, etc. of the vehicle through the electronic device 600. For another example, the electronic device 100 may acquire the location information of the user through a sensor, and determine whether the user is driving.
Optionally, when the trigger time acquired by the electronic device 100 is earlier than the departure time, the electronic device 100 may determine whether the user is in the driving state at the departure time. It is understood that, because the determined departure time may deviate from the actual one, the electronic device 100 may instead determine, at every preset determination interval within a period that includes the departure time, whether the user is in the driving state.
When the electronic apparatus 100 determines that the user is in the driving state, step S1006 may be performed.
When the electronic device 100 determines that the user is not in the driving state, it determines whether the user is in the driving state again at intervals of a preset determination time.
S1006, the electronic device 100 acquires the on-vehicle travel data and the physical state data. The physical state data include stable data and fluctuating data.
The electronic device 100 may receive user data transmitted by the electronic device 500, and obtain physical state data through a preprocessing operation based on the user data. The electronic device 100 may also receive driving data sent by the electronic device 600, and obtain on-vehicle driving data through a preprocessing operation based on the driving data. Specifically, reference may be made to the embodiment shown in fig. 9, and details thereof are not repeated here.
S1007, the electronic apparatus 100 determines the second fatigue model based on the stable data in the physical status data.
In one possible implementation, the server 300 stores the physical state data, on-vehicle travel data, and in-driving fatigue degrees of a plurality of users. The server 300 may classify users into different types based on the stable data in the physical state data. In some embodiments, the server 300 may classify users by age range, weight range, height range, and gender. For example, the server 300 may classify users aged 20-35, weighing 60-70 kg, 170-180 cm tall, and male as one type. The server 300 may train a second fatigue model based on the physical state data, on-vehicle travel data, and in-driving fatigue degrees of the users of this type. In this way, the server 300 may obtain multiple user types and their corresponding second fatigue models. The electronic device 100 may determine which user type the user belongs to based on the user's physical state data, and download the second fatigue model corresponding to that user type from the server.
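A sketch of the user bucketing that model selection could rest on; the bucket boundaries below are illustrative only and not the patent's classification.

```python
def user_type(age, weight_kg, height_cm, gender):
    """Bucket a user by stable data so the matching second fatigue model
    can be looked up and downloaded."""
    age_band = (age // 15) * 15              # e.g. bands of 15 years
    weight_band = (int(weight_kg) // 10) * 10
    height_band = (int(height_cm) // 10) * 10
    return (age_band, weight_band, height_band, gender)

# The server trains one second fatigue model per bucket from that bucket's
# physical state data, on-vehicle travel data, and in-driving fatigue
# degrees; the device downloads the model whose bucket matches
# user_type(...) and fine-tunes it on locally stored history.
```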
Further, after the electronic device 100 acquires the second fatigue model, it may train the second fatigue model based on the stored physical state data of the user, the on-vehicle driving data, and the fatigue degree in driving, and store the trained second fatigue model. The electronic device 100 may calculate the in-driving fatigue level of the user using the trained second fatigue model. That is, the electronic device 100 may train to obtain the second fatigue model based on the historical user physical state data, the historical on-vehicle travel data, and the historical degree of fatigue in driving.
In another possible implementation, the electronic device 100 may train to obtain the second fatigue model directly based on historical user physical state data, historical on-vehicle travel data, and historical in-driving fatigue levels.
S1008, the electronic device 100 obtains the in-driving fatigue degree by using the on-vehicle travel data and the fluctuating data as inputs of the second fatigue model.
The model calculation module may use the fluctuating data in the physical state data and the on-vehicle travel data as the input of the second fatigue model to obtain the in-driving fatigue degree. The second fatigue model can process input data without a time-series relationship to obtain an output result.
S1009, the electronic apparatus 100 obtains the final fatigue degree based on the pre-driving fatigue degree and the in-driving fatigue degree.
The electronic device 100 may weight and sum the pre-driving fatigue degree and the in-driving fatigue degree to obtain the final fatigue degree. The weights of the pre-driving and in-driving fatigue degrees are both greater than zero, and their sum equals 1.
Optionally, the electronic device 100 may determine weights of the fatigue degree before driving and the fatigue degree during driving when calculating the final fatigue degree based on the driving duration of the user. For example, the electronic apparatus 100 may increase the weight of the fatigue degree in driving and decrease the weight of the fatigue degree before driving as the driving duration increases.
S1010, the electronic apparatus 100 obtains and displays driving advice based on the final fatigue degree.
The electronic device 100 may obtain driving advice based on the final fatigue degree and display it. The driving advice may be used to prompt the user when they are fatigued, and may optionally further include a recommended driving duration. For example, when the driving advice determination module determines that the user is lightly or moderately fatigued, the driving advice may include the recommended driving duration, and may further include awake prompt information used to remind the user to lower the temperature in the vehicle, drink a refreshing beverage, play refreshing music, and the like. Alternatively, the electronic device 100 may directly notify the vehicle air conditioner to lower the in-vehicle temperature, and/or notify the vehicle audio system to play refreshing music.
When the driving advice determination module determines that the user is severely tired, the driving advice may include parking advice information that may be used to prompt the user to stop for a rest as soon as possible.
Alternatively, when the electronic device 100 determines that the user is severely tired, it may also display the parking position nearest to the user's current location, and display navigation information from the current location to the parking position.
Alternatively, the electronic device 100 may send the driving advice to the electronic device 600, and the electronic device 600 may display the driving advice. Further alternatively, the electronic device 100 may also send navigation information to the electronic device 600, which the electronic device 600 may display.
S1011, the electronic apparatus 100 determines whether the user is in a driving state.
After the electronic device 100 performs step S1010, it may determine whether the user is still driving at a preset determination time interval. For example, the electronic device 100 may determine whether the user is in a driving state through the speed and acceleration of the vehicle. When the electronic device 100 determines that the user is still in the driving state, it may execute steps S1006 to S1011. When the electronic device 100 determines that the user is not in the driving state, it may stop executing the fatigue detection flow (i.e., steps S1006 to S1011).
Alternatively, the electronic device 100 may adjust the preset determination time based on the driving duration: the longer the driving duration, the shorter the preset determination time, where the value of the preset determination time is greater than zero.
Alternatively, the above steps S1002 to S1004 and S1007 to S1009 may be performed by the server 300.
In one possible implementation, the electronic device 100 may determine whether the user is in a driving state every preset determination time. After determining that the user is in a driving state, it may acquire the behavior data of the user within a preset time before the moment at which the driving state was determined. The electronic device 100 may derive the pre-driving fatigue degree based on the behavior data. After that, the electronic device 100 may directly perform steps S1006 to S1011. In this way, the electronic device 100 assesses the fatigue degree of the user only while driving behavior is actually occurring, which helps avoid fatigued driving.
In one possible implementation, the electronic device 100 may determine the in-driving fatigue degree of the user directly based on the physical state data of the user. In some embodiments, the electronic device 100 may determine the in-driving fatigue degree through the second fatigue model based on the physical state data of the user. The second fatigue model may be trained based on the historical physical state data of the user, or may be downloaded from the server 300 based on the physical state data of the user. Specifically, for the step of the electronic device 100 obtaining the second fatigue model from the server 300, reference may be made to the embodiment shown in step S1007, which is not described herein again.
Fig. 11 and 12 exemplarily show two application scenarios of the detection method.
After the electronic device 100 obtains the travel information of the user, the electronic device 100 may obtain the behavior data of the user, and obtain the fatigue degree before driving based on the behavior data. The electronic device 100 may obtain driving advice based on travel information and the degree of fatigue before driving.
As shown in fig. 11, fig. 11 illustrates an indoor environment in which a user is using the electronic device 100. The electronic device 100 may acquire travel information of the user based on the ticket information of the user. For example, the electronic device 100 detects that the departure time of the user's flight is "13:30" and that the boarding location is "Shenzhen Baoan airport T3". The travel information acquired by the electronic device 100 includes an estimated driving duration from the current location to the boarding location, a departure time, and an arrival time. Here, if the electronic device 100 obtains an estimated driving duration of 60 minutes, and detects that the airline requires arrival at least half an hour in advance, the electronic device 100 may determine the arrival time to be "13:00" and the departure time to be "12:00". The electronic device 100 may then set the trigger time to "11:00" and obtain behavior data of the user during "9:00-11:00"; for example, the electronic device 100 may obtain the behavior data through the electronic device 500. The electronic device 100 may derive a pre-driving fatigue degree based on the user behavior data and the first fatigue model, and may further derive driving advice based on the pre-driving fatigue degree. Here, if the electronic device 100 determines that the user would be lightly or moderately fatigued during driving, the electronic device 100 may display driving advice including travel prompt information. The travel prompt information can be one or more of text, picture, and voice prompts. Here, the travel prompt information may be a text prompt: "Hello, according to your flight information, you may need to drive to the airport later. You may feel tired during the drive; we suggest you take a half-hour rest at noon before setting out." In this way, before driving, the user can reduce the fatigue degree according to the driving advice, which alleviates the problem of fatigued driving. A time calculation along these lines is sketched below.
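A minimal Java sketch of the time arithmetic in this scenario; the 30-minute check-in margin and 60-minute drive estimate are the example values above, while the one-hour lead between trigger time and departure time and the two-hour behavior-data window are assumptions read off the example.

```java
import java.time.Duration;
import java.time.LocalTime;

public final class TripPlanner {
    public static void main(String[] args) {
        LocalTime flightDeparture = LocalTime.of(13, 30); // departure time from ticket info
        Duration checkInMargin = Duration.ofMinutes(30);  // airline asks for arrival half an hour early
        Duration estimatedDrive = Duration.ofMinutes(60); // estimated driving duration

        LocalTime arrivalTime = flightDeparture.minus(checkInMargin);  // "13:00"
        LocalTime departureTime = arrivalTime.minus(estimatedDrive);   // "12:00"
        LocalTime triggerTime = departureTime.minusHours(1);           // "11:00" (assumed lead)
        LocalTime windowStart = triggerTime.minusHours(2);             // behavior data from "9:00"

        System.out.printf("arrive %s, depart %s, trigger %s, behavior window %s-%s%n",
                arrivalTime, departureTime, triggerTime, windowStart, triggerTime);
    }
}
```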
When the electronic device 100 detects that the user is driving, the electronic device 100 may acquire physical state data and on-vehicle running data of the user, and obtain the fatigue degree in driving based on the physical state data and the on-vehicle running data. The electronic apparatus 100 may obtain the final fatigue degree based on the pre-driving fatigue degree and the in-driving fatigue degree shown in fig. 11. The electronic device 100 may derive driving advice based on the final fatigue level.
As shown in fig. 12, fig. 12 illustrates an in-vehicle environment in which the electronic device 100 may establish a communication connection with the electronic device 600 while the user drives. The electronic device 100 may also obtain on-vehicle travel data through the electronic device 600. The electronic device 100 may obtain the final fatigue degree and the driving advice based on the on-vehicle travel data and the like; for details, reference may be made to the embodiment shown in fig. 10, which is not repeated here. Here, if the electronic device 100 determines that the user is lightly or moderately fatigued, the electronic device 100 may obtain driving advice including refreshing prompt information and send the driving advice to the electronic device 600. The electronic device 600 may display the driving advice including the refreshing prompt information, which can be one or more of text, picture, and voice prompts. Here, the refreshing prompt information may be a text prompt: "Hello, driver. You are currently fatigued; we suggest lowering the temperature in the car or playing refreshing music to avoid fatigued driving." In this way, during driving, the user can reduce the fatigue degree according to the driving advice, which alleviates the problem of fatigued driving.
In some application scenarios, hailing a taxi through a mobile phone has become a common way for many users to travel; for example, after drinking, when tired, or while their own vehicle is charging, users can call a taxi through ride-hailing software. However, a passenger may leave personal belongings in the car during the ride. If the passenger loses an article on the vehicle, the passenger needs to find the driver to retrieve it, which delays the journeys of both the passenger and the driver, and the chance of retrieving the article is not high. Accordingly, embodiments of the present application provide a detection method. The electronic device 100 may establish a bluetooth connection with the in-vehicle device 900. The in-vehicle device 900 may acquire an in-vehicle image before the passenger gets on (also referred to as the pre-boarding in-vehicle image) after detecting the passenger's door-opening operation. The in-vehicle device 900 may also acquire an in-vehicle image after the passenger gets off (also referred to as the post-alighting in-vehicle image) after detecting that the passenger gets off. The in-vehicle device 900 may determine, based on the pre-boarding and post-alighting in-vehicle images, whether the vehicle still contains the passenger's articles after the passenger gets off. When the in-vehicle device 900 determines that the passenger's articles are still in the vehicle, it may broadcast article omission prompt information, which may be used to prompt the driver that the passenger's articles remain on the vehicle. Meanwhile, the in-vehicle device 900 may also send the article omission prompt information to the electronic device 100, and the electronic device 100 may display this information after receiving it. The article omission prompt information is used to prompt the passenger that articles are left on the vehicle. In this way, passenger articles can be prevented from being left behind on the vehicle.
The electronic device 100 may be a mobile phone, a tablet computer, a wearable device, or the like. The hardware structure of the electronic device 100 may refer to the schematic structural diagram of the electronic device 100 shown in fig. 1, which is not described herein. The in-vehicle apparatus 900 may be used to acquire data of a vehicle, for example, the in-vehicle apparatus 900 may be used to detect opening and closing of a door, acquire an in-vehicle image, detect a speed, acceleration, and the like of the vehicle.
Next, a schematic structural diagram of the electronic device 100 provided in the embodiment of the present application will be described.
As shown in fig. 13, the electronic device 100 may include, but is not limited to, a bluetooth module 1302, an acceleration sensor 1301, and a processor 1303.
Among them, the acceleration sensor 1301 may be used to acquire acceleration of the electronic apparatus 100. The acceleration sensor 1301 may also be used to send acceleration to the processor 1303. The acceleration sensor 1301 may also send acceleration to the bluetooth module 1302.
The bluetooth module 1302 may be used to establish a guest bluetooth connection with the in-vehicle device 900. The guest bluetooth connection is a bluetooth connection between the electronic device 100 and the in-vehicle device 900 that completes pairing and key verification without user input. The electronic device 100 may implement the guest bluetooth connection by invoking related functions to configure the bluetooth function.
For example, the electronic device 100 may directly create a pairing request through the createBond() function and send the pairing request to the in-vehicle device 900. The electronic device 100 may also set the key to a specified value by calling the setPin() function, and cancel the key input through the cancelPairingUserInput() function. In this way, the electronic device 100 may create a bluetooth connection with the in-vehicle device 900 that requires no pairing interaction and no user-entered key (i.e., the guest bluetooth connection), as sketched below. It should also be noted that the user of the electronic device 100 may be referred to as a passenger in the following description.
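The following Java sketch shows how such a no-prompt pairing could be attempted on Android. createBond() and setPin(byte[]) are public BluetoothDevice methods, but cancelPairingUserInput() is a hidden method reached here via reflection, which may be unavailable or restricted on newer Android releases; permission handling (e.g., BLUETOOTH_CONNECT on Android 12+) is omitted.

```java
import android.bluetooth.BluetoothDevice;
import java.lang.reflect.Method;

public final class GuestBluetooth {
    private static final byte[] FIXED_PIN = "0000".getBytes(); // assumed fixed key value

    // Returns true if the no-prompt pairing sequence could be started.
    public static boolean startGuestPairing(BluetoothDevice device) {
        try {
            device.createBond();          // create the pairing request directly
            device.setPin(FIXED_PIN);     // set the key to a specified value
            // Cancel the on-screen key input so no user interaction is needed;
            // this method is hidden, hence the reflection.
            Method cancel = BluetoothDevice.class.getMethod("cancelPairingUserInput");
            cancel.invoke(device);
            return true;
        } catch (ReflectiveOperationException | SecurityException e) {
            return false;                 // fall back to normal, interactive pairing
        }
    }
}
```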
The bluetooth module 1302 may also be used to send data (e.g., the acceleration of the electronic device 100) to the in-vehicle device 900, and to receive the acceleration of the in-vehicle device 900.
The processor 1303 may be configured to determine whether to disconnect the guest bluetooth connection with the in-vehicle device 900. For example, the processor 1303 may be the processor 110 shown in fig. 1. That is, the processor 1303 may determine whether to disconnect the guest bluetooth connection based on the acceleration of the electronic device 100 and the acceleration of the in-vehicle device 900. When the processor 1303 determines that the two accelerations are the same, it may send a confirmation success signaling to the in-vehicle device 900 through the bluetooth module 1302, which may be used to instruct the in-vehicle device 900 not to disconnect the guest bluetooth connection. When the processor 1303 determines that the two accelerations are different, it may disconnect the bluetooth connection with the in-vehicle device 900, and may also send a confirmation failure signaling to the in-vehicle device 900 through the bluetooth module 1302, which may be used to instruct the in-vehicle device 900 to disconnect the guest bluetooth connection.
The acceleration of the electronic device 100 and the acceleration of the in-vehicle device 900 may deviate from each other because their sensors differ. Thus, the processor 1303 may determine that the two accelerations are the same when the absolute value of their difference does not exceed an acceleration deviation threshold. The acceleration deviation threshold may be a fixed value (e.g., 0.001 m/s²). Alternatively, the acceleration deviation threshold may be derived from the maximum error values of the sensors, which may be provided by the sensor manufacturers. The electronic device 100 and the in-vehicle device 900 each store the maximum error value of their own sensor and may exchange these values before transmitting accelerations. The acceleration deviation threshold may then be the sum of the maximum error value of the sensor of the electronic device 100 and that of the in-vehicle device 900. In this way, an applicable acceleration deviation threshold can be obtained for different electronic devices, as in the sketch below.
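A minimal Java sketch of this threshold test, assuming the two devices have already exchanged the maximum error values of their sensors:

```java
public final class AccelerationMatcher {
    static boolean sameAcceleration(double phoneAccel, double carAccel,
                                    double phoneMaxError, double carMaxError) {
        // Threshold is the sum of both sensors' maximum error values.
        double threshold = phoneMaxError + carMaxError;
        return Math.abs(phoneAccel - carAccel) <= threshold;
    }

    public static void main(String[] args) {
        // 1.005 m/s^2 vs 1.0054 m/s^2 with 0.0005 m/s^2 maximum error per sensor.
        System.out.println(sameAcceleration(1.005, 1.0054, 0.0005, 0.0005)); // true
    }
}
```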
Next, a schematic structural diagram of a vehicle device 900 according to an embodiment of the present application will be described.
As shown in fig. 14, the vehicle device 900 includes, but is not limited to, an acceleration sensor 1401, a bluetooth module 1402, a camera 1403, and a processor 1404.
Among other things, the acceleration sensor 1401 may be used to acquire the acceleration of the in-vehicle device 900. The acceleration sensor 1401 may also be used to send the acceleration to the processor 1404, and may also send the acceleration to the bluetooth module 1402.
Bluetooth module 1402 may be used to establish a guest bluetooth connection with electronic device 100. The bluetooth module 1402 may also be configured to receive data sent by the electronic device 100 (e.g., acceleration of the electronic device 100, confirmation success signaling, confirmation failure signaling, item omission prompt, etc.). The bluetooth module 1402 may also be used to send data of the in-vehicle device 900 (e.g., acceleration of the in-vehicle device 900) to the electronic device 100.
The camera 1403 may be used to acquire images of the interior of the vehicle. The in-vehicle images acquired by the camera 1403 include the pre-boarding in-vehicle image and the post-alighting in-vehicle image.
The processor 1404 may be used to determine whether the passenger's articles remain in the vehicle based on the in-vehicle images. That is, the processor 1404 may acquire the pre-boarding in-vehicle image through the camera 1403 after detecting the passenger's boarding operation, and acquire the post-alighting in-vehicle image through the camera 1403 after detecting the passenger's alighting operation. The processor 1404 may detect the boarding and alighting operations from images acquired by the camera through an image recognition algorithm (e.g., a convolutional neural network algorithm). The processor 1404 may determine the information of the articles in the vehicle before boarding from the pre-boarding in-vehicle image, and the information of the articles in the vehicle after alighting from the post-alighting in-vehicle image. The processor 1404 may compare the two sets of article information to determine whether the articles in the vehicle after alighting are the same as those before boarding. If they are the same, the processor 1404 may determine that no article of the passenger is left in the vehicle. If they are different, the processor 1404 may prompt the driver that something is left on the vehicle. For example, the processor 1404 may instruct the in-vehicle audio to broadcast first missing prompt information, which may prompt the driver that an article of the passenger remains on the vehicle. Alternatively, the processor 1404 may instruct the vehicle's center-control display to display the first missing prompt information. The processor 1404 may also send article omission indication information to the electronic device 100 via the bluetooth module 1402. The article omission indication information may be used to instruct the electronic device 100 to display second missing prompt information, which may be used to remind the passenger that an article is left in the vehicle.
In some embodiments, the in-vehicle apparatus 900 further includes a door sensor and a pressure sensor. Wherein the door sensor may be used to detect an operation of opening the door by the passenger. The pressure sensor may be used to detect whether a passenger is in the seat. In this way, the vehicle apparatus 900 can detect the getting-on operation and the getting-off operation of the passenger through the door sensor and the pressure sensor.
A set of interface schematic diagrams provided by embodiments of the present application are described next.
For example, as shown in fig. 15A, the electronic device 100 may display a desktop 1501. The desktop 1501 may include a plurality of application icons, e.g., a taxi taking application icon 1502. The taxi taking application icon 1502 may be used to trigger display of an interface of the taxi taking application (e.g., the taxi taking application interface 1510 shown in fig. 15B). The taxi taking application may be used to send the passenger's departure location and destination point to the driver, and to send driver information (location information, license plate number, vehicle color, etc.) to the passenger. A status bar may also be displayed above the desktop 1501, in which a bluetooth icon may be displayed. The bluetooth icon indicates that the electronic device 100 has its bluetooth function enabled.
The electronic device 100 may receive an input from the passenger for the taxi application icon 1502, and in response thereto, display a taxi application interface 1510 as shown in fig. 15B.
As shown in fig. 15B, the taxi-application interface 1510 may include a text box 1511, a text box 1512, and a call vehicle control 1513. Wherein the text box 1511 may be used to obtain and display the user's departure location. Text box 1512 may be used to obtain and display a destination point for the user. The call vehicle control 1513 may be used to send the departure location and destination point to the driver's electronic device (e.g., the in-vehicle device 900). For example, text box 1511 may display a departure location "AA street", and text box 1512 may display a destination location "BB building".
The electronic device 100 may, upon receiving an input from the passenger for the call vehicle control 1513, display the taxi taking application interface 1520 shown in fig. 15C in response to the input. Meanwhile, the electronic device 100 may transmit the departure location and destination point to the in-vehicle device 900. When the in-vehicle device 900 receives the departure location and destination point from the electronic device 100, it may transmit vehicle information (e.g., vehicle position information, license plate number, driver name, vehicle color, etc.) to the electronic device 100. Upon receiving the vehicle information, the electronic device 100 may display the taxi taking application interface 1530 shown in fig. 15D.
As shown in fig. 15D, the taxi application interface 1530 may include a vehicle information field 1531. The vehicle information field 1531 may be used to display information of a vehicle. The vehicle information bar 1531 may also be used to display the time when a vehicle with a license plate number "A123" arrives at the departure location "AA street". Optionally, the taxi application interface 1530 may also include a map animation that may be used to display the location of the vehicle and the user.
The electronic device 100 may turn on the guest bluetooth function and broadcast a guest bluetooth connection request when receiving the vehicle information. The guest bluetooth function may be used to establish a guest bluetooth connection between the electronic device 100 and the in-vehicle device 900. The guest bluetooth connection may be used to transmit acceleration information between the electronic device 100 and the in-vehicle device 900, and may also be used by the in-vehicle device 900 to send the electronic device 100 prompt information that an article remains on the vehicle. It should be noted that, by configuring the bluetooth function, the electronic device 100 may perform the key setting, cancel the key information input, and directly create the pairing request. In this way, the electronic device 100 may create a bluetooth connection with the in-vehicle device 900 that requires no pairing interaction and no user-entered key (i.e., the guest bluetooth connection).
The in-vehicle device 900 may acquire the pre-boarding in-vehicle image through the camera when detecting the passenger's door-opening operation. The in-vehicle device 900 may detect the door-opening operation through a door sensor, or recognize it through images captured by the camera. The in-vehicle device 900 may also turn on the guest bluetooth function after detecting that the passenger has boarded; it can identify the passenger through images collected by the camera and determine that the passenger has boarded, or, alternatively, determine this through the pressure sensor. After turning on the guest bluetooth function, the in-vehicle device 900 may receive the guest bluetooth connection request of the electronic device 100, and after receiving it, may send a guest bluetooth connection response to the electronic device 100. After the electronic device 100 receives the guest bluetooth connection response, the electronic device 100 and the in-vehicle device 900 establish the guest bluetooth connection.
The electronic device 100 and the in-car device 900 may exchange their respective accelerations via a guest bluetooth connection. The acceleration may be used to determine whether the electronic device 100 and the in-vehicle device 900 are in the same vehicle. If the electronic device 100 and the vehicle device 900 determine that the acceleration of the electronic device 100 and the acceleration of the vehicle device 900 are different, that is, not in the same vehicle, the electronic device 100 and the vehicle device 900 may disconnect the bluetooth connection of the visitor.
In one possible implementation, after the electronic device 100 and the in-vehicle device 900 determine that their accelerations are different, the electronic device 100 may record identification information of the in-vehicle device 900 (for example, the bluetooth device name of the in-vehicle device 900). During a preset access prohibition time (for example, within 1 hour), when the electronic device 100 determines, based on the identification information, that a device requesting a guest bluetooth connection is that in-vehicle device 900, it may refuse to establish the connection. In this way, the electronic device 100 records the identifiers of devices not in the same vehicle, avoids reconnecting to the wrong device, and improves the likelihood of establishing the guest bluetooth connection with a device that is in the same vehicle. It may be appreciated that the electronic device 100 may delete the identification information of the in-vehicle device 900 after the preset access prohibition time elapses. A sketch of such a timed record follows.
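A minimal Java sketch of such a timed record, assuming devices are identified by their bluetooth device names and using the one-hour prohibition time from the example:

```java
import java.util.HashMap;
import java.util.Map;

public final class GuestDenylist {
    private static final long PROHIBIT_MS = 60 * 60 * 1000L; // preset access prohibition: 1 hour
    private final Map<String, Long> rejectedAt = new HashMap<>();

    // Record a device whose acceleration did not match ours.
    public void reject(String deviceName) {
        rejectedAt.put(deviceName, System.currentTimeMillis());
    }

    // True if a guest connection to this device should still be refused.
    public boolean isProhibited(String deviceName) {
        Long t = rejectedAt.get(deviceName);
        if (t == null) return false;
        if (System.currentTimeMillis() - t > PROHIBIT_MS) {
            rejectedAt.remove(deviceName); // delete the record once the prohibition time elapses
            return false;
        }
        return true;
    }
}
```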
If the electronic device 100 and the in-vehicle device 900 determine that their accelerations are the same, that is, that they are in the same vehicle, they may keep the guest bluetooth connection. The following description of the embodiments of the present application assumes that the electronic device 100 and the in-vehicle device 900 are in the same car.
When detecting the passenger's door-opening operation, the in-vehicle device 900 may be triggered to determine whether the passenger is getting off, and to acquire the post-alighting in-vehicle image through the camera when it determines that the passenger has gotten off. The in-vehicle device 900 may detect the door-opening operation through a door sensor, or recognize it through images captured by the camera. In some embodiments, the in-vehicle device 900 may acquire an in-vehicle image through the camera after detecting the door-opening operation and determine, based on that image, whether the passenger has gotten off. When the in-vehicle device 900 determines from the image that the passenger has gotten off, the image acquired after the passenger got off may be used as the post-alighting in-vehicle image. For example, the in-vehicle device 900 may determine that the passenger has gotten off when it recognizes that no passenger is present in the seating area of the image. In other embodiments, the in-vehicle device 900 may determine whether the passenger has left the seat through a pressure sensor at the seat, and acquire the post-alighting in-vehicle image through the in-vehicle camera when it determines that the passenger has left the seat. In other embodiments, the in-vehicle device 900 may determine whether the passenger has gotten off through both the camera and the pressure sensor. For example, after determining through the pressure sensor that the passenger has left the seat, the in-vehicle device 900 may determine from images acquired by the camera whether the passenger has left the seating area. In this way, the post-alighting in-vehicle image acquired by the in-vehicle device 900 does not include the passenger, making it easier to recognize the items in the vehicle.
After acquiring the post-alighting in-vehicle image, the in-vehicle device 900 may determine whether the passenger's articles remain on the vehicle based on the pre-boarding and post-alighting in-vehicle images. The in-vehicle device 900 may recognize the articles in both images through an image recognition algorithm, compare whether they are the same, and thereby determine whether an article of the passenger remains in the vehicle. When the in-vehicle device 900 determines that the articles in the vehicle after alighting differ from those before boarding, it may determine that an article of the passenger remains in the vehicle, and may then broadcast the first missing prompt information through the in-vehicle audio. For example, the first missing prompt information may be: "A passenger's belonging was left in the car; please remind the passenger to retrieve it."
The in-car apparatus 900 may also transmit item omission indication information to the electronic apparatus 100. After receiving the article omission indication information, the electronic device 100 may display a second omission indication information.
For example, the electronic device 100 may display a prompt box 1541 as shown in fig. 15E after receiving the article omission indication information. The prompt box 1541 may include the second missing prompt information, which may be displayed in the form of text, animation, pictures, and the like. For example, the second missing prompt information may be a text prompt: "An item of yours was left in the car. The driver has not yet driven away from the drop-off location; please retrieve it as soon as possible." Alternatively, the electronic device 100 may display the prompt box 1541 on the taxi taking application interface 1540 shown in fig. 15E, and the taxi taking application interface 1540 may be used to display the passenger's fare.
It should be noted that the electronic device 100 may display the second missing prompt information not only in text form but also by voice broadcast. Further, the electronic device 100 may prompt the user to view the second missing prompt information by vibrating.
Further, when the passenger's door-closing operation is detected, the in-vehicle device 900 may determine again, by the methods described in the above embodiments (e.g., the pressure sensor), whether the passenger has gotten off. Only when the in-vehicle device 900 determines that the passenger has gotten off does it broadcast the first missing prompt information and send the article omission indication information to the electronic device 100. In this way, falsely notifying the passenger that an article is left in the vehicle can be avoided in scenarios where the passenger leaves the seat only temporarily.
The following describes a flow chart of a detection method provided in the embodiments of the present application.
Illustratively, as shown in FIG. 16, the method includes:
S1601, the electronic device 100 receives an input by the passenger for the first application.
The first application may be a taxi taking application (e.g., the taxi taking application shown in fig. 15A). The first application may be configured to receive the passenger's input and obtain taxi taking information from it. The taxi taking information may include the departure location and the destination point. The first application may also be used to send the passenger's taxi taking information to the driver.
The input for the first application may be an input for an icon of the first application (e.g., the input for the taxi taking application icon 1502 shown in fig. 15A), or an input for a taxi taking control on a taxi taking page provided by the first application.
The electronic device 100 may broadcast a guest bluetooth connection request to nearby electronic devices after receiving the passenger's input for the first application.
Alternatively, after receiving the user's input for the icon of the first application, the electronic device 100 may broadcast the guest bluetooth connection request at intervals of a preset time (e.g., 2 minutes).
S1602, the in-vehicle device 900 detects a boarding operation of the passenger and acquires the in-vehicle image before the passenger boards.
The in-vehicle device 900 may acquire an in-vehicle image (also referred to as the pre-boarding in-vehicle image) through the camera when the passenger's boarding operation is detected. The in-vehicle device 900 may also identify the article information in the pre-boarding in-vehicle image through an image recognition algorithm.
The passenger's boarding operation may be the passenger's operation of opening the door, a braking operation by the driver, or the like. For example, the in-vehicle device 900 may detect the boarding operation through a door sensor. For another example, it may detect the boarding operation through an acceleration sensor, or through images acquired by the in-vehicle camera.
For example, fig. 17A shows a pre-boarding in-vehicle image acquired by the in-vehicle device 900. When the passenger has opened the door but has not yet boarded, the interior of the vehicle contains only the driver's items. From the pre-boarding in-vehicle image, the in-vehicle device 900 may obtain the in-vehicle item list {<bottle, 1>}, where "bottle" is the identification of the item and 1 is the number of such items. The pre-boarding in-vehicle image and the obtained item list shown in fig. 17A are only examples and do not specifically limit the pre-boarding in-vehicle images acquired in practical applications. For example, the identification of an item in the item list may also be labeled "item A". A comparison of such lists is sketched below.
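A minimal Java sketch of the list comparison, where item identifiers such as "bottle" stand in for the labels an image recognition algorithm would emit and the count-based diff is an assumption:

```java
import java.util.HashMap;
import java.util.Map;

public final class LeftoverItemChecker {
    // Items present after alighting that were not there before boarding.
    static Map<String, Integer> leftoverItems(Map<String, Integer> before,
                                              Map<String, Integer> after) {
        Map<String, Integer> leftover = new HashMap<>();
        after.forEach((item, count) -> {
            int extra = count - before.getOrDefault(item, 0);
            if (extra > 0) leftover.put(item, extra);
        });
        return leftover;
    }

    public static void main(String[] args) {
        Map<String, Integer> before = Map.of("bottle", 1);          // {<bottle, 1>}
        Map<String, Integer> after = Map.of("bottle", 1, "bag", 1); // a bag was left behind
        System.out.println(leftoverItems(before, after));           // {bag=1}
    }
}
```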
S1603, the vehicle device 900 detects a sitting operation of the passenger, and turns on the guest bluetooth function.
The in-vehicle device 900 may turn on the guest bluetooth function upon detecting the passenger's sitting operation (i.e., detecting that the passenger is seated in the car), and receive the guest bluetooth connection request transmitted by the electronic device 100.
The in-vehicle device 900 may detect the passenger's sitting operation through a pressure sensor, the in-vehicle camera, or the like. In this way, the in-vehicle device 900 can avoid mistaking a scene in which the driver briefly boards for a scene in which a passenger boards and sits down.
Alternatively, the vehicle apparatus 900 may also take the door closing operation as the passenger sitting operation when the door closing operation of the passenger is detected.
Alternatively, the vehicle device 900 may directly turn on the visitor bluetooth function after detecting the boarding operation of the passenger.
S1604, the electronic device 100 sends a guest bluetooth connection request to the in-car device 900.
The electronic device 100 may broadcast a guest bluetooth connection request after receiving the passenger's input for the first application. The in-car device 900 may receive the guest bluetooth connection request broadcasted by the electronic device 100 after turning on the guest bluetooth function.
It should be noted that the communication connection between the electronic device 100 and the in-vehicle device 900 is not limited to the above bluetooth connection and may be another communication connection, for example, Wi-Fi Direct. The present application is not limited in this regard.
S1605, the in-vehicle apparatus 900 transmits a guest bluetooth connection response to the electronic apparatus 100.
After receiving the visitor bluetooth connection request sent by the electronic device 100, the vehicle device 900 sends a visitor bluetooth connection response to the electronic device 100, and establishes a visitor bluetooth connection with the electronic device 100. It will be appreciated that the electronic device 100 receives the guest bluetooth connection response and establishes a guest bluetooth connection with the in-vehicle device 900.
In one possible implementation, to increase the likelihood that the electronic device 100 establishes the guest bluetooth connection with the target in-vehicle device, i.e., the in-vehicle device on the same vehicle that the passenger boards, the electronic device 100 may determine, among the received one or more guest bluetooth connection responses, the in-vehicle device with the strongest bluetooth signal and establish the guest bluetooth connection with that device. It will be appreciated that the stronger the bluetooth signal, the closer the in-vehicle device is to the electronic device 100; a selection along these lines is sketched below.
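A minimal Java sketch of this selection; the Candidate shape (device name plus RSSI in dBm) is an illustrative assumption rather than a structure defined by this application:

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

public final class StrongestResponder {
    record Candidate(String deviceName, int rssiDbm) {}

    // Higher (less negative) RSSI generally means a shorter distance.
    static Optional<Candidate> pickNearest(List<Candidate> responses) {
        return responses.stream().max(Comparator.comparingInt(Candidate::rssiDbm));
    }

    public static void main(String[] args) {
        var responses = List.of(new Candidate("car-A", -72), new Candidate("car-B", -48));
        System.out.println(pickNearest(responses)); // picks car-B, the stronger signal
    }
}
```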
In one possible implementation, to secure the data transferred between the electronic device 100 and the in-vehicle device 900 over the guest bluetooth connection, the guest bluetooth connection may only be used to transmit motion information requests, motion information, article omission indication information, and calibration information (e.g., the maximum error value of the sensor, the specified acquisition time point, and the time at which an acceleration was acquired). The motion information may include, but is not limited to, acceleration, velocity, and the like. That is, when the motion information is acceleration, the motion information request is an acceleration request.
In some embodiments, the electronic device 100 may send a guest bluetooth connection request including a specified header to the in-vehicle device 900. The in-car device 900 may also send a guest bluetooth connection response to the electronic device 100 that includes the specified header. Thereafter, the electronic device 100 and the in-vehicle device 900 may continue to transmit acceleration through the data packet including the specified header. Wherein, the data in the data packet is the acceleration after encryption. The encryption and decryption modes of the electronic device 100 and the vehicle device 900 are the same.
For example, the guest bluetooth connection request sent by the electronic device 100 is: 1001 0000, where 1001 is the specified packet header and 0000 is the data in the data packet; it is understood that the data may be any value, and 0000 is used here for illustration. After receiving the guest bluetooth connection request, the in-vehicle device 900 determines that the packet header is 1001 and replies a guest bluetooth connection response to the electronic device 100, for example: 1001 0000, where 1001 is the specified header and 0000 is the data. Thereafter, the electronic device 100 may send the acceleration to the in-vehicle device 900, for example: 1001 5001, where 1001 is the specified header and 5001 is the encrypted acceleration. When the encryption used by the electronic device 100 and the in-vehicle device 900 is to arrange the original data in reverse order, the in-vehicle device 900 may obtain an acceleration of 1.005 m/s² from 5001. It should be noted that the data packet structure and the encryption/decryption manner over the guest bluetooth connection are only examples and do not limit the embodiments of the present application. A toy encoding of this form is sketched below.
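A minimal Java sketch of this toy scheme, with the 1001 header and the reverse-order "encryption" exactly as in the example (it is illustrative only and offers no real protection):

```java
public final class GuestPacket {
    static final String HEADER = "1001"; // specified packet header

    // 1.005 m/s^2 -> "1005" -> reversed payload "5001".
    static String encode(double accel) {
        String plain = String.format("%04d", Math.round(accel * 1000));
        return HEADER + " " + new StringBuilder(plain).reverse();
    }

    static double decode(String packet) {
        String[] parts = packet.split(" ");
        if (!HEADER.equals(parts[0])) throw new IllegalArgumentException("bad header");
        String plain = new StringBuilder(parts[1]).reverse().toString();
        return Integer.parseInt(plain) / 1000.0;
    }

    public static void main(String[] args) {
        String pkt = encode(1.005);
        System.out.println(pkt);         // 1001 5001
        System.out.println(decode(pkt)); // 1.005
    }
}
```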
In other embodiments, the electronic device 100 may send the in-vehicle device 900 a guest bluetooth connection request including a specified packet header and a specified data segment. The specified packet header is a fixed data segment that both the electronic device 100 and the in-vehicle device 900 acquire from the server; the specified data segment may be a data segment of specified length randomly generated by the electronic device 100. After receiving the guest bluetooth connection request, the in-vehicle device 900 may encrypt the specified data segment based on an encryption algorithm and use the encrypted data segment as the packet header of the guest bluetooth connection response. The electronic device 100 may then establish the guest bluetooth connection with the in-vehicle device 900 after determining that the header of the guest bluetooth connection response is the encrypted data segment. Thereafter, both devices may use the encrypted data segment as the header of the data packets that carry acceleration; it is understood that the data in those packets is the encrypted acceleration. Note that the encryption and decryption algorithms in the electronic device 100 and the in-vehicle device 900 are the same.
For example, the guest bluetooth connection request sent by the electronic device 100 is: 1001 0000, where 1001 is the specified header and 0000 is the data in the packet. After receiving the request, the in-vehicle device 900 determines that the packet header is 1001 and replies a guest bluetooth connection response. When the encryption used by the two devices is to add 1 to the value of the original data, the in-vehicle device 900 obtains 0001 as the header of the guest bluetooth connection response, which may then be: 0001 0000, where 0001 is the header and 0000 is the data (any value is possible). Thereafter, the data packets generated by the electronic device 100 or the in-vehicle device 900 for transmitting acceleration carry the header 0001, with the encrypted acceleration as data. For example, when the acceleration is 1.005 m/s², the packet is 0001 1006. It should be noted that the data packet structure and the encryption/decryption manner over the guest bluetooth connection are only examples and do not limit the embodiments of the present application.
After the electronic device 100 and the in-vehicle device 900 establish the guest bluetooth connection, it can be verified whether the two devices are on the same vehicle; when they are, the in-vehicle device 900 can send the article omission indication information to the electronic device 100 through the guest bluetooth connection.
In one possible embodiment, the electronic device 100 and the in-vehicle device 900 are on the same vehicle when their motion states are the same. Specifically, the electronic device 100 may determine whether its motion state and that of the in-vehicle device 900 are the same through the motion information of the two devices. When the difference between the motion information of the electronic device 100 and that of the in-vehicle device 900 is smaller than a motion deviation threshold, the motion states of the two devices are determined to be the same. The motion deviation threshold may be preset, or may be derived from the error values of the sensors that acquire the motion information. The motion information may include, but is not limited to, acceleration, velocity, and the like. In the embodiments described in fig. 13-16, the motion information takes the form of acceleration, and the motion deviation threshold is the acceleration deviation threshold. For example, the electronic device 100 and the in-vehicle device 900 may determine whether their motion states are the same by performing steps S1606 to S1610. It is understood that, not limited to acceleration, the two devices may also determine that their motion states are the same based on their speeds.
S1606, the electronic device 100 transmits an acceleration request to the vehicle device 900.
After receiving the guest bluetooth connection response, the electronic device 100 may send an acceleration request to the in-car device 900. The acceleration request may be used to instruct the in-vehicle device 900 to send the acquired acceleration to the electronic device 100.
S1607, the vehicle apparatus 900 acquires the first acceleration of the vehicle apparatus 900 based on the acceleration request.
The in-vehicle device 900 may obtain the first acceleration of the in-vehicle device 900 after receiving the acceleration request.
S1608, the in-vehicle apparatus 900 transmits the first acceleration to the electronic apparatus 100.
S1609, the electronic apparatus 100 acquires a second acceleration of the electronic apparatus 100.
The electronic device 100 may obtain the second acceleration after sending the acceleration request to the in-vehicle device 900.
S1610, the electronic device 100 determines whether the first acceleration and the second acceleration are the same.
After obtaining the first acceleration and the second acceleration, the electronic device 100 may compare whether their values are the same. When the electronic device 100 determines that the first acceleration and the second acceleration are the same, step S1611 may be performed. When the electronic device 100 determines that they are different, it may send a confirmation failure signaling to the in-vehicle device 900 and may also disconnect the guest bluetooth connection with the in-vehicle device 900. It should be noted that, after disconnecting from an in-vehicle device that is not the target, the electronic device 100 may continue to broadcast the guest bluetooth connection request until it establishes the guest bluetooth connection with the target in-vehicle device.
In one possible implementation, after the electronic device 100 and the vehicle device 900 determine that the acceleration of the electronic device 100 and the acceleration of the vehicle device 900 are different, the electronic device 100 may record identification information of the vehicle device 900 (for example, a bluetooth device name of the vehicle device 900). During a preset access prohibition time (for example, within 1 hour), the electronic device 100 may not establish the guest bluetooth connection with the in-vehicle device 900 when it is determined that the device that establishes the guest bluetooth connection is the in-vehicle device 900 based on the identification information.
Further, when the electronic device 100 and the vehicle device 900 acquire acceleration, the time for acquiring the acceleration may be recorded. In this way, the accelerations acquired at the same time point can be compared, and the acceleration difference between the electronic device 100 and the vehicle-mounted device 900 due to the difference in the time points at which the accelerations are acquired can be avoided.
Optionally, the acceleration request may include a specified acquisition time point. Wherein the specified acquisition time is after the point in time at which the electronic device 100 sends the acceleration request. In this way, the electronic device 100 and the vehicle-mounted device 900 can acquire the acceleration at the designated acquisition time point, so that the determination result is more accurate.
In some embodiments, the times of the electronic device 100 and the in-vehicle device 900 are not synchronized. The electronic device 100 and the in-vehicle device 900 may also perform a time calibration prior to transmitting the acceleration. For example, the electronic device 100 and the in-car device 900 may synchronize time over a satellite or cellular network.
The acceleration of the electronic device 100 and the acceleration of the vehicle device 900 may deviate due to the difference between the sensors of the electronic device 100 and the vehicle device 900. Thus, the electronic device 100 may determine that the acceleration of the electronic device 100 and the acceleration of the in-vehicle device 900 are the same when the absolute value of the difference between the acceleration of the electronic device 100 and the acceleration of the in-vehicle device 900 does not exceed the acceleration deviation threshold.
Further, the electronic device 100 may send the acceleration request to the in-vehicle device 900 multiple times until the number of times the electronic device 100 sends the acceleration request reaches a preset number of times (e.g., 3 times). The electronic device 100 may determine whether the first acceleration and the second acceleration are the same after each time the first acceleration of the vehicle device 900 is acquired. When the electronic device 100 determines that the first acceleration and the second acceleration are the same and the number of times the electronic device 100 has transmitted the acceleration request is less than the preset number of times, the acceleration request may be transmitted again to the in-vehicle device 900 after being separated by a preset time period (for example, 1 minute). When the electronic device 100 determines that the first acceleration and the second acceleration are the same, and the number of acceleration requests that the electronic device 100 has transmitted is equal to a preset number of times, a confirmation success signaling may be transmitted to the in-vehicle device 900. When the electronic device 100 determines that the first acceleration is different from the second acceleration, the confirmation failure signaling is directly sent to the vehicle device 900, and the bluetooth connection of the visitor is disconnected.
Alternatively, the electronic device 100 may send a preset number (e.g., 3) of acceleration requests to the in-vehicle device 900. The electronic device 100 may determine that its acceleration is the same as that of the in-vehicle device 900 when the number of rounds in which the first and second accelerations match reaches a preset number-of-times threshold (e.g., 2), the value of which is less than or equal to the preset number of requests. Optionally, the time points of the acceleration requests are separated by a preset period.
That is, the electronic device 100 and/or the in-vehicle device 900 may determine that the motion states of the two devices are the same when the accelerations are determined to be the same N times in succession. Alternatively, the electronic device 100 and/or the in-vehicle device 900 may determine M times in succession whether the accelerations are the same, and determine that the motion states are the same when the accelerations match N or more times, as in the sketch below.
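A minimal Java sketch of this N-of-M rule, with M = 3 requests and a threshold of N = 2 matches as in the example; the list of sample pairs stands in for the real request/response exchange:

```java
import java.util.List;

public final class RepeatedConfirmation {
    static final int REQUESTS = 3;          // preset number of acceleration requests (M)
    static final int MATCH_THRESHOLD = 2;   // preset number-of-times threshold (N)
    static final double DEVIATION = 0.001;  // acceleration deviation threshold, m/s^2

    // Each pair holds (first acceleration, second acceleration) from one round.
    static boolean sameVehicle(List<double[]> samplePairs) {
        int matches = 0;
        for (int i = 0; i < Math.min(REQUESTS, samplePairs.size()); i++) {
            double[] p = samplePairs.get(i);
            if (Math.abs(p[0] - p[1]) <= DEVIATION) matches++;
        }
        return matches >= MATCH_THRESHOLD;
    }

    public static void main(String[] args) {
        var rounds = List.of(new double[]{1.005, 1.005},
                             new double[]{1.343, 1.349},  // one mismatched round is tolerated
                             new double[]{1.532, 1.532});
        System.out.println(sameVehicle(rounds)); // true: 2 of 3 rounds match
    }
}
```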
Further, the acceleration requests of the electronic device 100 are separated by a predetermined period of time. Specifically, after sending the 1 st acceleration request, the electronic device 100 may send the 2 nd acceleration request at intervals of a preset duration a, and send the 3 rd acceleration request at intervals of a preset duration B, where the value of the preset duration B is different from the value of the preset duration a. For example, the value of the preset time period a is 1 minute, and the value of the preset time period B is 2 minutes.
Alternatively, the in-vehicle device 900 may send a first acceleration list to the electronic device 100 after receiving the acceleration request, and the electronic device 100 may also obtain a second acceleration list. The electronic device 100 may determine whether the accelerations of the two devices are the same based on the first and second acceleration lists, each of which includes a plurality of accelerations. The electronic device 100 may compare the accelerations in the two lists in sequence and record the number of comparisons that match. Dividing the number of matching comparisons by the total number of comparisons yields the pass rate. When the pass rate reaches a preset pass threshold (e.g., 0.8), the electronic device 100 determines that the first and second acceleration lists are the same. It is understood that the electronic device 100 may send multiple acceleration requests to the in-vehicle device 900.
For example, when the first acceleration list is {1.005,1.343,1.532,1.793,1.935}, and the second acceleration list is {1.005,1.343,1.532,1.789,1.935}, the passing rate is 0.8, and it is determined that the first acceleration list and the second acceleration list are identical.
Further optionally, the first acceleration list and the second acceleration list may also include the acquisition time corresponding to each acceleration. The first acceleration list may be {<193532, 1.005>, <193537, 1.343>, <193542, 1.532>, ……, <193603, 1.935>}. In <193532, 1.005>, 193532 indicates that the acquisition time is "19:35:32", and 1.005 indicates that the acceleration acquired by the in-vehicle device 900 is 1.005 m/s². The electronic device 100 may compare only the accelerations whose acquisition times in the two lists are the same and calculate the pass rate over those. Alternatively, the acceleration request may include specified acquisition time points; for example, it may include a plurality of specified acquisition time points at which the electronic device 100 and the in-vehicle device 900 acquire acceleration. In an example, the acceleration request may include a start acquisition time point, an end acquisition time point, and an acquisition time interval, where the time difference between the start and end acquisition time points is an integer multiple of the acquisition time interval. The electronic device 100 and the in-vehicle device 900 may then acquire an acceleration at every acquisition time interval between the start and end acquisition time points, obtaining the acceleration lists. A pass-rate check of this kind is sketched below.
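A minimal Java sketch of the pass-rate check on timestamped lists, using "HHmmss" integers as in the <193532, 1.005> example and comparing only samples whose acquisition times coincide:

```java
import java.util.Map;

public final class AccelerationListMatcher {
    static final double DEVIATION = 0.001;     // acceleration deviation threshold, m/s^2
    static final double PASS_THRESHOLD = 0.8;  // preset pass threshold

    static boolean listsMatch(Map<Integer, Double> first, Map<Integer, Double> second) {
        int compared = 0, same = 0;
        for (Map.Entry<Integer, Double> e : first.entrySet()) {
            Double other = second.get(e.getKey()); // same acquisition time only
            if (other == null) continue;
            compared++;
            if (Math.abs(e.getValue() - other) <= DEVIATION) same++;
        }
        return compared > 0 && (double) same / compared >= PASS_THRESHOLD;
    }

    public static void main(String[] args) {
        Map<Integer, Double> car = Map.of(
                193532, 1.005, 193537, 1.343, 193542, 1.532, 193547, 1.793, 193552, 1.935);
        Map<Integer, Double> phone = Map.of(
                193532, 1.005, 193537, 1.343, 193542, 1.532, 193547, 1.789, 193552, 1.935);
        System.out.println(listsMatch(car, phone)); // pass rate 4/5 = 0.8 -> true
    }
}
```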
Alternatively, the above-described operations of sending the acceleration request and determining whether the accelerations of the electronic device 100 and the in-vehicle device 900 are the same may be performed by the in-vehicle device 900.
In one possible implementation, after establishing the guest Bluetooth connection, the electronic device 100 and the in-vehicle device 900 may send their accelerations to each other at preset intervals (i.e., the electronic device 100 sends the second acceleration to the in-vehicle device 900 at preset intervals, and the in-vehicle device 900 sends the first acceleration to the electronic device 100 at preset intervals). The electronic device 100 and the in-vehicle device 900 each determine whether the accelerations of the two devices are the same. When the electronic device 100 and the in-vehicle device 900 determine that the first acceleration and the second acceleration are the same a preset number of times in succession, or when, among a preset number of consecutive determinations, the number of determinations in which the first acceleration and the second acceleration are the same reaches a preset value, it is determined that the accelerations of the electronic device 100 and the in-vehicle device 900 are the same. Both the electronic device 100 and the in-vehicle device 900 may disconnect the guest Bluetooth connection when it is determined that the accelerations of the electronic device 100 and the in-vehicle device 900 are different.
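As a non-authoritative sketch of the two acceptance criteria above (the same a preset number of times in succession, or the same a preset number of times within a window of consecutive determinations), the following shows one way the determination results could be accumulated; the window sizes and thresholds are illustrative assumptions.

```python
from collections import deque

def consecutive_ok(results, streak=3):
    """True once the last `streak` determinations were all 'same'."""
    return len(results) >= streak and all(list(results)[-streak:])

def windowed_ok(results, window=5, required=4):
    """True once at least `required` of the last `window`
    determinations were 'same'."""
    recent = list(results)[-window:]
    return len(recent) == window and sum(recent) >= required

results = deque(maxlen=10)   # True = first and second acceleration matched
for matched in [True, True, False, True, True, True]:
    results.append(matched)

print(consecutive_ok(results))  # True: last 3 determinations matched
print(windowed_ok(results))     # True: 4 of the last 5 matched
```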
It should be noted that the various ways of determining whether the accelerations of the electronic device 100 and the in-vehicle device 900 are the same, described in the above embodiments, may be used in combination, which is not limited in this embodiment.
S1611, the electronic device 100 transmits a confirmation success signaling to the in-vehicle device 900.
When the electronic device 100 determines that the first acceleration and the second acceleration are the same, the electronic device 100 may determine that the accelerations of the electronic device 100 and the in-vehicle device 900 are the same, that is, that the electronic device 100 and the in-vehicle device 900 are in the same vehicle. The electronic device 100 may send a confirmation success signaling to the in-vehicle device 900. The confirmation success signaling may be used to instruct the in-vehicle device 900 not to disconnect the guest Bluetooth connection.
It should be noted that the above-mentioned step of determining whether the motion states of the electronic device 100 and the in-vehicle device 900 are the same may also be performed by the in-vehicle device 900. That is, the in-vehicle device 900 may receive the motion information of the electronic device 100 and determine, based on the motion information of the electronic device 100 and the motion information of the in-vehicle device 900, whether the motion states of the two devices are the same. When the in-vehicle device 900 determines, based on the motion information of the electronic device 100, that the motion states of the electronic device 100 and the in-vehicle device 900 are the same, it maintains the communication connection with the electronic device 100. Further, in this case the in-vehicle device 900 may send a confirmation success signaling to the electronic device 100, where the confirmation success signaling may be used to instruct the electronic device 100 to maintain the communication connection with the in-vehicle device 900. When the in-vehicle device 900 determines, based on the motion information of the electronic device 100, that the motion states of the electronic device 100 and the in-vehicle device 900 are different, it disconnects the communication connection with the electronic device 100. Further, in this case the in-vehicle device 900 may send a confirmation failure signaling to the electronic device 100, where the confirmation failure signaling may be used to instruct the electronic device 100 to disconnect the communication connection.
In one possible implementation, the electronic device 100 may obtain, through the server of the first application, the Bluetooth identifier of the target in-vehicle device (e.g., the in-vehicle device 900), and carry the Bluetooth identifier in the broadcast guest Bluetooth connection request. After receiving the guest Bluetooth connection request, the target in-vehicle device may send a guest Bluetooth connection response to the electronic device 100 when it determines that the Bluetooth identifier carried in the request is the same as its own Bluetooth identifier. After receiving the guest Bluetooth connection response of the target in-vehicle device, the electronic device 100 may establish a guest Bluetooth connection with the target in-vehicle device, and receive the article omission indication information through the guest Bluetooth connection.
In another possible implementation, the electronic device 100 may send the Bluetooth identifier of the electronic device 100 to the target in-vehicle device (e.g., the in-vehicle device 900) through the server of the first application. After receiving the guest Bluetooth connection request, the target in-vehicle device may send a guest Bluetooth connection response carrying the Bluetooth identifier of the electronic device 100 to the electronic device 100. The electronic device 100 may establish a guest Bluetooth connection with the target in-vehicle device that sent the response when it determines that the Bluetooth identifier carried in the response is the Bluetooth identifier of the electronic device 100. The electronic device 100 may then receive the article omission indication information through the guest Bluetooth connection.
S1612, the in-vehicle device 900 detects a get-off operation of the passenger, and acquires an in-vehicle image after the passenger gets off.
After detecting that the passenger has got off, the in-vehicle device 900 may acquire an in-vehicle image (also referred to as the in-vehicle image after alighting).
In some embodiments, when the in-vehicle device 900 includes a door sensor, the in-vehicle device 900 may, after detecting a door opening operation of the passenger through the door sensor, detect whether the passenger gets off through a pressure sensor and/or an in-vehicle camera. The in-vehicle device 900 may perform step S1613 after detecting that the passenger has got off.
Alternatively, when the in-vehicle device 900 does not include a door sensor, the in-vehicle device 900 may acquire an in-vehicle image a preset time (e.g., 1 ms) after receiving the confirmation success signaling, and determine whether the passenger has got off based on that image. That is, the in-vehicle device 900 may determine that the passenger has got off when it recognizes that the in-vehicle image does not include an image of the passenger. When the in-vehicle device 900 determines that the passenger has got off, step S1613 may be performed.
S1613, the in-vehicle device 900 determines whether there is a missing article based on the in-vehicle image before boarding and the in-vehicle image after alighting.
The in-vehicle device 900 may identify the article information in the in-vehicle image after alighting through an image recognition algorithm. The in-vehicle device 900 may compare whether the articles in the in-vehicle image before boarding and the articles in the in-vehicle image after alighting are the same. When the in-vehicle device 900 determines that they are the same, it determines that no article is missing. When the in-vehicle device 900 determines that they are different, it determines that there is a missing article (i.e., that a passenger's article remains in the vehicle).
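As a minimal sketch of this article comparison, assuming an upstream image recognition step has already produced a list of article labels for each image, the articles left behind can be obtained as the multiset difference of the two lists; the labels and the helper name are hypothetical.

```python
from collections import Counter

def missing_items(before, after):
    """Return articles present after alighting but absent before boarding.

    before / after: lists of article labels produced by an image
    recognition step, e.g. ["bottle", "bag"]. The labels and the
    recognition step itself are assumptions for illustration.
    """
    extra = Counter(after) - Counter(before)   # articles left behind
    return list(extra.elements())

# Before boarding, the vehicle only contains the driver's articles;
# after alighting, a bottle and a bag remain.
before = ["umbrella"]                  # hypothetical driver's article
after = ["umbrella", "bottle", "bag"]

left = missing_items(before, after)
print(left)        # ['bottle', 'bag']
print(bool(left))  # True: article omission detected
```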
For example, fig. 17B shows an in-vehicle image after alighting, acquired by the in-vehicle device 900. When the passenger has opened the door and got off, the interior of the vehicle includes not only the driver's articles but also the passenger's articles. The in-vehicle device 900 can obtain the in-vehicle article list {<bottle, 1>, <bag, 1>} from the in-vehicle image after alighting. The in-vehicle device 900 may determine that the article list of the in-vehicle image before boarding (see the embodiment shown in fig. 17A above) is different from the article list of the in-vehicle image after alighting, and determine that an article is missing.
Alternatively, the in-vehicle device 900 may directly compare whether the in-vehicle image before boarding and the in-vehicle image after alighting are the same through an image comparison method (e.g., pixel comparison). When the two images are the same, it determines that no article is missing; when they are different, it determines that there is a missing article.
When the in-vehicle device 900 determines that there is a missing article, step S1614 and step S1616 may be executed.
When the in-vehicle device 900 determines that no article is missing, it may disconnect the guest Bluetooth connection with the electronic device 100.
S1614, the in-vehicle device 900 transmits the article omission indication information to the electronic device 100.
The article omission indication information is used to instruct the electronic device 100 to execute step S1615.
Alternatively, the in-vehicle device 900 may, after determining that an article is missing, perform step S1614 upon detecting the door closing operation of the passenger. Alternatively, after determining that an article is missing and detecting the door closing operation of the passenger, the in-vehicle device 900 may acquire an in-vehicle image and perform step S1614 when it determines that the image does not include an image of the passenger. For a detailed description of detecting the door closing operation of the passenger, refer to the description of detecting the door opening operation of the passenger in step S1603, which is not repeated herein.
Optionally, the in-vehicle device 900 may send the in-vehicle images before boarding and after alighting to the electronic device 100 through the guest Bluetooth connection, and the electronic device 100 determines, based on the two images, whether a passenger's article remains in the vehicle.
S1615, the electronic device 100 displays the second omission prompt information.
After receiving the article omission indication information, the electronic device 100 may display the second omission prompt information. The second omission prompt information may be used to remind the passenger that an article is left in the vehicle. For example, the electronic device 100 may display a prompt box 1541 as shown in fig. 15E after receiving the article omission indication information.
Optionally, after receiving the article omission indication information, the electronic device 100 may remind the passenger that an article is left in the vehicle through one or more of displaying a text, vibrating, playing an animation, broadcasting a voice, and displaying a picture.
S1616, the in-vehicle device 900 broadcasts the first omission prompt information.
The first omission prompt information is used to prompt the driver that a passenger's article is left in the vehicle.
In one possible implementation, the in-vehicle device 900 may acquire the in-vehicle image before boarding when a door opening operation of the passenger is detected, and acquire the in-vehicle image after alighting after detecting that the passenger has got off. The in-vehicle device 900 may broadcast the first omission prompt information when it determines, based on the two images, that a passenger's article remains in the vehicle. In this way, the article omission reminder works even without establishing a guest Bluetooth connection with the electronic device 100.
In some application scenarios, electric vehicles have become a travel choice for many users due to their energy-saving and environmentally friendly characteristics. However, when the electric quantity of an electric vehicle is low, the user needs to search for information of each charging station by himself, and then screen out an appropriate charging station based on that information to charge the electric vehicle. Thus, charging the electric vehicle is troublesome and time-consuming for the user. Therefore, an embodiment of the present application provides a detection method: when the electronic device 100 detects a scene to be charged, it can acquire charging station information through the server 1000 and acquire charging car information through the in-vehicle device 900. The electronic device 100 may obtain charging service information based on the charging station information and the charging car information. The charging service information includes one or more charging station options, each corresponding to one charging station; the charging station indicated by a charging station option is one that includes a charging device usable by the in-vehicle device 900 and that the in-vehicle device 900 can reach before its electric quantity is exhausted. A charging station option includes a charging price, a charging duration, and the like. The one or more charging station options include a first charging station option. When the electronic device 100 receives a user input for the first charging station option, it may display navigation information to the first charging station. The electronic device 100 may also send a charging service reservation request to the server 1000. In this way, the user can quickly select and reach a usable charging station.
Further, after the server 1000 detects that the in-vehicle device 900 enters the first charging station, it may obtain the parking position information of the in-vehicle device 900. The parking position information may be used to indicate the parking area in which the in-vehicle device 900 is located. The server 1000 may also send a confirmation charging prompt to the electronic device 100, and the electronic device 100 may display a start charging control after receiving the prompt. The electronic device 100 may send a start charging request to the server 1000 upon receiving a user input for the start charging control. The server 1000 may send the parking position information to the charging device 1100 after receiving the start charging request. The charging device 1100 may arrive at the location of the in-vehicle device 900 based on the parking position information and charge the in-vehicle device 900. When the charging device 1100 starts charging the in-vehicle device 900, vehicle charging information may be sent to the electronic device 100 through the server 1000. The vehicle charging information may include the electric quantity of the in-vehicle device 900, and the electronic device 100 may display it. In this way, the user can view the charging condition of the in-vehicle device 900 in real time.
Next, a communication system 30 provided in an embodiment of the present application is described. The communication system 30 includes the electronic device 100 and the in-vehicle device 900, between which a communication connection (e.g., a Bluetooth connection) is established. Data may be transferred between the electronic device 100 and the in-vehicle device 900 via the communication connection. The in-vehicle device 900 is an electric vehicle or a device forming part of an electric vehicle. The in-vehicle device 900 may include, but is not limited to, an in-vehicle camera and the like, and may be used to obtain data of the electric vehicle (e.g., the remaining electric quantity of the electric vehicle, an image in front of the vehicle head, etc.). The electronic device 100 may be a handheld electronic device, a wearable device, or the like; for the hardware structure of the electronic device 100, refer to the embodiment shown in fig. 1, which is not described herein again. In the following embodiments, the in-vehicle device 900 is also referred to as the charging vehicle.
Next, a set of interface schematic diagrams provided by the embodiments of the present application is described.
For example, as shown in fig. 18A, the electronic device 100 may display a desktop 1801, the desktop 1801 including a plurality of application icons (e.g., a car charging application icon). One or more card components (e.g., a charging service card 1802) may also be included in the desktop 1801. A card component (also referred to as a card) may be used to display specified function information, which may be used to trigger the electronic device 100 to perform the operation indicated by the function information (e.g., to display a page corresponding to the specified function information in the card component). The card may be displayed on the desktop or another designated shortcut entry (e.g., the negative one screen, the service center, etc.). The charging service card 1802 may display function information for providing an automobile charging service. For example, the charging service card 1802 may be used to trigger the electronic device 100 to display the electric quantity information, charging service information, and the like of the in-vehicle device 900.
When the electronic device 100 detects a scene to be charged, it may acquire charging station information through the server 1000 and charging car information through the in-vehicle device 900. The electronic device 100 may obtain charging service information based on the charging station information and the charging car information. The charging service information includes one or more charging station options, including a first charging station option. After obtaining the charging service information, the electronic device 100 may display a charging station information bar 1804 as shown in fig. 18B.
For the specific manner in which the electronic device 100 obtains the charging service information, refer to the embodiment shown in fig. 19, which is not described herein again. A charging station option may include, but is not limited to, identification information of the charging station, an estimated charging duration, an estimated charging cost, and a distance to be traveled. The identification information of the charging station may be used to indicate the charging station. The estimated charging duration may be used to characterize the time required to charge the in-vehicle device 900, and the estimated charging cost may be used to characterize the cost required to fully charge the in-vehicle device 900. The distance to be traveled may be used to indicate the distance from the in-vehicle device 900 to the charging station.
It should be noted that the electronic device 100 may also obtain the priority of each charging station option based on one or more of the estimated charging cost, the estimated charging duration, and the distance to be traveled. The electronic device 100 may display the charging station options in order of priority, from the position closest to the status bar to the position farthest from it, with the highest-priority option closest to the status bar. For example, the electronic device 100 may give the charging station option with the shortest estimated charging duration the highest priority.
Illustratively, as shown in fig. 18B, the charging service card 1802 displays remaining power information 1803 and the charging station information bar 1804. The remaining power information 1803 may be used to indicate the remaining electric quantity of the in-vehicle device 900. The charging station information bar 1804 may include one or more charging station options, including a charging station option 1804A. A charging station option may include, but is not limited to, the name of the charging station, the estimated charging duration, the estimated charging cost, the distance to be traveled, and the like. The electronic device 100 may receive a sliding input (e.g., an upward slide) of the user on the charging station information bar 1804 to display different charging station options. The charging station option 1804A may be used to indicate charging station A; for example, the charging station option 1804A displays the name of charging station A, an estimated charging duration of 1 hour, an estimated charging cost of 20 yuan, and a distance to be traveled between the in-vehicle device 900 and charging station A of 1.2 km. Optionally, the charging service card 1802 may also include charging prompt information, which may be used to prompt the user that the in-vehicle device 900 needs to be charged. The charging prompt information may be one or more of text prompt information, animation prompt information, and voice prompt information. For example, the charging prompt information may be the text prompt: "Current charge is low, please charge as soon as possible".
Alternatively, the electronic device 100 may display only the charging station option with the highest priority in the charging service card 1802. The electronic device 100 may also display a "more" control on the charging service card 1802, which may be used to trigger the electronic device 100 to jump to a charging service interface used to display the charging station options.
Upon receiving a user input for the charging station option 1804A, the electronic device 100 may, in response to the input, send a charging service reservation request to the server 1000. The charging service reservation request includes car identification information and charging station identification information, where the car identification information indicates the in-vehicle device 900 and the charging station identification information indicates charging station A. Upon receiving the charging service reservation request, the server 1000 may determine the charging device 1100 based on the charging station identification information. The server 1000 may send the car identification information to the charging device 1100, and the charging device 1100 may charge the in-vehicle device 900 after it arrives at charging station A.
The electronic device 100 may also, upon receiving an input (e.g., a click) by the user for the charging station option 1804A, display a navigation image 1813 as shown in fig. 18C in response to the input. As shown in fig. 18C, the charging service card 1802 may display reservation success prompt information 1811, distance-to-travel information 1812, and the navigation image 1813. The reservation success prompt information 1811 may be used to prompt the user to travel to charging station A to charge the in-vehicle device 900; for example, it may be the text prompt: "Reservation of charging service is successful". The distance-to-travel information 1812 may be used to prompt the user for the distance (e.g., 1 km) from the current location to charging station A. The navigation image 1813 may be used to display the travel route from the current location to charging station A.
Alternatively, the electronic device 100 may, upon receiving the user input for the charging station option 1804A, jump to a map interface of the map application in response to the input, and display a navigation map from the current location to charging station A in the map interface.
When the server 1000 detects that the in-vehicle device 900 arrives at charging station A, it may acquire the parking position information of the in-vehicle device 900, which may be used to indicate the position of the in-vehicle device 900 in charging station A. The server 1000 may also send a start charging request to the electronic device 100; upon receipt, the electronic device 100 may display a start charging control 1822 as shown in fig. 18D.
As shown in fig. 18D, the electronic device 100 can display the start charging control 1822 on the charging service card 1802. The start charging control 1822 may be used to trigger the electronic device 100 to send a start charging response to the server 1000. Optionally, a confirmation charging prompt 1821 may also be displayed on the charging service card 1802. The confirmation charging prompt 1821 may be used to ask the user whether to start charging; for example, it may be the text prompt: "Arrived at charging station A. Start charging?". Optionally, a "query later" control may also be displayed on the charging service card 1802, which may be used to trigger the electronic device 100 to display the charging service card 1802 shown in fig. 18A and to display the charging service card 1802 shown in fig. 18D again after a preset time (e.g., after 5 minutes). Optionally, a "decline charging" control may also be displayed on the charging service card 1802, which may be used to trigger the electronic device 100 to send a decline charging response to the server 1000, whereupon the server 1000 may notify the charging device 1100 to cancel charging the in-vehicle device 900.
When the electronic device 100 receives a user input for the start charging control 1822, it may send a start charging response to the server 1000 in response to the input. After receiving the start charging response, the server 1000 may send the parking position information to the charging device 1100. After receiving the parking position information, the charging device 1100 may travel to the location indicated by it. After arriving, the charging device 1100 may also confirm, through the car identification information, whether the vehicle parked at that location is the in-vehicle device 900. Once the charging device 1100 has identified the in-vehicle device 900, it may begin charging it. After the charging device 1100 starts charging, vehicle charging information including the electric quantity of the in-vehicle device 900 may be sent to the electronic device 100 through the server 1000. After receiving the vehicle charging information, the electronic device 100 may display the charging service card 1802 as shown in fig. 18E.
As shown in fig. 18E, the charging service card 1802 displays a vehicle charging prompt 1831, which may include one or more of a text prompt, a picture prompt, an animation prompt, and a voice prompt. The vehicle charging prompt 1831 may be used to remind the user that the in-vehicle device 900 is being charged. Optionally, it may also be used to show the user the real-time electric quantity of the in-vehicle device 900 and/or the remaining charging time. For example, the vehicle charging prompt 1831 may include the text prompt: "Charging in progress; expected to complete in 1 h", and may further include the text prompt: "Current power: 20%". Optionally, a cancel charging control 1832 may also be displayed in the charging service card 1802, which may be used to trigger the electronic device 100 to send charge cancellation information to the server 1000. After receiving the charge cancellation information, the server 1000 may notify the charging device 1100 to stop charging the in-vehicle device 900.
Note that the electronic device 100 is not limited to displaying the content shown in the charging service card 1802 of fig. 18A to 18E in the form of a card. For example, the electronic device 100 may display that content in an interface of the car charging application, which is not limited in the embodiments of the present application.
It is understood that, in order for the electronic device 100 to display the electric quantity of the in-vehicle device 900 in the charging service card 1802, the charging device 1100 may send the electric quantity of the in-vehicle device 900 to the electronic device 100 at preset time intervals (e.g., 1 s). Alternatively, the in-vehicle device 900 may send its electric quantity to the electronic device 100 whenever the value changes, for example, from 20% to 21%.
In this way, the electronic device 100 may display charging station options corresponding to charging stations that the user can use, and display navigation information to a charging station after the user selects the corresponding option. The electronic device 100 can also display the electric quantity of the in-vehicle device 900 in real time, so that the user can check the charging condition of the in-vehicle device 900 in real time.
In one possible implementation, when the in-vehicle device 900 includes a display screen and an input device (e.g., a touch screen, mechanical keys, etc.), the operations performed by the electronic device 100 may instead be performed by the in-vehicle device 900. Thus, the user can view the charging service information directly on the vehicle-mounted display screen. Alternatively, when the electronic device 100 detects that the user leaves the in-vehicle device 900, it may acquire the vehicle charging information from the in-vehicle device 900 and display the vehicle charging prompt information based on that information. In this way, the user may leave the charging station while the in-vehicle device 900 is charging and still learn its charging condition through the electronic device 100.
Next, a flow chart of a detection method provided in an embodiment of the present application is described.
Illustratively, as shown in FIG. 19, the method includes:
S1901, the electronic device 100 detects a scene to be charged.
The scene to be charged may include, but is not limited to, a low power scene, a parking lot scene, a destination scene, and the like.
The electronic device 100 may acquire the electric quantity of the in-vehicle device 900 at preset intervals (e.g., 1 second); when the electronic device 100 determines that the electric quantity of the in-vehicle device 900 is lower than a preset threshold (e.g., 20%), it determines that the current scene is a low-power scene.
The electronic device 100 may also acquire a front road image through an in-vehicle camera (e.g., a driving recorder) of the in-vehicle device 900, and identify through an image recognition algorithm whether the front road image includes parking lot entrance information (e.g., a parking lot sign). When the electronic device 100 recognizes that the front road image includes parking lot entrance information, it may determine that the current scene is a parking lot scene. Alternatively, the electronic device 100 may acquire the position information of the in-vehicle device 900 through a global navigation satellite system and acquire nearby parking lot position information through a map server. When the electronic device 100 determines that the distance between the in-vehicle device 900 and a parking lot is less than a specified distance threshold (e.g., 10 meters), it determines that the current scene is a parking lot scene.
The electronic device 100 may also store a historical parking place of the user (e.g., a work place). When the electronic device 100 detects that the distance from the in-vehicle device 900 to the historical parking place is less than the specified distance threshold, it determines that the current scene is the destination scene. Alternatively, the electronic device 100 may acquire a destination address input by the user, and determine that the current scene is the destination scene when it detects that the distance from the in-vehicle device 900 to the destination is less than the specified distance threshold.
Optionally, after the electronic device 100 obtains the destination address input by the user, it may estimate the electric quantity the in-vehicle device 900 will consume to reach the destination and compare it with the remaining electric quantity of the in-vehicle device 900. When the electronic device 100 determines that the electric quantity required to reach the destination is greater than the remaining electric quantity, the in-vehicle device 900 cannot reach the destination without charging. In that case, the electronic device 100 may acquire charging station information in the vicinity of the travel route of the in-vehicle device 900, and obtain and display charging service information based on that charging station information and the charging car information. Alternatively, the electronic device 100 may acquire charging station information in the vicinity of the travel route when the remaining electric quantity of the in-vehicle device 900 is less than the electric quantity it would consume on the remaining route, and obtain and display charging service information based on that charging station information and the charging car information.
In one possible implementation, the electronic device 100 may obtain destination information for the user's trip, including a destination address and a route to the destination. For example, the route may be obtained by the electronic device 100 from a map server based on the location of the electronic device 100 and the destination address. After the electronic device 100 determines that the electric quantity of the in-vehicle device 900 is lower than the electric quantity the in-vehicle device 900 would consume traveling to the destination address along the route, the electronic device 100 obtains charging information of one or more charging stations.
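A rough sketch of two of the "scene to be charged" checks described above (a low-power scene, and a remaining electric quantity insufficient for the route) might look as follows; all thresholds, the battery capacity, and the per-kilometer consumption are assumptions chosen only for illustration.

```python
def needs_charging(remaining_wh, route_km, consumption_wh_per_km,
                   low_battery_ratio=0.2, battery_capacity_wh=60_000):
    """Rough sketch of two 'scene to be charged' checks described above.

    All parameter names and default values (20 % low-battery threshold,
    60 kWh capacity, per-km consumption) are illustrative assumptions.
    """
    low_power_scene = remaining_wh < low_battery_ratio * battery_capacity_wh
    required_wh = route_km * consumption_wh_per_km
    cannot_reach_destination = required_wh > remaining_wh
    return low_power_scene or cannot_reach_destination

# 9 kWh left, 80 km to go at 150 Wh/km -> 12 kWh needed: charging required.
print(needs_charging(remaining_wh=9_000, route_km=80,
                     consumption_wh_per_km=150))   # True
```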
After the electronic device 100 detects the above-described scene to be charged, steps S1902 and S1903 may be performed. It should be noted that, in the embodiment of the present application, the execution order of the step S1902 and the step S1903 is not limited, for example, the electronic device 100 may execute the step S1902 first, or the electronic device 100 may execute the step S1903 first, or the electronic device 100 may execute the step S1902 and the step S1903 synchronously.
In one possible implementation, the electronic device 100 may not perform step S1901 and directly perform steps S1902-S1904.
S1902, the electronic device 100 acquires charging station information (including information of the first charging station) from the server 1000.
The server 1000 may be any server storing charging station information of a plurality of charging stations; for example, the server 1000 may be the server corresponding to the above-mentioned car charging application. The plurality of charging stations includes the first charging station. The charging station information may include, but is not limited to, identification information (e.g., the name) of the charging station, the number of idle charging devices in the charging station, the charging power of those idle charging devices, their charging interface models (e.g., five-hole three-pin, nine-hole two-pin, etc.), the location of the charging station, the charging cost per unit of electricity, and the like.
The server 1000 may transmit charging station information (i.e., charging information of one or more charging stations) to the electronic device 100.
Alternatively, the server 1000 may send to the electronic device 100 only the charging station information of charging stations that include an idle charging device.
Optionally, the electronic device 100 may also obtain charging station information through historical transaction records with the charging station, location Based Services (LBS), wireless Beacon (Beacon) scanning, and the like.
S1903, the electronic device 100 acquires the charging car information from the in-vehicle device 900.
The charging car information may include, but is not limited to, the charging interface model of the in-vehicle device 900, the remaining electric quantity of the in-vehicle device 900, the battery capacity of the in-vehicle device 900, the location of the in-vehicle device 900, a historical charging record, and the like.
S1904, the electronic device 100 obtains and displays charging service information based on the charging station information and the charging car information; the charging service information includes one or more charging station options, including a first charging station option corresponding to the first charging station.
A charging station option includes the identification information of the charging station, the estimated charging duration, the estimated charging cost, and the distance to be traveled. The identification information of the charging station may be used to indicate the charging station. The estimated charging duration may be used to characterize the time required to charge the in-vehicle device 900, and the estimated charging cost may be used to characterize the cost required to fully charge the in-vehicle device 900. The distance to be traveled may be used to indicate the distance from the in-vehicle device 900 to the charging station. Specifically, the electronic device 100 obtains the charging station options as follows:
First, based on the number of idle charging devices at each charging station, the charging interface models of those charging devices, and the charging interface model of the in-vehicle device 900, the electronic device 100 may screen out the one or more charging stations whose number of idle charging devices is greater than zero and whose idle charging devices include the interface model of the in-vehicle device 900.
Then, from the one or more charging stations obtained by the screening, the electronic device 100 obtains the distance between the in-vehicle device 900 and each of those charging stations (also referred to as the distance to be traveled) based on the positions of the charging stations and the position of the in-vehicle device 900. The electronic device 100 may also calculate the distance that the in-vehicle device 900 can still travel before running out of electricity (also referred to as the travelable distance) based on the remaining electric quantity of the in-vehicle device 900. The electronic device 100 may then screen out, from the one or more charging stations, those whose distance to be traveled is less than the travelable distance. Charging stations whose distance to be traveled is less than the travelable distance may be referred to herein as preselected charging stations.
Thereafter, the electronic device 100 may calculate the time required for charging (i.e., the estimated charging duration) and the charging cost (i.e., the estimated charging cost) at each preselected charging station based on the charging power and the charging cost per unit of electricity of the idle charging devices of the preselected charging station, and the remaining electric quantity and battery capacity of the in-vehicle device 900.
In this way, the electronic device 100 may obtain one or more charging station options and display them, for example through the charging service card 1802 shown in fig. 18B.
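A minimal sketch of this three-step procedure, under assumed data shapes and an assumed consumption figure for converting the remaining electric quantity into a travelable distance, might look as follows.

```python
from dataclasses import dataclass

@dataclass
class Station:
    name: str
    idle_devices: int          # number of idle charging devices
    interface_models: set      # interface models of the idle devices
    distance_km: float         # distance to be traveled
    power_kw: float            # charging power of an idle device
    price_per_kwh: float       # charging cost per unit of electricity

def charging_options(stations, car_interface, remaining_kwh,
                     capacity_kwh, kwh_per_km=0.15):
    """Screen stations and estimate duration/cost, as described above.

    kwh_per_km is an assumed consumption figure used only to turn the
    remaining electric quantity into a travelable distance.
    """
    travelable_km = remaining_kwh / kwh_per_km
    needed_kwh = capacity_kwh - remaining_kwh   # energy to fully charge
    options = []
    for s in stations:
        if s.idle_devices <= 0 or car_interface not in s.interface_models:
            continue                            # no usable charging device
        if s.distance_km >= travelable_km:
            continue                            # unreachable before power runs out
        options.append({
            "station": s.name,
            "distance_km": s.distance_km,
            "est_duration_h": round(needed_kwh / s.power_kw, 2),
            "est_cost": round(needed_kwh * s.price_per_kwh, 2),
        })
    return options

stations = [
    Station("A", 2, {"five-hole three-pin"}, 1.2, 30.0, 1.0),
    Station("B", 0, {"five-hole three-pin"}, 0.8, 60.0, 0.8),  # no idle device
]
print(charging_options(stations, "five-hole three-pin",
                       remaining_kwh=12.0, capacity_kwh=42.0))
```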
In one possible implementation, the electronic device 100 may obtain the distance to be traveled from the in-vehicle device 900 to each charging station based on the positions of the in-vehicle device 900 and each charging station. Then, based on the distance to be traveled and the traveling speed of the in-vehicle device 900, it may obtain the time point at which the in-vehicle device 900 would arrive at each charging station, and acquire from the server 1000 the information of the charging stations that will have an idle charging device after that arrival time point. The electronic device 100 then obtains the charging station options based on that charging station information and the charging car information. In this way, the electronic device 100 only offers charging stations that will have an idle charging device when the vehicle arrives, improving the utilization of the charging devices.
It should be noted that, based on one or more of the estimated charging cost, the estimated charging duration, and the distance to be traveled, the electronic device 100 may assign a priority to each of the one or more charging station options, and arrange the positions of the options on the display screen of the electronic device 100 according to those priorities. For example, a higher-priority charging station option is placed closer to the status bar on the display of the electronic device 100.
For example, the electronic device 100 may set the priorities of the one or more charging station options based on the estimated charging cost: the lower the estimated charging cost, the higher the priority of the charging station option.
In some embodiments, the electronic device 100 stores a historical charging record, or may obtain one from the in-vehicle device 900. The historical charging record includes charging information of the charging stations at which the in-vehicle device 900 was charged before (for example, the name of a charging station, its location, the number of times the vehicle was charged there, etc.). The electronic device 100 may give the highest priority to the charging station option corresponding to the charging station, in the vicinity of the in-vehicle device 900 (for example, within an area with a radius of 1 km centered on the in-vehicle device 900), at which the vehicle was charged most often.
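Building on the sketch above, one possible way to order the options by estimated charging cost while promoting a frequently used nearby charging station is shown below; the key names reuse the previous sketch, and the 1 km radius and history weighting are illustrative assumptions.

```python
def rank_options(options, history_counts=None, near_km=1.0):
    """Order charging station options by the priorities described above.

    options: dicts as produced by charging_options() above.
    history_counts: optional {station name: past charge count}; a nearby
    station with the most past charges is promoted to the front.
    """
    ranked = sorted(options, key=lambda o: o["est_cost"])  # cheaper first
    if history_counts:
        nearby = [o for o in ranked
                  if o["distance_km"] <= near_km
                  and history_counts.get(o["station"])]
        if nearby:
            favorite = max(nearby, key=lambda o: history_counts[o["station"]])
            ranked.remove(favorite)
            ranked.insert(0, favorite)           # highest priority
    return ranked

options = [{"station": "A", "distance_km": 1.2, "est_cost": 30.0},
           {"station": "C", "distance_km": 0.5, "est_cost": 35.0}]
print(rank_options(options, history_counts={"C": 7}))  # C promoted to front
```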
It is understood that the first charging station option includes the identification information of the first charging station, the estimated charging duration, and the like. The first charging station option may be used to trigger the electronic device 100 to select a charging device (e.g., the charging device 1100) of the first charging station.
S1905, the electronic device 100 receives an input from the user for the first charging station option.
Wherein the input for the first charging station option may include, but is not limited to, a single click, a double click, a long press, and the like. For example, the input may be an input for the charging station option 1804A shown in fig. 18B described above.
S1906, the electronic device 100 sends to the server 1000 a charging service reservation request including car identification information and charging station identification information. The car identification information may be used to indicate the in-vehicle device 900, and the charging station identification information may be used to indicate the first charging station.
Upon receiving the user input for the first charging station option, the electronic device 100 may send the charging service reservation request to the server 1000 in response to the input. The charging service reservation request includes the car identification information and the charging station identification information. The car identification information may be used to indicate the in-vehicle device 900; for example, it may include, but is not limited to, the license plate number, vehicle model, color, etc. of the in-vehicle device 900. The charging station identification information indicates the first charging station corresponding to the first charging station option.
S1907, the server 1000 sends the car identification information to the charging device 1100.
After receiving the charging service reservation request sent by the electronic device 100, the server 1000 may determine, based on the charging station identification information, that the in-vehicle device 900 is to be charged by an idle charging device of the first charging station. The server 1000 may send the car identification information to an idle charging device of the first charging station, e.g., the charging device 1100.
After receiving the car identification information, the charging device 1100 is reserved: no vehicle other than the in-vehicle device 900 can use the charging device 1100.
S1908, the electronic device 100 displays the navigation information.
After receiving the user input for the first charging station option, the electronic device 100 may display, in response to the input, navigation information to the first charging station corresponding to the option (e.g., a navigation route from the location of the electronic device 100 to the first charging station). For example, the first charging station may be charging station A, and the electronic device 100 may display the navigation image 1813 shown in fig. 18C after receiving the input.
S1909, the server 1000 detects that the in-vehicle device 900 enters the first charging station, and may acquire the parking position information of the in-vehicle device 900.
The server 1000 may detect whether the in-vehicle device 900 has driven into the first charging station in a variety of ways. In some embodiments, the server 1000 may detect this through a camera of the first charging station or through electronic toll collection (ETC). Specifically, the server 1000 may acquire an image of a vehicle entering the first charging station through a camera at the entrance of the first charging station, recognize the vehicle identification information in the image through an image recognition algorithm, and confirm based on it whether the vehicle in the image is the in-vehicle device 900. When the server 1000 determines that it is, it may determine that the in-vehicle device 900 has driven into the first charging station. Alternatively, the server 1000 may automatically identify, through ETC, the license plate number of a vehicle entering the first charging station, determine based on the license plate number whether the vehicle is the in-vehicle device 900, and, if so, determine that the in-vehicle device 900 has entered the first charging station.
In other embodiments, the server 1000 may acquire the location of the in-vehicle device 900 through the electronic device 100 at preset time intervals, and determine that the in-vehicle device 900 has driven into the first charging station when the location of the in-vehicle device 900 and the location of the first charging station overlap. Alternatively, the electronic device 100 may send to the server 1000, after the in-vehicle device 900 enters the first charging station, a signaling indicating that the in-vehicle device 900 has entered the first charging station, and the server 1000 may make the determination after receiving that signaling.
After the server 1000 detects that the in-vehicle device 900 has entered the first charging station, it may acquire the parking position information of the in-vehicle device 900. The parking position information may be used to indicate the position of the in-vehicle device 900 in the first charging station. For example, it may include one or more of a parking area number, a parking space number, an indoor location fingerprint, an indoor GPS signal, and the like.
The server 1000 may obtain the parking position information of the in-vehicle device 900 in various ways. For example, the server 1000 may obtain the parking area number and parking space number of the parking position of the in-vehicle device 900 through a camera of the first charging station. For another example, the electronic device 100 may obtain the parking area number, parking space number, and the like through a camera of the in-vehicle device 900. For another example, the server 1000 may send a position query to the electronic device 100; after receiving it, the electronic device 100 may display position prompt information that prompts the user to input the parking position information (e.g., the parking space number). The electronic device 100 may then receive the parking position information input by the user and send it to the server 1000.
S1910, the server 1000 may transmit a start charging request to the electronic device 100.
After detecting that the in-vehicle device 900 has entered the first charging station, the server 1000 may send a start charging request to the electronic device 100. The start charging request may be used to instruct the electronic device 100 to display a start charging control.
S1911, the electronic device 100 may display a start charging control.
After receiving the start charging request sent by the server 1000, the electronic device 100 may display a start charging control. Wherein the start charging control may be used to trigger the electronic device 100 to send a start charging response to the server 1000.
S1912, the electronic device 100 receives an input of the user for starting the charging control.
The input for the start charging control may be a single click, a double click, a long press, etc. For example, the input may be an input to the start charging control 1822 shown in fig. 18D above.
S1913, the electronic device 100 sends a start charging response to the server 1000.
After receiving the user input for the start charging control, the electronic device 100 may send a start charging response to the server 1000 in response to the input. The start charging response may be used to instruct the server 1000 to notify the charging device 1100 to charge the in-vehicle device 900.
Alternatively, the electronic device 100 may display the start charging control while displaying the navigation information. In this case, without the server 1000 detecting whether the in-vehicle device 900 has driven into the first charging station, the electronic device 100 may send a start charging request to the server 1000 upon receiving the user input for the start charging control. The server 1000 may determine, after receiving the start charging request, that the in-vehicle device 900 has driven into the first charging station, and may then obtain the parking position information of the in-vehicle device 900; for a description of how the server 1000 obtains the parking position information, refer to the embodiment shown in step S1909, which is not repeated herein.
S1914, the server 1000 sends the parking position information to the charging device 1100.
After receiving the start charging response, the server 1000 may transmit parking position information to the charging device 1100.
S1915, the charging device 1100 charges the in-vehicle device 900 after reaching the parking area indicated by the parking position information.
After receiving the parking position information, the charging device 1100 may obtain the position of the in-vehicle device 900 in the first charging station based on it. The charging device 1100 may move to that position to charge the in-vehicle device 900.
Further, after reaching the position indicated by the parking position information, the charging device 1100 may confirm, based on the car identification information, that the vehicle parked there is the in-vehicle device 900, and then charge it.
S1916, the charging device 1100 sends the vehicle charging information to the server 1000.
The charging device 1100 may obtain the vehicle charging information of the in-vehicle device 900 after docking its charging interface with the charging interface of the in-vehicle device 900, and send the vehicle charging information to the electronic device 100 through the server 1000. The vehicle charging information includes the electric quantity of the in-vehicle device 900 and may be used to indicate that the in-vehicle device 900 is charging.
S1917, the server 1000 transmits the vehicle charging information to the electronic device 100.
After receiving the vehicle charging information sent by the charging device 1100, the server 1000 may send the vehicle charging information to the electronic device 100.
S1918, the electronic device 100 displays the vehicle charging information.
After receiving the vehicle charging information, the electronic device 100 may display a vehicle charging prompt. The prompt may be used to remind the user that the in-vehicle device 900 is charging, and may also be used to show the user the real-time electric quantity of the in-vehicle device 900. For an example of the vehicle charging prompt information, refer to the embodiment shown in fig. 18E, which is not described herein again.
It is understood that, in order for the electronic device 100 to display the electric quantity of the in-vehicle device 900 in real time, the charging device 1100 may send the vehicle charging information to the electronic device 100 at preset intervals (e.g., 1 s). Alternatively, the in-vehicle device 900 may send the vehicle charging information to the electronic device 100 whenever the value of the electric quantity changes, for example, from 20% to 21%.
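As an illustration of the change-triggered reporting just described, a sender might suppress updates until the displayed percentage actually changes; the class and the transport callback are assumptions, not an API of this embodiment.

```python
class ChargeReporter:
    """Send an update only when the displayed percentage changes.

    `send` is a stand-in (assumed) for whatever transport reaches the
    electronic device 100 via the server.
    """
    def __init__(self, send):
        self.send = send
        self.last_percent = None

    def on_sample(self, percent):
        percent = int(percent)
        if percent != self.last_percent:   # e.g. 20% -> 21%
            self.last_percent = percent
            self.send({"power_percent": percent})

reporter = ChargeReporter(send=print)
for p in [20.2, 20.7, 21.0, 21.4]:
    reporter.on_sample(p)   # sends at 20 and 21 only
```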
Alternatively, the electronic device 100 may directly obtain and display the electric quantity information of the in-vehicle device 900 from the in-vehicle device 900.
The following describes a detection method provided in the embodiments of the present application.
Fig. 20 shows a flow chart of a detection method provided in an embodiment of the present application.
As shown in fig. 20, the detection method includes the steps of:
S2001, acquiring physiological information parameters, blood alcohol concentration parameters and acquisition time parameters for acquiring the blood alcohol concentration parameters.
S2002, determining predicted sobering-up time based on physiological information parameters, blood alcohol concentration parameters and acquisition time parameters.
S2003, the predicted sobering-up time is displayed.
The steps described above may be performed by the electronic device 100 shown in fig. 2-7. A detailed description of the determination of the predicted sobering-up time by the electronic device 100 may refer to the embodiments shown in fig. 2-7, and will not be repeated here.
Alternatively, and without limitation to the electronic device 100 shown in fig. 2-7, the step of determining the predicted sobering-up time may be performed by other electronic devices, such as a cloud server. Alternatively, the steps of acquiring the physiological information parameter, the blood alcohol concentration parameter, and the acquisition time parameter of acquiring the blood alcohol concentration parameter may be performed by other electronic devices, such as the electronic device 200 shown in fig. 2, without being limited to the electronic device 100 shown in fig. 2-7.
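By way of illustration only, the following sketch shows one possible way to compute a predicted sobering-up time from a blood alcohol concentration parameter and its acquisition time, using a linear (Widmark-style) elimination model whose rate is mildly adjusted by body weight. The constants, the weight adjustment, and the threshold are assumptions of this illustration; they stand in for the alcohol prediction model of the embodiments, which is not reproduced here.

from datetime import datetime, timedelta

def predict_sobering_up_time(bac_mg_per_100ml: float,
                             acquired_at: datetime,
                             weight_kg: float,
                             threshold_mg_per_100ml: float = 20.0) -> datetime:
    """Linear elimination sketch: blood alcohol concentration falls at a
    roughly constant rate until it drops below the threshold.

    Illustrative assumptions: a baseline elimination of about 15 mg/100ml
    per hour, mildly adjusted by body weight."""
    if bac_mg_per_100ml <= threshold_mg_per_100ml:
        return acquired_at  # already below the threshold blood alcohol concentration
    rate_per_hour = 15.0 * (weight_kg / 70.0) ** 0.25  # assumed adjustment
    hours = (bac_mg_per_100ml - threshold_mg_per_100ml) / rate_per_hour
    return acquired_at + timedelta(hours=hours)

# Example: 80 mg/100ml acquired at 22:00 for a 70 kg user -> about 02:00.
print(predict_sobering_up_time(80.0, datetime(2021, 12, 30, 22, 0), 70.0))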
The following describes a detection method provided in the embodiments of the present application.
Fig. 21 shows a flow chart of a detection method provided in an embodiment of the present application.
As shown in fig. 21, the detection method includes the steps of:
S2101, behavior data of a user is acquired.
S2102, determining the fatigue degree before driving of the user based on the behavior data of the user.
S2103, determining a first recommended driving duration of the user based on the pre-driving fatigue degree of the user.
S2104, the first recommended driving duration is displayed.
The steps described above may be performed by the electronic device 100 shown in fig. 8-12. A detailed description of the determination of the first recommended driving duration by the electronic device 100 may refer to the embodiments shown in fig. 8-12, and will not be repeated here.
Alternatively, the step of determining the first recommended driving duration may be performed by other electronic devices, such as a cloud server, without being limited to the electronic device 100 shown in fig. 8-12. Alternatively, the step of acquiring the behavior data of the user may be performed by other electronic devices, such as the electronic device 500 shown in fig. 8, without being limited to the electronic device 100 shown in fig. 8-12 described above. Alternatively, the step of displaying the first recommended driving time period may be performed by other electronic devices, such as the electronic device 500 shown in fig. 8, without being limited to the electronic device 100 shown in fig. 8-12 described above.
In one possible implementation manner, obtaining behavior data of a user specifically includes: acquiring the travel time of the user, and acquiring behavior data of the user at a first time before the travel time. The travel time and the first time differ by a preset time. The travel time is the departure time shown in fig. 8-12, and the first time is the trigger time shown in fig. 8-12.
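By way of illustration only, the following sketch shows how the first time (trigger time) can be derived from the travel time, and how a pre-driving fatigue degree might be mapped to a first recommended driving duration. The one-hour preset time and the breakpoints are assumptions of this illustration; the embodiments determine the fatigue degree through a trained fatigue model rather than fixed thresholds.

from datetime import datetime, timedelta

# Assumed preset time between the first time (trigger time) and the travel time.
PRESET_TIME = timedelta(hours=1)

def trigger_time(travel_time: datetime) -> datetime:
    """First time at which behavior data is acquired: a preset time before
    the travel time, as described above."""
    return travel_time - PRESET_TIME

def first_recommended_driving_hours(fatigue_degree: float) -> float:
    """Map a pre-driving fatigue degree in [0, 1] to a recommended
    continuous-driving duration in hours. Breakpoints are illustrative."""
    if fatigue_degree < 0.3:
        return 4.0   # well rested
    if fatigue_degree < 0.6:
        return 2.5   # mildly fatigued
    if fatigue_degree < 0.8:
        return 1.5   # fatigued
    return 0.0       # heavily fatigued; driving is not recommended

print(trigger_time(datetime(2022, 1, 1, 9, 0)))   # 2022-01-01 08:00:00
print(first_recommended_driving_hours(0.5))       # 2.5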
The following describes a detection method provided in the embodiments of the present application.
Fig. 22 shows a flow chart of a detection method provided in an embodiment of the present application.
As shown in fig. 22, the detection method includes the steps of:
S2201, the first electronic device detects a boarding operation of a passenger and acquires an in-vehicle image before the passenger boards.
S2202, the first electronic device and the second electronic device establish a communication connection.
S2203, the first electronic device detects a getting-off operation of the passenger and acquires an in-vehicle image after the passenger gets off.
S2204, the first electronic device broadcasts first omission prompt information when determining, based on the in-vehicle image before the passenger gets on and the in-vehicle image after the passenger gets off, that an article of the passenger is left in the vehicle.
The first omission prompt information is used to prompt that the passenger's article is left in the vehicle.
S2205, the first electronic device sends article omission indication information to the second electronic device through the communication connection.
S2206, the second electronic device displays second omission prompt information.
The second omission prompt information is used to prompt that the passenger's article is left in the vehicle.
The first electronic device may be the vehicle device 900 shown in fig. 13-17B. Detailed descriptions of the steps performed by the vehicle device 900 may refer to the embodiments shown in fig. 13-17B, and are not repeated herein.
The second electronic device may be the electronic device 100 shown in fig. 13-17B. Detailed descriptions of the steps performed by the electronic device 100 may refer to the embodiments shown in fig. 13-17B, and are not repeated herein.
The first electronic device and the second electronic device may form a first communication system.
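By way of illustration only, the following sketch compares the in-vehicle image acquired before the passenger boards with the image acquired after the passenger gets off, and flags a possible left-behind article when a sufficiently large region differs. The plain pixel-difference test and its thresholds are assumptions of this illustration; the embodiments do not limit the image comparison to this method.

import numpy as np

def article_left_behind(before: np.ndarray, after: np.ndarray,
                        pixel_thresh: int = 30,
                        area_ratio: float = 0.005) -> bool:
    """Compare the in-vehicle image captured before boarding with the image
    captured after the passenger gets off; flag a possible left-behind
    article when a sufficiently large region differs.

    `before` and `after` are grayscale uint8 arrays of identical shape;
    both thresholds are illustrative."""
    diff = np.abs(before.astype(np.int16) - after.astype(np.int16))
    changed = diff > pixel_thresh          # per-pixel change mask
    return bool(changed.mean() > area_ratio)

# When article_left_behind(...) returns True, the first electronic device
# would broadcast the first omission prompt information and send the article
# omission indication information to the second electronic device.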
The following describes a detection method provided in the embodiments of the present application.
Fig. 23 shows a flow chart of a detection method provided in an embodiment of the present application.
As shown in fig. 23, the detection method includes the steps of:
S2301, the first electronic device obtains charging information of one or more charging stations.
S2302, the first electronic device displays one or more charging station options based on the charging information of the one or more charging stations, the one or more charging station options including the first charging station option.
S2303, the first electronic device receives an input for a first charging station option, and displays first navigation information, where the first navigation information is used to indicate a route from the first electronic device to a charging station corresponding to the first charging station option.
S2304, the server detects that the first electronic device arrives at the first charging station, and obtains parking position information of the first electronic device in the first charging station.
S2305, the server transmits the parking position information to the charging device.
S2306, the charging device reaches a position in the first charging station indicated by the parking position information, and charges the first electronic device.
The first electronic device may be the vehicle device 900 shown in fig. 18A-19. Detailed descriptions of the steps performed by the vehicle device 900 may refer to the embodiments shown in fig. 18A-19, and are not repeated herein.
The server may be the server 1000 shown in fig. 18A-19. A detailed description of the steps performed by the server 1000 may refer to the embodiments shown in fig. 18A-19, and will not be repeated herein.
The charging device may be the charging device 1100 shown in fig. 18A to 19. A detailed description of the charging device 1100 performing the above steps may refer to the embodiment shown in fig. 18A-19, and will not be repeated herein.
The first electronic device, the server and the charging device may constitute a second communication system.
In one possible implementation, the first electronic device may be the electronic device 100 shown in fig. 18A-19, and then the charging device charges the vehicle device 900 shown in fig. 18A-19 after reaching the location in the first charging station indicated by the parking location information.
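By way of illustration only, the following sketch shows how the first electronic device might order charging station options using the charging price, charging time, and arrival distance carried in the charging information. The weighted-sum scoring and all names are assumptions of this illustration, not a method disclosed by the embodiments.

from dataclasses import dataclass

@dataclass
class ChargingStationOption:
    name: str
    charging_price: float      # fee to fully charge the first electronic device
    charging_time_min: float   # time to fully charge, in minutes
    arrival_distance_km: float # distance from the first electronic device

def rank_options(options, w_price=1.0, w_time=0.5, w_dist=2.0):
    """Order charging station options for display, lowest score first.
    The weighted-sum scoring is an assumption of this illustration."""
    return sorted(options, key=lambda o: (w_price * o.charging_price
                                          + w_time * o.charging_time_min
                                          + w_dist * o.arrival_distance_km))

stations = [
    ChargingStationOption("Station A", 45.0, 60.0, 1.2),
    ChargingStationOption("Station B", 38.0, 90.0, 3.5),
]
for opt in rank_options(stations):
    print(opt.name, opt.charging_price, opt.charging_time_min, opt.arrival_distance_km)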
The detection methods shown in fig. 20 to 21 described above may be used in combination with each other. For example, the electronic device 100 described in fig. 20-21 may be the same electronic device. The electronic device 100 may perform the steps described above in the embodiments illustrated in fig. 20-21, which are not limited in this application.
The above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features thereof may be replaced by equivalents; such modifications and replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (39)

1. A method of detection comprising:
acquiring physiological information parameters, blood alcohol concentration parameters and acquisition time parameters for acquiring the blood alcohol concentration parameters;
determining a predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the acquisition time parameter; wherein the predicted sobering-up time is used for indicating a time point when the blood alcohol concentration of the user is lower than a threshold blood alcohol concentration;
displaying the predicted sobering-up time.
2. The method of claim 1, wherein the physiological information parameters include one or more of weight, height, age, gender, sleep time, and sleep quality.
3. The method according to claim 1 or 2, wherein said determining a predicted sobering-up time based on said physiological information parameter, said blood alcohol concentration parameter and said acquisition time parameter, in particular comprises:
and determining the predicted sobering-up time through an alcohol prediction model based on the physiological information parameter, the blood alcohol concentration parameter and the acquisition time parameter.
4. A method according to any one of claims 1-3, wherein prior to said obtaining a physiological information parameter, a blood alcohol concentration parameter and a time of collection parameter for collecting said blood alcohol concentration parameter, the method further comprises:
receiving a first input;
the method for acquiring the physiological information parameter, the blood alcohol concentration parameter and the acquisition time parameter for acquiring the blood alcohol concentration parameter specifically comprises the following steps:
and in response to the first input, acquiring the physiological information parameter, the blood alcohol concentration parameter and the acquisition time parameter.
5. The method of any one of claims 1-4, wherein before the determining a predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the acquisition time parameter, the method further comprises:
receiving a second input;
and the determining a predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the acquisition time parameter specifically comprises:
in response to the second input, determining the predicted sobering-up time based on the physiological information parameter, the blood alcohol concentration parameter, and the acquisition time parameter.
6. The method according to any one of claims 1-5, wherein said determining a predicted sobering-up time based on said physiological information parameter, said blood alcohol concentration parameter and said acquisition time parameter, in particular comprises:
acquiring an alcohol intake parameter;
determining the predicted sobering-up time based on the physiological information parameter, the alcohol intake parameter, the blood alcohol concentration parameter, and the acquisition time parameter.
7. The method according to claim 6, wherein the acquiring an alcohol intake parameter specifically comprises:
acquiring, through a camera, a container image of the alcoholic drink consumed;
and determining the alcohol intake parameter based on the container image.
8. The method of claim 6 or 7, wherein the alcohol intake parameter comprises a degree parameter and a volume parameter, the degree parameter indicating the alcoholic strength of the drink consumed by the user, and the volume parameter indicating the volume of the drink consumed by the user.
9. The method according to any one of claims 6-7, wherein the determining the predicted sobering-up time based on the physiological information parameter, the alcohol intake parameter, the blood alcohol concentration parameter, and the acquisition time parameter specifically comprises:
obtaining a predicted alcohol absorption rate and a predicted alcohol metabolism rate through the alcohol prediction model based on the alcohol intake parameter and the physiological information parameter;
obtaining a correspondence between blood alcohol concentration and time based on the physiological information parameter, the alcohol intake parameter, the predicted alcohol absorption rate, and the predicted alcohol metabolism rate;
and determining the predicted sobering-up time based on the blood alcohol concentration parameter, the acquisition time parameter, and the correspondence between blood alcohol concentration and time.
10. A method of detection comprising:
acquiring behavior data of a user;
determining the degree of fatigue of the user before driving based on the behavior data of the user;
determining a first recommended driving duration of the user based on the pre-driving fatigue degree of the user;
and displaying the first recommended driving duration.
11. The method according to claim 10, wherein the obtaining behavior data of the user specifically includes:
acquiring the travel time of the user;
and acquiring behavior data of the user at a first time before the travel time, wherein the first time and the travel time differ by a preset time.
12. The method according to claim 11, wherein the obtaining the travel time of the user specifically includes:
acquiring schedule information of the user, wherein the schedule information comprises one or more of bill information, conference information and schedule information of the user;
and acquiring the travel time of the user based on the schedule information of the user.
13. The method according to any one of claims 10-12, further comprising:
acquiring physical state data of the user in a vehicle running state;
determining the fatigue degree in driving of the user based on the physical state data of the user;
determining the final fatigue degree of the user based on the pre-driving fatigue degree of the user and the in-driving fatigue degree of the user;
determining a second recommended driving duration based on the final fatigue level of the user;
and displaying the second recommended driving duration.
14. The method according to claim 13, wherein said determining the degree of fatigue in driving of said user based on said user's physical state data, in particular comprises:
and determining the fatigue degree in driving through a second fatigue model based on the physical state data of the user, wherein the second fatigue model is obtained through training according to the historical physical state data of the user.
15. The method according to claim 13, wherein said determining the degree of fatigue in driving of said user based on said user's physical state data, in particular comprises:
acquiring vehicle running data of the user in a vehicle running state;
and determining the fatigue degree in driving of the user based on the physical state data of the user and the vehicle running data of the user.
16. The method according to claim 15, wherein the determining the fatigue degree in driving of the user based on the physical state data of the user and the vehicle running data of the user specifically comprises:
determining a second fatigue model based on the physical state data of the user;
and determining the fatigue degree in driving through the second fatigue model based on the vehicle running data of the user and the physical state data of the user.
17. The method according to any one of claims 10-16, wherein the obtaining behavior data of the user specifically comprises:
acquiring user data of the user, wherein the user data of the user comprises one or more of movement duration, movement intensity and sleep duration;
and determining behavior data of the user based on the user data.
18. The method according to any one of claims 10-17, wherein said determining the degree of pre-driving fatigue of the user based on the behavioral data of the user, in particular comprises:
determining the fatigue degree of the user before driving through a first fatigue model based on the behavior data of the user; the first fatigue model is obtained through training according to historical behavior data of the user.
19. A detection method applied to a first communication system, wherein the first communication system comprises a first electronic device and a second electronic device; the method comprises the following steps:
the first electronic device detects a boarding operation of a passenger and acquires an in-vehicle image before the passenger boards;
the first electronic device and the second electronic device establish a communication connection;
the first electronic device detects a getting-off operation of the passenger and acquires an in-vehicle image after the passenger gets off;
when the first electronic device determines, based on the in-vehicle image before the passenger gets on and the in-vehicle image after the passenger gets off, that an article of the passenger is left in the vehicle, broadcasting first omission prompt information, wherein the first omission prompt information is used to prompt that the passenger's article is left in the vehicle;
the first electronic device sends article omission indication information to the second electronic device through the communication connection;
and the second electronic device displays second omission prompt information based on the article omission indication information, wherein the second omission prompt information is used to prompt that the passenger's article is left in the vehicle.
20. The method of claim 19, wherein the second electronic device is the electronic device with the strongest signal among all electronic devices detected by the first electronic device.
21. The method of claim 19 or 20, wherein after the first electronic device and the second electronic device establish a communication connection, the method further comprises:
the first electronic device sends motion information of the first electronic device to the second electronic device through the communication connection;
when the second electronic device determines, based on the motion information of the first electronic device, that the motion state of the first electronic device is the same as the motion state of the second electronic device, the second electronic device sends a confirmation success signaling to the first electronic device;
and the first electronic device receives the confirmation success signaling and keeps the communication connection with the second electronic device.
22. The method of claim 21, wherein the second electronic device determines, based on the motion information of the first electronic device, that the motion state of the first electronic device is the same as the motion state of the second electronic device, specifically comprising:
when the second electronic device determines, N consecutive times, that the motion information of the first electronic device is the same as the motion information of the second electronic device, determining that the motion state of the first electronic device is the same as the motion state of the second electronic device; wherein N is a positive integer.
23. The method of claim 21, wherein the second electronic device determines, based on the motion information of the first electronic device, that the motion state of the first electronic device is the same as the motion state of the second electronic device, specifically comprising:
when the second electronic device makes M determinations of whether the motion information of the first electronic device is the same as the motion information of the second electronic device, and determines at least N times that they are the same, determining that the motion state of the first electronic device is the same as the motion state of the second electronic device; wherein N is less than or equal to M, and M and N are positive integers.
24. The method of claim 22 or 23, wherein the motion information of the first electronic device and the motion information of the second electronic device are the same when the difference between the motion information of the first electronic device and the motion information of the second electronic device is less than a motion deviation threshold.
25. The method of claim 19 or 20, wherein after the first electronic device and the second electronic device establish a communication connection, the method further comprises:
the second electronic device sends motion information of the second electronic device to the first electronic device through the communication connection;
when the first electronic device determines, based on the motion information of the second electronic device, that the motion state of the first electronic device is the same as the motion state of the second electronic device, the first electronic device sends a confirmation success signaling to the second electronic device;
and the second electronic device receives the confirmation success signaling and keeps the communication connection with the first electronic device.
26. The method of claim 21, wherein the method further comprises:
and when the second electronic device determines, based on the motion information of the first electronic device, that the motion state of the first electronic device is different from the motion state of the second electronic device, the second electronic device disconnects the communication connection with the first electronic device.
27. The method of claim 26, wherein the second electronic device disconnecting the communication connection with the first electronic device specifically comprises:
the second electronic device sends a confirmation failure signaling to the first electronic device;
and the first electronic device receives the confirmation failure signaling and disconnects the communication connection with the second electronic device.
28. The method of claim 26 or 27, wherein after the second electronic device disconnects the communication with the first electronic device, the method further comprises:
the second electronic device broadcasts a communication connection request.
29. The method according to any of claims 19-28, wherein the first electronic device and the second electronic device establish a communication connection, in particular comprising:
the second electronic device broadcasts a communication connection request;
the first electronic device receives the communication connection request of the second electronic device;
the first electronic device sends a communication connection response to the second electronic device;
and the second electronic device receives the communication connection response of the first electronic device and establishes the communication connection with the first electronic device.
30. The method according to any of claims 19-29, wherein the first electronic device and the second electronic device establish a communication connection, in particular comprising:
after the first electronic device detects that the passenger sits down in the vehicle, the first electronic device receives a communication connection request of the second electronic device;
and the first electronic device sends a communication connection response to the second electronic device, and establishes the communication connection with the second electronic device.
31. A detection method, applied to a second communication system, wherein the second communication system comprises a first electronic device, a server, and a charging device; the method comprises the following steps:
the first electronic device receives charging information of one or more charging stations sent by the server;
the first electronic device displays one or more charging station options based on the charging information of the one or more charging stations, the one or more charging station options including a first charging station option;
after receiving an input for the first charging station option, the first electronic device displays first navigation information, wherein the first navigation information is used to indicate a route from the position of the first electronic device to a first charging station corresponding to the first charging station option;
after the server detects that the first electronic device arrives at the first charging station, the server acquires parking position information of the first electronic device in the first charging station;
the server sends the parking position information to the charging device;
and after the charging device reaches the position in the first charging station indicated by the parking position information, the charging device charges the first electronic device.
32. The method of claim 31, wherein the first electronic device displays one or more charging station options based on charging information of the one or more charging stations, specifically comprising:
the first electronic device determines one or more charging station options based on the charging information of the one or more charging stations and the charging information of the first electronic device.
33. The method of claim 31 or 32, wherein each of the one or more charging station options includes a charging price, a charging time, and an arrival distance, the charging price indicating the fee required to fully charge the first electronic device, the charging time indicating the time required to fully charge the first electronic device, and the arrival distance indicating the distance between the first electronic device and the charging station corresponding to the charging station option.
34. The method according to any one of claims 31-33, wherein the first electronic device receives charging information of one or more charging stations sent by the server, in particular comprising:
when the first electronic device detects a to-be-charged scene, the first electronic device receives the charging information of the one or more charging stations sent by the server, wherein the to-be-charged scene comprises a low-power scene and a parking lot scene; the low-power scene is a scene in which the electric quantity of the first electronic device is lower than a preset electric quantity threshold, and the parking lot scene is a scene in which the distance between the first electronic device and a nearby parking place is smaller than a specified distance threshold.
35. The method according to any one of claims 31-33, wherein the first electronic device receives charging information of one or more charging stations sent by the server, in particular comprising:
the first electronic device acquires destination information of a destination to which the user is going, wherein the destination information comprises a destination address and a route to the destination;
and after the first electronic device determines, according to the route to the destination, that the electric quantity of the first electronic device is lower than the electric quantity to be consumed in driving to the destination address, the first electronic device receives the charging information of the one or more charging stations sent by the server.
36. The method according to any one of claims 31-35, wherein the server sending the parking position information to the charging device specifically comprises:
the server sends a start charging request to the first electronic device;
the first electronic device receives the start charging request and displays a start charging control;
after receiving a fourth input for the start charging control, the first electronic device sends a start charging response to the server in response to the fourth input;
and the server receives the start charging response and sends the parking position information to the charging device.
37. The method of any one of claims 31-36, wherein the second communication system further comprises a second electronic device, and after the charging device reaches the position in the first charging station indicated by the parking position information and charges the first electronic device, the method further comprises:
the charging device sends vehicle charging information to the second electronic device, wherein the vehicle charging information comprises the electric quantity of the first electronic device;
and after receiving the vehicle charging information, the second electronic device displays vehicle charging prompt information, wherein the vehicle charging prompt information is used to prompt a user of the electric quantity of the first electronic device.
38. An electronic device, comprising: one or more processors, a display screen, and one or more memories; wherein the display screen and the one or more memories are coupled to the one or more processors, the one or more memories are configured to store computer program code, the computer program code comprises computer instructions, and when the computer instructions are executed by the one or more processors, the electronic device is caused to perform the method of any one of claims 1-37.
39. A computer-readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1-37.
CN202111667026.8A 2021-12-30 2021-12-30 Detection method and device Pending CN116416192A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111667026.8A CN116416192A (en) 2021-12-30 2021-12-30 Detection method and device
PCT/CN2022/141989 WO2023125431A1 (en) 2021-12-30 2022-12-26 Test method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111667026.8A CN116416192A (en) 2021-12-30 2021-12-30 Detection method and device

Publications (1)

Publication Number Publication Date
CN116416192A true CN116416192A (en) 2023-07-11

Family

ID=86997836

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111667026.8A Pending CN116416192A (en) 2021-12-30 2021-12-30 Detection method and device

Country Status (2)

Country Link
CN (1) CN116416192A (en)
WO (1) WO2023125431A1 (en)


Also Published As

Publication number Publication date
WO2023125431A1 (en) 2023-07-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination