CN115144013A - Driver detection method, electronic device, and storage medium - Google Patents

Driver detection method, electronic device, and storage medium

Info

Publication number
CN115144013A
Authority
CN
China
Prior art keywords
information
vehicle
movement information
wearable device
condition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211068203.5A
Other languages
Chinese (zh)
Inventor
万努梁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211068203.5A priority Critical patent/CN115144013A/en
Publication of CN115144013A publication Critical patent/CN115144013A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01D - MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D 21/00 - Measuring or testing not otherwise provided for
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 23/00 - Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration

Abstract

The embodiment of the application provides a driver detection method, an electronic device, and a storage medium, relating to the technical field of data processing. The method includes: obtaining movement information collected by a wearable device worn on an arm or hand of a user, where the user is located in a vehicle; determining, based on the movement information, whether the actual first traveling condition of the vehicle indicated by the movement information is the same as a second traveling condition, the second traveling condition being the theoretical traveling condition of the vehicle when its steering wheel rotates according to the rotation condition of the wearable device indicated by the movement information; and detecting, based on the determination result, whether the user is a driver. By applying the scheme provided by the embodiment of the application, it can be detected whether the user is a driver in the vehicle.

Description

A driver detection method, electronic device, and storage medium
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a driver detection method, an electronic device, and a storage medium.
Background
In a vehicle driving scenario, compared with a passenger, the driver of a vehicle needs to observe the environmental conditions near the vehicle, control the driving route of the vehicle, judge the congestion condition of the road, and so on; that is, the driver needs to acquire more information related to the vehicle and traffic. In order to better provide personalized services for the driver, it is necessary to detect whether a user is the driver of the vehicle.
Disclosure of Invention
In view of the above, the present application provides a driver detection method, an electronic device, and a storage medium to detect whether a user is a driver of a vehicle.
In a first aspect, an embodiment of the present application provides a driver detection method, where the method includes:
obtaining movement information collected by a wearable device worn on an arm or hand of a user, wherein the user is located in a vehicle;
determining, based on the movement information, whether the actual first traveling condition of the vehicle indicated by the movement information is the same as a second traveling condition, wherein the second traveling condition is the theoretical traveling condition of the vehicle when the steering wheel of the vehicle rotates according to the rotation condition of the wearable device indicated by the movement information;
detecting whether the user is a driver based on the judgment result.
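The three claimed steps can be sketched in code. The window format, the degree thresholds, and the convention that clockwise wrist rotation implies a right turn are illustrative assumptions, not details given in the claims:

```python
def actual_travel(window):
    """First traveling condition: the direction the vehicle actually moved,
    simplified here to a pre-computed heading change in degrees."""
    d = window["heading_change_deg"]
    if d > 5.0:
        return "right"
    if d < -5.0:
        return "left"
    return "straight"


def theoretical_travel(window):
    """Second traveling condition: the direction the vehicle would move if the
    steering wheel rotated as the wearable did (clockwise -> right turn)."""
    r = window["wrist_rotation_deg"]
    if r > 10.0:
        return "right"
    if r < -10.0:
        return "left"
    return "straight"


def detect_driver(windows, required_positives=3):
    """The user is judged to be the driver when enough windows show the actual
    and theoretical traveling conditions agreeing."""
    positives = sum(
        1 for w in windows if actual_travel(w) == theoretical_travel(w)
    )
    return positives >= required_positives
```

For a passenger, wrist rotation is uncorrelated with the vehicle's heading changes, so the two conditions rarely match and the positive count stays below the preset number.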
In one embodiment of the present application, the determining, based on the movement information, whether the actual first traveling condition of the vehicle indicated by the movement information is the same as the second traveling condition includes:
obtaining traveling information representing the actual first traveling condition of the vehicle based on the movement information;
obtaining rotation information representing the rotation condition of the wearable device based on the movement information;
determining whether the first traveling condition indicated by the traveling information is the same as a second traveling condition, wherein the second traveling condition is the theoretical traveling condition of the vehicle when the steering wheel of the vehicle rotates according to the rotation condition indicated by the rotation information.
In an embodiment of the application, in a case where the movement information includes multiple frames of information collected by the wearable device at different times, the obtaining, based on the movement information, of traveling information representing the actual first traveling condition of the vehicle includes:
for each frame of information contained in the movement information, obtaining direction information representing the traveling direction of the vehicle at the moment the wearable device collected that frame;
obtaining traveling information representing the actual first traveling condition of the vehicle according to the traveling directions of the vehicle represented by the pieces of direction information.
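One way to aggregate per-frame traveling directions into traveling information for a window is to sum the successive heading changes, handling wrap-around at the ±180° boundary; the heading representation below is an assumption for illustration, not the application's stated encoding:

```python
def travel_from_headings(headings_deg):
    """Aggregate per-frame vehicle headings (one per frame of movement
    information) into a net heading change for the window; positive means
    a net right turn under the convention used here."""
    total = 0.0
    for prev, cur in zip(headings_deg, headings_deg[1:]):
        # Shortest signed angular difference, handling wrap at +/-180 degrees.
        delta = (cur - prev + 180.0) % 360.0 - 180.0
        total += delta
    return total
```

For example, a heading sequence of 350° then 10° is correctly treated as a +20° change rather than a -340° one.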
In an embodiment of the application, the obtaining of the traveling information representing the actual first traveling condition of the vehicle based on the movement information includes:
obtaining traveling information representing the actual first traveling condition of the vehicle based on acceleration data, angular velocity data, and device-orientation data included in the movement information.
In an embodiment of the application, in a case where the movement information includes multiple frames of information collected by the wearable device at different times, the obtaining, based on the movement information, of rotation information representing the rotation condition of the wearable device includes:
for each frame of information contained in the movement information, obtaining posture information representing the posture of the wearable device at the moment that frame was collected;
obtaining rotation information representing the rotation condition of the wearable device according to the postures of the wearable device represented by the pieces of posture information.
In an embodiment of the application, the obtaining, based on the movement information, of rotation information representing the rotation condition of the wearable device includes:
obtaining rotation information representing the rotation condition of the wearable device based on the acceleration data and the angular velocity data contained in the movement information.
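A minimal sketch of deriving rotation information from the angular-velocity data: integrate the gyroscope samples about the axis assumed normal to the steering-wheel plane. The axis choice and fixed sample spacing are assumptions for illustration:

```python
def wrist_rotation_deg(gyro_dps, dt_s):
    """Net rotation of the wearable over a window, in degrees, from evenly
    spaced angular-velocity samples (in deg/s) about one axis.

    gyro_dps: iterable of angular-velocity samples.
    dt_s: time between consecutive samples, in seconds.
    """
    return sum(w * dt_s for w in gyro_dps)
```

In practice the acceleration data would additionally be used (e.g. via the gravity vector) to resolve the device's orientation so the correct rotation axis can be selected.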
In one embodiment of the present application, the determining, based on the movement information, whether the actual first traveling condition of the vehicle indicated by the movement information is the same as the second traveling condition includes:
inputting the movement information into a pre-trained traveling-condition judgment model and obtaining an output result, so as to determine whether the actual first traveling condition of the vehicle indicated by the movement information is the same as the second traveling condition;
wherein the output result indicates whether the first traveling condition is the same as the second traveling condition, and the traveling-condition judgment model is trained based on sample movement information.
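The judgment model itself is not specified beyond being trained on sample movement information; the stand-in below merely mimics its interface with a simple hand-tuned rule, where the steering gain and tolerance are invented values for illustration:

```python
class ThresholdJudgmentModel:
    """Stand-in for the pre-trained traveling-condition judgment model:
    outputs 1 (first and second conditions the same) when the vehicle
    heading change roughly matches the wrist rotation scaled by an assumed
    steering gain, and 0 otherwise."""

    def __init__(self, gain=0.5, tolerance_deg=10.0):
        self.gain = gain                  # assumed steering-ratio factor
        self.tolerance_deg = tolerance_deg

    def predict(self, heading_change_deg, wrist_rotation_deg):
        expected = self.gain * wrist_rotation_deg
        return 1 if abs(heading_change_deg - expected) <= self.tolerance_deg else 0
```

A trained model would replace this rule, but the calling convention (movement-derived features in, a same/different result out) stays the same.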
In one embodiment of the present application, the sample movement information includes at least one of the following kinds of information collected by a wearable device worn on a hand or arm of a user located in a vehicle: information collected by the wearable device while a driver user rotates the steering wheel during vehicle travel; information collected by the wearable device while a driver user rotates the steering wheel with the vehicle stopped; information collected by the wearable device worn by a driver user while the vehicle travels straight; and information collected by a wearable device worn by a non-driver user.
In an embodiment of the application, the detecting whether the user is a driver based on the judgment result includes:
determining that the user is a driver when the number of obtained positive judgment results reaches a preset number, wherein a positive judgment result indicates that the first traveling condition is the same as the second traveling condition.
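The preset-number rule can be kept as running state so that judgment results are counted as they arrive; the default count below is a hypothetical value, not one from the application:

```python
class DriverDetector:
    """Accumulates positive judgment results (first traveling condition equal
    to the second) and declares the user a driver once their count reaches a
    preset number."""

    def __init__(self, required_positives=3):
        self.required_positives = required_positives
        self.positives = 0

    def feed(self, judgment_is_positive):
        """Record one judgment result; returns True once the user is
        determined to be a driver."""
        if judgment_is_positive:
            self.positives += 1
        return self.positives >= self.required_positives
```

Requiring several positives rather than one makes the detection robust to a passenger's arm coincidentally rotating in the same direction as a single turn.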
In a second aspect, embodiments of the present application provide an electronic device, comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method of any one of the first aspect.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium includes a stored program, where the program, when executed, controls an apparatus in which the computer-readable storage medium is located to perform the method in any one of the first aspects.
When the technical scheme provided by the embodiment of the application is adopted to detect the driver, the user is located in the vehicle, so the wearable device worn by the user moves along with the vehicle, and the movement information of the wearable device can therefore represent the actual first traveling condition of the vehicle. In addition, the movement information can represent the rotation condition of the wearable device, from which the theoretical second traveling condition of the vehicle, that is, how the vehicle would travel if its steering wheel rotated according to the rotation condition of the wearable device, can be determined. Therefore, after the movement information collected by the wearable device worn on the arm or hand of the user is obtained, whether the user is a driver can be detected based on whether the first traveling condition and the second traveling condition are the same.
If the user is a driver, the user must rotate the steering wheel clockwise to make the vehicle turn right, so the wearable device worn on the user's arm or hand rotates clockwise along with the arm; likewise, the user must rotate the steering wheel counterclockwise to make the vehicle turn left, so the wearable device rotates counterclockwise along with the arm. Thus, if the user is a driver, the actual first traveling condition of the vehicle is the same as the second traveling condition corresponding to the rotation condition of the wearable device. If the user is not a driver, however, the movement of the user's arm or hand is unrelated to the traveling condition of the vehicle, and the first and second traveling conditions may differ. Therefore, whether the user is a driver can be detected by adopting the scheme provided by the embodiment of the application.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. The drawings in the following description are only some embodiments of the present application; those skilled in the art can obtain other drawings based on these drawings without inventive labor.
Fig. 1 is a schematic view of an electronic device according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart illustrating a first driver detection method according to an embodiment of the present disclosure;
FIG. 3A is a schematic flowchart of a second driver detection method according to an embodiment of the present disclosure;
fig. 3B is a flowchart for determining a vehicle traveling condition and a wearable device rotation condition according to an embodiment of the present disclosure;
FIG. 4 is a flowchart illustrating a third method for detecting a driver according to an embodiment of the present disclosure;
FIG. 5 is a schematic flowchart illustrating a fourth method for detecting a driver according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of a data processing process in a driver detection process according to an embodiment of the present application.
Detailed Description
For better understanding of the technical solutions of the present application, the following detailed descriptions of the embodiments of the present application are provided with reference to the accompanying drawings.
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish identical or similar items having substantially the same function and effect. For example, a first instruction and a second instruction merely distinguish different user instructions, and no order between them is implied. Those skilled in the art will appreciate that the terms "first," "second," and the like do not limit quantity or execution order, nor do they denote relative importance.
It is noted that the words "exemplary," "for example," and "such as" are used herein to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" is not to be construed as preferred or advantageous over other embodiments or designs. Rather, use of these words is intended to present related concepts in a concrete fashion.
The embodiment of the application can be applied to electronic devices such as tablet computers, personal computers (PCs), personal digital assistants (PDAs), smart watches, netbooks, wearable electronic devices, augmented reality (AR) devices, virtual reality (VR) devices, vehicle-mounted devices, smart cars, robots, smart glasses, and smart televisions.
As shown in fig. 1, fig. 1 is a schematic diagram of an electronic device according to an embodiment of the present disclosure. The electronic device shown in fig. 1 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150 (2G/3G/4G/5G), a wireless communication module 160 (Bluetooth (BT) / wireless local area network (WLAN) / near field communication (NFC) / infrared (IR) / frequency modulation (FM)), an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, cameras 1 to N 193, a display screen 194, and a subscriber identity module (SIM) card interface 195. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiments of the present application does not constitute a specific limitation to electronic devices. In other embodiments of the present application, an electronic device may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be independent devices or may be integrated in one or more processors.
The processor 110 may generate operation control signals according to the instruction operation code and the timing signal, so as to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, and a subscriber identity module (SIM) interface.
The I2C interface is a bidirectional synchronous Serial bus, and includes a Serial Data Line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the electronic device.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of receiving a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing, and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. The audio module 170 may transmit the acquired downstream and upstream audio stream data, through the wireless communication module 160, to another electronic device wirelessly connected to this one.
In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 and the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the UART interface, so as to achieve the function of obtaining the downstream audio stream through the electronic device connected via bluetooth.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI Interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device.
It should be understood that the connection relationship between the modules illustrated in the embodiments of the present application is only an exemplary illustration, and does not limit the structure of the electronic device. In other embodiments of the present application, the electronic device may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in an electronic device may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the electronic device. In some embodiments, the transmission of call data between two electronic devices may be accomplished through the mobile communication module 150; for example, as a called-party device, downstream data from a calling-party device may be obtained, and upstream data may be transmitted to the calling-party device.
The wireless communication module 160 may provide solutions for wireless communication applied to electronic devices, including wireless local area networks (WLANs), such as wireless fidelity (Wi-Fi) networks, Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR).
In some embodiments, antenna 1 of the electronic device is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device can communicate with the network and other devices through wireless communication technologies. In one embodiment of the present application, the electronic device may establish a local area network connection with another electronic device through the wireless communication module 160. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or the satellite-based augmentation system (SBAS).
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, or a quantum dot light-emitting diode (QLED). In some embodiments, the electronic device may include 1 or N display screens 194, N being a positive integer greater than 1.
The external Memory interface 120 may be used to connect an external Memory card, such as a Micro Secure Digital (SD) card, to expand the storage capability of the electronic device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Files such as music, video, audio files, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system and application programs required by at least one function (such as a sound playing function, an image playing function, or a recording function). The data storage area can store data created during use of the electronic device (such as uplink audio data, downlink audio data, and a phone book). In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). By executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor 110, the processor 110 performs various functional applications of the electronic device and data processing.
The electronic device may implement a call conflict handling function and the like through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
The audio module 170 is used to convert digital audio information into analog audio signals for output, and also used to convert analog audio inputs into digital audio signals. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the electronic device answers the call or voice information, the voice transmitted by the caller's device can be heard through the receiver 170B.
The microphone 170C, also referred to as a "mic," is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 170C, inputting a sound signal into it and thereby enabling collection of the uplink audio stream.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. In some embodiments, a manual call-answer function may be implemented when the user presses an answer key on the display screen 194, and a manual hang-up function may be implemented when the user presses a hang-up key on the display screen 194.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor 180K may pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device at a different position than the display screen 194.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic device may receive a key input, and generate a key signal input related to user settings and function control of the electronic device.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenarios (such as time reminders, received messages, alarm clocks, and games) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to or detached from the electronic device by being inserted into or pulled out of the SIM card interface 195. The electronic device can support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time, and the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards and with external memory cards. The electronic device implements functions such as calls and data communication through interaction between the SIM card and the network. In some embodiments, the electronic device employs an eSIM, namely an embedded SIM card; the eSIM card can be embedded in the electronic device and cannot be separated from it.
The scheme provided by the embodiment of the application can be started after it is determined that the wearable device is worn on an arm or a hand of the user and that the user is located in a vehicle. Specifically, the wearable device can detect whether it is worn on an arm or a hand of the user.
In an embodiment of the application, the wearable device can detect, through its temperature sensor, whether the temperature near the wearable device is within the human body temperature range; if so, it is determined that the wearable device is worn on an arm or a hand of a user. Alternatively, the wearable device can determine whether its pulse and heart rate sensor can acquire pulse and heart rate data; if such data can be acquired, it is determined that the wearable device is worn on the arm or hand of the user.
In addition, the wearable device can acquire its current acceleration data through a built-in accelerometer. If the acceleration data is greater than a preset threshold, it is determined that the wearable device is located in a vehicle, and if it has also been determined that the wearable device is worn on an arm or a hand of the user, the wearable device can start to execute the scheme provided by the embodiment of the application.
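As an illustration, the wear and in-vehicle pre-checks described above can be sketched as follows; the helper names, the skin-temperature range, and the acceleration threshold are assumptions for illustration, not values from this application.

```python
# Hypothetical pre-check sketch; the thresholds and function names are
# assumptions, not specified by the application.

HUMAN_TEMP_RANGE = (35.0, 40.0)   # assumed skin-temperature range, Celsius
VEHICLE_ACCEL_THRESHOLD = 1.5     # assumed acceleration threshold, m/s^2

def is_worn(skin_temp, heart_rate):
    """Device counts as worn if the temperature is body-like or a pulse reading exists."""
    in_temp_range = HUMAN_TEMP_RANGE[0] <= skin_temp <= HUMAN_TEMP_RANGE[1]
    has_pulse = heart_rate is not None
    return in_temp_range or has_pulse

def is_in_vehicle(accel_magnitude):
    """Acceleration above the preset threshold is taken as evidence of riding in a vehicle."""
    return accel_magnitude > VEHICLE_ACCEL_THRESHOLD

def should_start_detection(skin_temp, heart_rate, accel_magnitude):
    """Driver detection starts only when both pre-checks pass."""
    return is_worn(skin_temp, heart_rate) and is_in_vehicle(accel_magnitude)
```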
In order to detect whether a user is a driver, embodiments of the present application provide a driver detection method, which may be applied to the electronic device shown in fig. 1 or a wearable device.
If the method is applied to the electronic device shown in fig. 1, a communication connection exists between the electronic device and the wearable device. After the wearable device determines that it is worn on an arm or a hand of a user and that the user is located in a vehicle, it starts to send the collected movement information to the electronic device. The electronic device may then execute the scheme provided by the embodiment of the application to detect whether the user is a driver. If the user is determined to be the driver, the electronic device may send information related to the vehicle and traffic required by the driver to the wearable device, so that the wearable device can show that information to the user.
If the method is applied to the wearable device, then after the wearable device determines that it is worn on an arm or a hand of a user and that the user is located in a vehicle, the scheme provided by the embodiment of the application can be executed directly: the movement information collected by the wearable device is obtained, and whether the user is a driver is detected. If the user is determined to be the driver, the wearable device can automatically generate information related to the vehicle and traffic, or request such information required by the driver from a server or another electronic device, and display the requested information to the user.
For example, if the execution subject of the embodiment of the present application detects that the user wearing the wearable device is the driver, and later detects that the user has changed from a riding state to a walking state, it can determine that the user has gotten off the vehicle and started walking, and may then provide the parking position information of the vehicle to the user. Specifically, when the execution subject of the embodiment of the present application is the electronic device shown in fig. 1, the electronic device may transmit the parking position information to the wearable device worn by the driver; when the execution subject is the wearable device, the wearable device may generate information indicating its current position as the parking position information.
Alternatively, if the execution subject of the embodiment of the application detects that the user wearing the wearable device is a driver, it may provide the user with road condition information and recommended route information for road segments near the vehicle, and may also provide epidemic situation area information near the vehicle, so that the driver can detour in advance. However, if the user wearing the wearable device is not a driver, the road condition information, recommended route information, and epidemic situation area information need not be provided. Specifically, when the execution subject of the embodiment of the present application is the electronic device shown in fig. 1, the electronic device may send the road condition information, the recommended route information, and the epidemic situation area information to the wearable device worn by the driver; when the execution subject is the wearable device, the wearable device may request this information from the server or another electronic device.
As shown in fig. 2, a schematic flowchart of a first driver detection method provided in an embodiment of the present application includes the following steps S201 to S203.
S201: movement information collected by a wearable device worn on an arm or hand of a user is obtained.
Wherein the user is located in a vehicle.
The wearable device can be a watch, a bracelet, a ring, and the like. If the execution subject of the application is not the wearable device, the wearable device may send the movement information to the execution subject after collecting it; if the execution subject of the application is the wearable device, the wearable device can use the movement information directly after collecting it.
Specifically, the movement information may include acceleration data, angular velocity data, and wearable device orientation data, and the wearable device may include an accelerometer, a gyroscope, and a magnetometer, which are respectively configured to collect the acceleration data, the angular velocity data, and the wearable device orientation data. In addition, other devices capable of measuring movement data in the prior art can be further installed in the wearable device, and the movement information can also include movement data acquired by the devices.
The data collected by the accelerometer, the gyroscope, and the magnetometer each comprise data for three different axes.
In an embodiment of the application, the wearable device may continuously collect the movement information. The movement information collected by the wearable device at one time may be referred to as a frame of movement information, and each frame of movement information may represent the movement condition of the wearable device at the corresponding time. The movement information obtained in the scheme provided by the embodiment of the application may be a preset number of frames of movement information continuously collected by the wearable device.
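The framing just described can be sketched as a simple buffer; the window size of 50 frames is an assumed value, since the application only specifies "a preset number" of frames.

```python
from collections import deque

N_FRAMES = 50  # assumed window size; the application only says "a preset number"

class FrameBuffer:
    """Collects consecutive frames of movement information; a full window of
    N_FRAMES frames is what one detection step consumes."""
    def __init__(self, size=N_FRAMES):
        self.frames = deque(maxlen=size)

    def push(self, frame):
        """Append the latest frame, discarding the oldest once the buffer is full."""
        self.frames.append(frame)

    def window(self):
        """Return a full window of frames, or None until enough frames arrive."""
        if len(self.frames) == self.frames.maxlen:
            return list(self.frames)
        return None
```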
S202: based on the movement information, it is determined whether the first traveling situation of the vehicle indicated by the movement information is the same as the second traveling situation.
Wherein the second travel condition is the theoretical traveling condition of the vehicle when the steering wheel of the vehicle rotates according to the rotation condition of the wearable device represented by the movement information.
Since the user is located in the vehicle and the wearable device is worn on the arm or hand of the user, the wearable device moves along with the movement of the vehicle, and therefore the movement information collected by the wearable device can represent the actual first travel situation of the vehicle.
The first travel condition may include left turn of the vehicle, right turn of the vehicle, and straight travel of the vehicle.
In addition, because the wearable device is worn on the arm or hand of the user, the rotation condition of the wearable device can reflect the rotation condition of the arm or hand. If the user is the driver of the vehicle, the rotation of the user's arm or hand drives the steering wheel of the vehicle to rotate, thereby influencing the traveling condition of the vehicle. The rotation condition of the wearable device may include clockwise rotation and counterclockwise rotation. If the wearable device rotates clockwise, the arm or hand of the user rotates clockwise and drives the steering wheel to rotate clockwise, so theoretically the second traveling condition of the vehicle is a right turn; conversely, if the wearable device rotates counterclockwise, the arm or hand rotates counterclockwise and drives the steering wheel to rotate counterclockwise, so theoretically the second traveling condition of the vehicle is a left turn.
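The correspondence just described between the wearable device's rotation direction and the theoretical second traveling condition can be written as a small lookup; the label strings are illustrative, not terms from the application.

```python
# Mapping from the detected wrist-rotation direction to the theoretical
# (second) traveling condition, following the reasoning above. The string
# labels are illustrative assumptions.

ROTATION_TO_TRAVEL = {
    "clockwise": "right_turn",        # clockwise wrist -> steering wheel clockwise -> right turn
    "counterclockwise": "left_turn",  # counterclockwise wrist -> wheel counterclockwise -> left turn
    "other": "straight_or_unknown",   # no regular rotation detected
}

def second_travel_condition(rotation):
    """Return the theoretical traveling condition implied by the rotation condition."""
    return ROTATION_TO_TRAVEL.get(rotation, "straight_or_unknown")
```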
In one embodiment of the present application, the step S202 can be implemented by steps S202A-S202C shown in fig. 3A, which will not be described in detail herein.
In another embodiment of the present application, the step S202 can be implemented by the step S202D shown in fig. 4, and will not be described in detail here.
S203: and detecting whether the user is a driver or not based on the judgment result.
Specifically, if the judgment result indicates that the actual first traveling condition of the vehicle is the same as the theoretical second traveling condition, the turning of the user's hand did indeed cause the vehicle's direction to change and the vehicle to turn, so it can be determined that the user is the driver; otherwise, it is determined that the user is not the driver.
In addition, in the embodiment of the present application, step S203 can be realized by step S203A shown in fig. 5, which is not described in detail herein.
As can be seen from the above, when the technical solution provided by the embodiment of the present application is used for driver detection, since the user is located in the vehicle, the wearable device worn by the user moves along with the vehicle, and the movement information of the wearable device can indicate the actual first traveling condition of the vehicle. Therefore, after the movement information collected by the wearable device worn on the arm or hand of the user is obtained, whether the user is a driver can be detected based on whether the first travel condition and the second travel condition are the same.
If the user is a driver, the user needs to rotate the steering wheel clockwise to make the vehicle turn right, so the wearable device worn on the user's arm or hand rotates clockwise along with the arm; likewise, the user needs to rotate the steering wheel counterclockwise to make the vehicle turn left, so the wearable device rotates counterclockwise along with the arm. Thus, if the user is a driver, the actual first travel situation of the vehicle is the same as the second travel situation corresponding to the rotation situation of the wearable device. However, if the user is not the driver, the movement of the user's arm or hand is unrelated to the traveling situation of the vehicle, and the first and second traveling situations may differ. Therefore, whether the user is a driver can be detected by adopting the scheme provided by the embodiment of the application.
Referring to fig. 3A, a schematic flowchart of a second driver detection method provided in the embodiment of the present application, compared with the foregoing embodiment shown in fig. 2, the foregoing step S202 may be implemented by the following steps S202A to S202C.
S202A: travel information indicating an actual first travel situation of the vehicle is obtained based on the movement information.
Specifically, the travel information indicating the actual first travel situation of the vehicle may be obtained based on acceleration data, angular velocity data, and wearable device orientation data included in the movement information.
In an embodiment of the application, in a case where the movement information includes multi-frame information collected by the wearable device at different times, the travel information may be obtained through the following steps a to B.
Step A: and obtaining, for each frame of information included in the movement information, direction information indicating a traveling direction of the vehicle at a time when the wearable device collects the frame of information.
Specifically, each frame of information may be input into the travel direction determination model, and the output result may be obtained as the direction information at different times. The travel direction determination model is used to detect the travel direction of the vehicle, and the travel direction determination model may be a machine learning model trained using sample movement information that is known to indicate the travel direction of the vehicle.
In addition, other algorithms in the prior art may also be used to determine the traveling direction of the vehicle, which is not limited in the embodiment of the present application.
And B: travel information indicating an actual first travel situation of the vehicle is obtained based on the travel direction of the vehicle indicated by each piece of direction information.
After the respective direction information is obtained, the traveling direction of the vehicle at different time can be determined, and the first traveling condition of the vehicle can be determined according to the change condition of the traveling direction of the vehicle, so that the traveling information can be obtained.
For example, if three pieces of direction information are obtained in step a, which respectively indicate the traveling direction of the vehicle from front to back at three times a-c, the traveling direction of the vehicle at time a is north, the traveling direction of the vehicle at time b is northeast, and the traveling direction of the vehicle at time c is east, it can be determined that the vehicle is traveling in a right turn, and traveling information indicating that the vehicle is turning right is obtained.
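As a sketch of steps A and B, the change between the first and last heading can be used to classify the turn. Representing headings in degrees (0 = north, increasing clockwise) and the 20-degree turn threshold are assumptions for illustration; in the example above, north, then northeast, then east yields a right turn.

```python
# Illustrative sketch of step B: classify the vehicle's traveling condition
# from a sequence of headings in degrees (0 = north, increasing clockwise).
# The turn threshold is an assumed value.

TURN_THRESHOLD_DEG = 20.0

def classify_travel(headings):
    """Return 'right_turn', 'left_turn', or 'straight' from the signed
    first-to-last heading change, wrapped into (-180, 180]."""
    delta = (headings[-1] - headings[0] + 180.0) % 360.0 - 180.0
    if delta > TURN_THRESHOLD_DEG:
        return "right_turn"
    if delta < -TURN_THRESHOLD_DEG:
        return "left_turn"
    return "straight"
```

For the worked example in the text, `classify_travel([0.0, 45.0, 90.0])` (north, northeast, east) yields a right turn.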
In another embodiment of the present application, the traveling condition of the vehicle may also be directly determined based on the obtained movement information including the multiframe information. Specifically, all the information may be input into a pre-trained traveling condition detection model to obtain an output result as traveling information. The travel situation detection model is used to detect the travel situation of the vehicle, and the travel situation detection model may be a machine learning model trained using sample movement information that is known to indicate the travel situation of the vehicle.
In addition, other algorithms in the prior art may also be used to determine the traveling condition of the vehicle, which is not limited in the embodiment of the present application.
S202B: and obtaining rotation information representing the rotation condition of the wearable device based on the movement information.
Specifically, the rotation information indicating the rotation of the wearable device may be obtained based on acceleration data and angular velocity data included in the movement information.
The rotation conditions include three cases: clockwise rotation of the wearable device, counterclockwise rotation of the wearable device, and other conditions besides clockwise and counterclockwise rotation. For example, the other conditions include the wearable device not moving, moving irregularly, and so on.
In an embodiment of the application, in a case that the movement information includes multi-frame information collected by the wearable device at different times, the rotation information may be obtained through the following steps C to D.
Step C: and acquiring attitude information indicating the attitude of the wearable device at the moment when the wearable device collects the frame information, for each frame of information contained in the movement information.
The gesture of the wearable device comprises right movement, left movement, upward movement, downward movement, no movement and the like of the wearable device.
Specifically, each frame of information may be input into the posture determination model, and output results may be obtained as posture information at different times. The posture determination model is used to detect the posture of the wearable device, and it may be a machine learning model trained with sample movement information whose represented posture of the wearable device is known.
In addition, other algorithms in the prior art may also be used to determine the posture of the wearable device, which is not limited in this application embodiment.
Step D: and obtaining rotation information representing the rotation condition of the wearable device according to the postures of the wearable device represented by the posture information.
After the posture information is obtained respectively, the postures of the wearable device at different moments can be determined, and the rotation condition of the wearable device can be determined according to the change condition of the postures of the wearable device, so that the rotation information is obtained.
For example, three pieces of posture information are obtained in step C, representing the postures of the wearable device at three times a to c from front to back: at time a the wearable device moves to the right, at time b it moves to the right and downward, and at time c it moves downward. It can then be determined that the current rotation condition of the wearable device is clockwise rotation, and rotation information representing clockwise rotation of the wearable device is obtained.
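A minimal sketch of step D, under the assumption that postures are reduced to motion-direction labels on a clockwise cycle (so "right" followed by "down" reads as clockwise, matching the example above); a real system would use the output of the posture determination model.

```python
# Illustrative sketch of step D: infer the wearable's rotation direction from
# successive motion-direction labels. The label vocabulary and the clockwise
# ordering are assumptions for illustration.

CLOCKWISE_ORDER = ["up", "right", "down", "left"]  # clockwise cycle of directions

def _cycle_step(a, b):
    """+1 if b follows a clockwise on the cycle, -1 if counterclockwise, 0 otherwise."""
    if a not in CLOCKWISE_ORDER or b not in CLOCKWISE_ORDER:
        return 0
    diff = (CLOCKWISE_ORDER.index(b) - CLOCKWISE_ORDER.index(a)) % 4
    return {1: 1, 3: -1}.get(diff, 0)

def classify_rotation(poses):
    """Sum the per-step directions; positive means clockwise overall."""
    score = sum(_cycle_step(a, b) for a, b in zip(poses, poses[1:]))
    if score > 0:
        return "clockwise"
    if score < 0:
        return "counterclockwise"
    return "other"
```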
In another embodiment of the present application, the rotation condition of the wearable device may also be determined directly from the obtained movement information containing the multi-frame information. Specifically, all of the information may be input into a pre-trained rotation condition detection model, and an output result is obtained as the rotation information. The rotation condition detection model is used to detect the rotation condition of the wearable device, and it may be a machine learning model trained with sample movement information whose represented rotation condition of the wearable device is known.
In addition, other algorithms in the prior art may also be used to determine the movement of the wearable device, which is not limited in the embodiment of the present application.
S202C: and judging whether the first traveling condition and the second traveling condition indicated by the traveling information are the same or not.
Wherein the second travel condition is the theoretical traveling condition of the vehicle when the steering wheel of the vehicle rotates according to the rotation condition indicated by the rotation information.
Referring to fig. 3B, a flowchart for obtaining the travel information and the rotation information is provided according to an embodiment of the present disclosure.
As can be seen from the figure, the traveling directions of the vehicle at different times (traveling direction 1, traveling direction 2, ..., traveling direction n) are determined from the accelerometer data, gyroscope data, and magnetometer data contained in the different frames of the movement information, and the postures of the wearable device at different times (posture 1, posture 2, ..., posture n) are determined from the accelerometer data and gyroscope data contained in those frames. The traveling condition of the vehicle (left turn, right turn, or straight travel) is then determined from traveling directions 1 through n to obtain the travel information, and the rotation condition of the wearable device (clockwise rotation, counterclockwise rotation, or other) is determined from postures 1 through n to obtain the rotation information.
As can be seen from the above, in the solution provided in the embodiment of the present application, the travel information indicating the first travel condition of the vehicle and the rotation information indicating the rotation condition of the wearable device may be obtained separately according to the movement information, and on this basis, it may be determined whether the first travel condition indicated by the travel information and the second travel condition of the vehicle when the steering wheel of the vehicle rotates according to the rotation condition indicated by the rotation information are the same, so as to obtain the determination result.
Referring to fig. 4, a flow chart of a third driver detection method provided in the embodiment of the present application is schematically illustrated, and compared with the foregoing embodiment shown in fig. 2, the foregoing step S202 may be implemented by the following step S202D.
S202D: and inputting the movement information into a pre-trained traveling condition judgment model, obtaining an output result, and determining whether the actual first traveling condition of the vehicle represented by the movement information is the same as the second traveling condition.
Wherein the output result indicates whether the first traveling condition is the same as the second traveling condition.
Specifically, the travel condition determination model may be a machine learning model, and the output result may include a first output result indicating that the first travel condition is the same as the second travel condition and a second output result indicating that the first travel condition is different from the second travel condition. That is, the traveling situation determination model may output two different output results, and the traveling situation determination model is a binary model.
The travel situation determination model is trained based on sample movement information for which it is known whether the represented first travel situation is the same as the second travel situation. The sample movement information may include at least one of the following types of information, each collected by a wearable device worn on a hand or arm of a user located in a vehicle: information collected by the wearable device when a driver user rotates the steering wheel while the vehicle is traveling, information collected by the wearable device when a driver user rotates the steering wheel while the vehicle is stopped, information collected by a wearable device worn by a driver user while the vehicle keeps traveling straight, and information collected by a wearable device worn by a non-driver user.
The information collected by the wearable device when a driver user rotates the steering wheel while the vehicle is traveling represents a first traveling condition that is the same as the second traveling condition, and such information may be called forward samples. The movement information collected when a driver user rotates the steering wheel while the vehicle is stopped, the movement information collected by a wearable device worn by a driver user while the vehicle keeps traveling straight, and the movement information collected by a wearable device worn by a non-driver user each represent a first traveling condition that differs from the second traveling condition, and such information may be called reverse samples.
Because both forward samples and reverse samples are used when training the traveling condition determination model, the trained model can recognize movement information whose represented first traveling condition is the same as the second traveling condition, as well as movement information whose represented first traveling condition differs from the second traveling condition, so the output result of the trained traveling condition determination model is accurate.
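To make the role of forward and reverse samples concrete, the following is a toy stand-in for the traveling condition determination model: a nearest-centroid classifier fitted on flattened sensor windows, outputting 1 for windows resembling forward samples (first condition same as second) and 0 for windows resembling reverse samples. This is only an illustrative sketch; the application leaves the machine learning model itself unspecified.

```python
import math

def flatten_window(frames):
    """Turn an N-frame window of multi-axis samples into one flat feature list."""
    return [x for frame in frames for x in frame]

def centroid(vectors):
    """Elementwise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance(a, b):
    """Euclidean distance between two vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class ConditionModel:
    """Toy binary classifier: 1 = first and second traveling conditions match
    (forward sample), 0 = they differ (reverse sample)."""
    def fit(self, forward_windows, reverse_windows):
        self.fwd = centroid([flatten_window(w) for w in forward_windows])
        self.rev = centroid([flatten_window(w) for w in reverse_windows])
        return self

    def predict(self, window):
        v = flatten_window(window)
        return int(distance(v, self.fwd) <= distance(v, self.rev))
```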
As can be seen from the above, in the embodiment of the present application, the movement information is directly input into the travel situation determination model to determine whether the first travel situation is the same as the second travel situation, and it is not necessary to actually determine the first travel situation of the vehicle and the rotation situation of the wearable device, which are indicated by the movement information, so that the manner of obtaining the determination result is simplified.
Referring to fig. 5, a schematic flow chart of a fourth driver detection method provided in the embodiment of the present application, compared with the foregoing embodiment shown in fig. 2, the foregoing step S203 can be implemented by the following step S203A.
S203A: and determining the user as a driver when the number of the obtained forward judgment results reaches a preset number.
Wherein the positive determination result indicates that the first traveling condition is the same as the second traveling condition.
The scheme provided by the embodiment of the application may execute steps S201 to S202 cyclically: movement information is continuously obtained, one judgment is performed each time movement information is obtained, and one judgment result is produced. If the accumulated number of forward judgment results reaches a preset number, the user is determined to be a driver.
Specifically, every time the forward determination result is obtained, 1 is added to the cumulative number of the current forward determination result, and the initial value of the cumulative number is 0.
In addition, a statistical period may be preset. If the number of forward judgment results obtained within the statistical period reaches the preset number, the user is determined to be a driver; if it does not, the user is determined not to be a driver, and execution of the embodiment of the present application is stopped.
Moreover, the user may be determined to be the driver only when a preset number of consecutively obtained judgment results are all forward judgment results; otherwise, the embodiment of the present application continues to be executed to detect whether the user is a driver.
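The "consecutive results" variant just described can be sketched as a small counter; the preset count of 3 is an assumed value.

```python
# Illustrative sketch of step S203A, "consecutive forward results" variant.
# The preset count is an assumed value, not specified by the application.

PRESET_COUNT = 3

class DriverVoter:
    """Declares the user a driver once PRESET_COUNT consecutive forward
    (first condition == second condition) judgment results are observed."""
    def __init__(self, preset=PRESET_COUNT):
        self.preset = preset
        self.streak = 0

    def update(self, forward):
        """Feed one judgment result; a non-forward result resets the streak."""
        self.streak = self.streak + 1 if forward else 0
        return self.streak >= self.preset
```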
As can be seen from the above, in the solution provided in the embodiment of the present application, the user is determined to be a driver only when the number of obtained forward judgment results reaches the preset number, rather than as soon as a single forward judgment result is obtained. This avoids detection errors caused by accidental conditions and improves the accuracy of the detection result obtained by the scheme provided by the embodiment of the application.
Referring to fig. 6, a schematic diagram of a data processing process in a driver detection process according to an embodiment of the present application is provided.
As can be seen from the figure, in the scheme provided in the embodiment of the present application, three types of data are used as the movement information: data acquired by an accelerometer, data acquired by a gyroscope, and data acquired by a magnetometer. An N-frame sliding window is adopted, so that each detection uses N frames of accelerometer data, N frames of gyroscope data, and N frames of magnetometer data to detect whether the first traveling condition and the second traveling condition of the vehicle represented by the movement information are the same. If the number of obtained forward judgment results reaches the preset number, the user is determined to be the driver.
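The overall pipeline in the figure (N-frame sliding window, per-window judgment, forward-result count) can be sketched end to end as follows; `judge_window` stands in for the traveling condition determination model and is supplied by the caller, so this is a structural sketch rather than the application's actual implementation.

```python
from collections import deque

def detect_driver(frames, n, preset_count, judge_window):
    """Slide an n-frame window over the sensor stream, judge each full window,
    and declare the user a driver once preset_count forward results accumulate.
    judge_window(window) -> bool stands in for the determination model."""
    window = deque(maxlen=n)
    forward = 0
    for frame in frames:
        window.append(frame)
        if len(window) == n and judge_window(list(window)):
            forward += 1
            if forward >= preset_count:
                return True
    return False
```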
In a specific implementation manner, the present application further provides a computer storage medium, where the computer storage medium may store a program, and when the program runs, the computer storage medium controls a device in which the computer readable storage medium is located to perform some or all of the steps in the foregoing embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM) or a Random Access Memory (RAM).
In a specific implementation, an embodiment of the present application further provides a computer program product, where the computer program product includes executable instructions, and when the executable instructions are executed on a computer, the computer is caused to perform some or all of the steps in the foregoing method embodiments.
Embodiments of the mechanisms disclosed herein may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the application may be implemented as computer programs or program code executing on programmable systems comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For purposes of this application, a processing system includes any system having a processor such as, for example, a digital signal processor (DSP), a microcontroller, an application-specific integrated circuit (ASIC), or a microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. The program code can also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in this application are not limited in scope to any particular programming language. In any case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed via a network or via other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or tangible machine-readable memory used for transmitting information over the Internet via electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some structural or method features may be shown in a particular arrangement and/or order. However, it should be understood that such a specific arrangement and/or order may not be required. Rather, in some embodiments, these features may be arranged in a manner and/or order different from that shown in the figures. In addition, the inclusion of a structural or method feature in a particular figure does not imply that such a feature is required in all embodiments; in some embodiments, the feature may be omitted or may be combined with other features.
It should be noted that, in the device embodiments of the present application, each unit/module is a logical unit/module. Physically, one logical unit/module may be one physical unit/module, may be part of one physical unit/module, or may be implemented by a combination of multiple physical units/modules; the physical implementation of the logical units/modules themselves is not essential, and it is the combination of functions implemented by these logical units/modules that solves the technical problem addressed by the present application. Furthermore, in order to highlight the innovative part of the present application, the above device embodiments do not introduce units/modules that are less closely related to solving the technical problem addressed by the present application; this does not mean that no other units/modules exist in the above device embodiments.
It is noted that, in the examples and description of this patent, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present application.

Claims (11)

1. A driver detection method, characterized in that the method comprises:
obtaining movement information collected by a wearable device worn on an arm or a hand of a user, wherein the user is located in a vehicle;
determining, based on the movement information, whether a first travel condition, which is the actual travel condition of the vehicle represented by the movement information, is the same as a second travel condition, wherein the second travel condition is the theoretical travel condition of the vehicle when a steering wheel of the vehicle rotates according to the rotation of the wearable device represented by the movement information; and
detecting, based on a result of the determination, whether the user is a driver.
2. The method according to claim 1, wherein the determining, based on the movement information, whether the first travel condition of the vehicle represented by the movement information is the same as the second travel condition comprises:
obtaining, based on the movement information, travel information representing the first travel condition, i.e., the actual travel condition of the vehicle;
obtaining, based on the movement information, rotation information representing the rotation of the wearable device; and
determining whether the first travel condition represented by the travel information is the same as the second travel condition, wherein the second travel condition is the theoretical travel condition of the vehicle when the steering wheel of the vehicle rotates according to the rotation represented by the rotation information.
3. The method according to claim 2, wherein, in a case where the movement information includes multiple frames of information collected by the wearable device at different times, the obtaining, based on the movement information, of the travel information representing the first travel condition of the vehicle comprises:
for each frame of information contained in the movement information, obtaining direction information representing the travel direction of the vehicle at the time the wearable device collected that frame; and
obtaining the travel information representing the first travel condition of the vehicle according to the travel directions of the vehicle represented by the pieces of direction information.
4. The method according to claim 2, wherein the obtaining, based on the movement information, of the travel information representing the first travel condition of the vehicle comprises:
obtaining the travel information representing the first travel condition of the vehicle based on acceleration data, angular velocity data, and orientation data of the wearable device included in the movement information.
5. The method according to claim 2, wherein, in a case where the movement information includes multiple frames of information collected by the wearable device at different times, the obtaining, based on the movement information, of the rotation information representing the rotation of the wearable device comprises:
for each frame of information contained in the movement information, obtaining attitude information representing the attitude of the wearable device at the time the wearable device collected that frame; and
obtaining the rotation information representing the rotation of the wearable device according to the attitudes of the wearable device represented by the pieces of attitude information.
6. The method according to claim 2, wherein the obtaining, based on the movement information, of the rotation information representing the rotation of the wearable device comprises:
obtaining the rotation information representing the rotation of the wearable device based on acceleration data and angular velocity data included in the movement information.
7. The method according to claim 1, wherein the determining, based on the movement information, whether the first travel condition of the vehicle represented by the movement information is the same as the second travel condition comprises:
inputting the movement information into a pre-trained travel condition determination model and obtaining an output result, thereby determining whether the first travel condition of the vehicle represented by the movement information is the same as the second travel condition;
wherein the output result indicates whether the first travel condition is the same as the second travel condition, and the travel condition determination model is trained based on sample movement information.
8. The method according to claim 7, wherein the sample movement information comprises at least one of the following kinds of information, each collected by a wearable device worn on a hand or arm of a user located in a vehicle: information collected by the wearable device when a driver user rotates the steering wheel while the vehicle is traveling; information collected by the wearable device when the driver user rotates the steering wheel while the vehicle is stopped; information collected by the wearable device worn by the driver user while the vehicle travels straight; and information collected by the wearable device worn by a non-driver user.
9. The method according to any one of claims 1-8, wherein the detecting, based on the result of the determination, whether the user is a driver comprises:
determining that the user is a driver when the number of positive determination results obtained reaches a preset number, wherein a positive determination result indicates that the first travel condition is the same as the second travel condition.
10. An electronic device comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method of any of claims 1-9.
11. A computer-readable storage medium, comprising a stored program, wherein the program, when executed, controls an apparatus in which the computer-readable storage medium is located to perform the method of any of claims 1-9.
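The comparison at the core of claims 1, 2, and 9 can be sketched in Python. This is an illustrative sketch only: the `Frame` structure, the `steering_ratio`, the `tolerance_deg` match threshold, and the `required_matches` count are assumptions introduced for the example, not values from the patent. The sketch derives the vehicle's actual heading change between frames (the first travel condition), predicts the theoretical heading change implied by the wrist rotation of the wearable device (the second travel condition), and declares the user a driver once a preset number of frames agree.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Frame:
    """One frame of processed movement information (hypothetical layout).

    heading_deg: vehicle travel direction, e.g. derived from acceleration,
        angular velocity, and orientation data (claims 3-4).
    wrist_rotation_deg: rotation of the wearable device about the steering
        column axis, e.g. derived from attitude data (claims 5-6).
    """
    heading_deg: float
    wrist_rotation_deg: float


def is_driver(frames: List[Frame],
              steering_ratio: float = 15.0,
              tolerance_deg: float = 5.0,
              required_matches: int = 3) -> bool:
    """Return True once enough frames show the vehicle turning as the
    steering wheel (inferred from wrist rotation) says it should."""
    matches = 0
    for prev, cur in zip(frames, frames[1:]):
        # First travel condition: the actual heading change of the vehicle.
        actual_turn = cur.heading_deg - prev.heading_deg
        # Second travel condition: the theoretical heading change implied
        # by the wrist rotation, through an assumed steering ratio.
        predicted_turn = cur.wrist_rotation_deg / steering_ratio
        # A "positive determination result" (claim 9): both conditions agree.
        if abs(actual_turn - predicted_turn) <= tolerance_deg:
            matches += 1
        if matches >= required_matches:
            return True
    return False
```

For example, frames whose wrist rotation (75 degrees at a 15:1 ratio) predicts the 5-degree-per-frame heading change actually observed would yield positive results and identify a driver, whereas a turning vehicle paired with a stationary wrist would not.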
CN202211068203.5A 2022-09-02 2022-09-02 Driver detection method, electronic device, and storage medium Pending CN115144013A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211068203.5A CN115144013A (en) 2022-09-02 2022-09-02 Driver detection method, electronic device, and storage medium


Publications (1)

Publication Number Publication Date
CN115144013A true CN115144013A (en) 2022-10-04

Family

ID=83416202

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211068203.5A Pending CN115144013A (en) 2022-09-02 2022-09-02 Driver detection method, electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN115144013A (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104731336A (en) * 2015-03-27 2015-06-24 百度在线网络技术(北京)有限公司 Application method and device of mobile terminal suitable for set scene
CN104875745A (en) * 2015-05-18 2015-09-02 百度在线网络技术(北京)有限公司 Status information processing method and system
US20160036964A1 (en) * 2014-07-29 2016-02-04 Verizon Patent And Licensing Inc. Determining that a user is in a vehicle or driving a vehicle based on sensor data gathered by a user device
CN106530621A (en) * 2016-12-08 2017-03-22 杭州联络互动信息科技股份有限公司 Safe driving method and apparatus based on smart wearable equipment
CN107796394A (en) * 2016-09-05 2018-03-13 华为终端(东莞)有限公司 A kind of vehicle indoor positioning method, apparatus, system and wearable smart machine
CN110211402A (en) * 2019-05-30 2019-09-06 努比亚技术有限公司 Wearable device road conditions based reminding method, wearable device and storage medium
CN110623673A (en) * 2019-09-29 2019-12-31 华东交通大学 Fully-flexible intelligent wrist strap for recognizing gestures of driver
CN112596830A (en) * 2020-12-16 2021-04-02 广东湾区智能终端工业设计研究院有限公司 Interface display method and device
CN113568509A (en) * 2015-06-08 2021-10-29 北京三星通信技术研究有限公司 Portable electronic device and operation method thereof
CN113573939A (en) * 2019-02-14 2021-10-29 华为技术有限公司 Method and system for sequential micro-activity based driver detection on smart devices
CN113728323A (en) * 2019-04-25 2021-11-30 华为技术有限公司 Driver's advice selection from a plurality of candidates
CN113967348A (en) * 2020-07-24 2022-01-25 荣耀终端有限公司 Information display method and electronic equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination