CN112304330B - Method for displaying running state of vehicle and electronic equipment - Google Patents
- Publication number: CN112304330B
- Application number: CN202011180455.8A
- Authority
- CN
- China
- Prior art keywords
- vehicle
- mobile terminal
- coordinate system
- data
- lane
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
Abstract
Disclosed are a method of displaying the driving state of a vehicle, a mobile terminal for displaying the driving state of the vehicle, an electronic device, and a computer storage medium. The method can use only the sensors built into the mobile terminal as signal sources, without relying on vehicle-mounted equipment, and thus avoids the high cost of combined inertial navigation equipment. Through multi-sensor fusion and coordinate-system calculation, the method can accurately and efficiently identify and display various driving states of the vehicle without requiring the mobile terminal to be held in a strictly fixed position. The present disclosure also provides the driver with information related to the driving of the vehicle more intuitively by displaying an animation of the driving state of the vehicle on the mobile terminal.
Description
Technical Field
The present disclosure relates to an automatic driving technique, and more particularly, to a method of displaying a driving state of a vehicle, a mobile terminal for displaying the driving state of the vehicle, an electronic device, and a computer storage medium.
Background
Currently, identification of the driving state mainly depends on in-vehicle equipment. For example, speed, heading angle, and load can be obtained from a combined inertial navigation unit (IMU) and an on-board diagnostics (OBD) system on the vehicle-mounted equipment, so that dangerous driving behaviors such as speeding and overloading can be further distinguished. However, current vehicle-mounted equipment is expensive to manufacture and difficult to install, making large-scale deployment hard to achieve in the short term.
Methods of recognizing driving states using mobile terminals (e.g., smartphones) have also been proposed. However, existing methods for identifying the vehicle state with a mobile terminal require the terminal to be fixed in position or orientation. Therefore, while the driving state is being recognized, the user's other operations on the mobile terminal are affected. In addition, the types of driving states that such methods can identify are few, and the accuracy and sensitivity of the identified vehicle states are not high enough for commercial use.
Moreover, existing interfaces for displaying the running state of the vehicle on a mobile terminal are not intuitive, and thus cannot provide the driver with information related to the running of the vehicle in an intuitive way.
Disclosure of Invention
Embodiments of the present disclosure provide a method of displaying a driving state of a vehicle, a mobile terminal for displaying the driving state of the vehicle, an electronic device, and a computer storage medium.
Embodiments of the present disclosure provide a method of displaying a driving state of a vehicle, the method being performed by a mobile terminal located in a driving vehicle, the method comprising: acquiring the total lanes of the road on which the vehicle runs and the initial lane in which the vehicle runs; displaying a user interface, wherein the user interface comprises a road picture corresponding to the total lanes, and the road picture corresponding to the total lanes comprises a road picture corresponding to the initial lane; displaying a virtual animation object corresponding to the vehicle on the road picture, the virtual animation object being displayed on the road picture corresponding to the initial lane; acquiring positioning data of the mobile terminal; acquiring a plurality of sensor data related to the running state of the vehicle, the plurality of sensor data being based on a coordinate system of the mobile terminal, wherein the Y-axis of the coordinate system of the mobile terminal points in the orientation of the mobile terminal, the Z-axis is perpendicular to the screen of the mobile terminal, and the X-axis is perpendicular to the plane formed by the Y-axis and the Z-axis, the plurality of sensor data including acceleration data, angular velocity data, geomagnetic intensity data, and geomagnetic direction data; determining the running state of the vehicle in a coordinate system of the vehicle based on the plurality of sensor data and the positioning data, wherein the Y-axis of the vehicle-based coordinate system points in the running direction of the vehicle, the Z-axis points toward the geocenter, and the X-axis is perpendicular to the plane formed by the Y-axis and the Z-axis; and dynamically updating and displaying the virtual animation object on the road picture corresponding to the total lanes based on the initial lane in which the vehicle runs and the determined running state of the vehicle.
Embodiments of the present disclosure provide a mobile terminal for displaying the driving state of a vehicle, the mobile terminal comprising a display screen, a plurality of sensors, a positioning assembly, and a processor. The plurality of sensors are configured to acquire a plurality of sensor data related to the running state of the vehicle, the plurality of sensor data being based on a coordinate system of the mobile terminal and including acceleration data, angular velocity data, geomagnetic intensity data, and geomagnetic direction data. The positioning assembly is configured to acquire positioning data of the mobile terminal. The processor is configured to: acquire the total lanes of the road on which the vehicle runs and the initial lane in which the vehicle runs; acquire the positioning data from the positioning assembly; acquire the plurality of sensor data from the plurality of sensors; determine the driving state of the vehicle in a coordinate system of the vehicle based on the plurality of sensor data and the positioning data; and determine a lane change of the vehicle based on the initial lane and the determined driving state. The display screen is configured to: display a user interface, wherein the user interface comprises a road picture corresponding to the total lanes, and the road picture corresponding to the total lanes comprises a road picture corresponding to the initial lane; display a virtual animation object corresponding to the vehicle on the road picture, the virtual animation object being displayed on the road picture corresponding to the initial lane in which the vehicle runs; and display an animation that dynamically moves the virtual animation object from the road picture corresponding to the initial lane to the road picture corresponding to the changed lane.
Embodiments of the present disclosure provide an electronic device. The electronic device includes: one or more processors; and one or more memories, wherein the memories have stored therein computer readable code, which when executed by the one or more processors, causes the electronic device to perform the method described above.
According to yet another embodiment of the present disclosure, there is also provided a computer-readable storage medium storing instructions that, when executed by a processor of an electronic device, cause the electronic device to implement the above-described method.
Embodiments of the present disclosure provide a method of displaying the running state of a vehicle, a mobile terminal for displaying the running state of the vehicle, an electronic device, and a computer storage medium, which use the sensors built into the mobile terminal as the signal source without relying on vehicle-mounted equipment, thereby avoiding the high cost of combined inertial navigation equipment. Meanwhile, through multi-sensor fusion and coordinate-system calculation, the embodiments can accurately and efficiently identify and display various driving states of the vehicle without requiring the mobile terminal to be held in a strictly fixed position. Embodiments of the present disclosure also provide the driver with vehicle-travel-related information more intuitively by displaying an animation of the travel state of the vehicle on the mobile terminal.
Drawings
Fig. 1 is an example schematic diagram showing a scenario for a mobile terminal located in a vehicle that is running.
Fig. 2A is a flowchart illustrating a method of exhibiting a driving state of a vehicle according to an embodiment of the present disclosure.
Fig. 2B is a schematic diagram showing a login interface showing a driving state of a vehicle according to an embodiment of the present disclosure.
Fig. 2C is a schematic diagram illustrating a work interface showing a driving state of a vehicle according to an embodiment of the present disclosure.
Fig. 2D is a plurality of dynamic graphs showing a driving state of a vehicle according to an embodiment of the present disclosure.
Fig. 2E is a schematic diagram showing still another example of a work interface showing a running state of a vehicle according to an embodiment of the present disclosure.
Fig. 3A is a schematic diagram showing a plurality of interfaces provided to demonstrate a running state of a vehicle according to an embodiment of the present disclosure.
Fig. 3B is a schematic diagram showing a plurality of interfaces provided to demonstrate a running state of a vehicle according to an embodiment of the present disclosure.
Fig. 3C is a schematic diagram illustrating a method of determining a driving state of a vehicle on a coordinate system of the vehicle according to an embodiment of the present disclosure.
Fig. 4 is a block diagram illustrating a mobile terminal for displaying a driving state of a vehicle according to an embodiment of the present disclosure.
Fig. 5 shows a schematic diagram of an apparatus for displaying a driving state of a vehicle according to an embodiment of the present disclosure.
Fig. 6 illustrates a schematic diagram of an architecture of an exemplary computing device, according to an embodiment of the present disclosure.
Fig. 7 shows a schematic diagram of a storage medium according to an embodiment of the present disclosure.
Detailed Description
Example embodiments according to the present disclosure will be described below with reference to the accompanying drawings.
Fig. 1 is an example schematic diagram illustrating a scenario 100 for a mobile terminal located in a vehicle that is traveling.
Referring to fig. 1, a user uses a mobile terminal in a running vehicle, where the orientation of the mobile terminal need not be fixed. For example, the orientation of the mobile terminal may be the same as or different from the direction of travel of the vehicle. According to embodiments of the present disclosure, the mobile terminal can accurately and intuitively display the traveling direction and behavior of the vehicle even when its own orientation is not fixed. For example, the mobile terminal may display the driver's driving behavior as a dynamic picture, providing vehicle-travel-related information in an intuitive manner.
The mobile terminal described herein may be any electronic device, such as a smartphone, tablet computer, notebook computer, palmtop computer, or MID (Mobile Internet Device). The term also covers various application software that can be loaded on such a device, for example map navigation software.
The mobile terminal may recognize the driving behavior of the vehicle using artificial intelligence technology and display the recognized behavior on its display screen. Artificial intelligence (AI), as described herein, is the theory, method, technique, and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain optimal results. In other words, artificial intelligence is a comprehensive branch of computer science that attempts to understand the essence of intelligence and to produce new intelligent machines that can react in a way similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of such machines, so that they have the functions of perception, reasoning, and decision-making, for example reasoning about and deciding on a driver's behavior.
The scheme provided by the embodiment of the disclosure relates to artificial intelligence automatic driving and other technologies, and is specifically described through the following embodiments. Specifically, the automatic driving technology generally comprises high-precision map, environment perception, behavior decision, path planning, motion control and other technologies, and has wide application prospect.
Embodiments of the present disclosure provide a method of displaying the running state of a vehicle, a mobile terminal for displaying the running state of the vehicle, an electronic device, and a computer storage medium, which use the sensors built into the mobile terminal as the signal source without relying on vehicle-mounted equipment, thereby avoiding the high cost of combined inertial navigation equipment. Meanwhile, through multi-sensor fusion and coordinate-system calculation, the embodiments can accurately and efficiently identify and display various driving states of the vehicle without requiring the mobile terminal to be held in a strictly fixed position. Embodiments of the present disclosure also provide the driver with vehicle-travel-related information more intuitively by displaying an animation of the travel state of the vehicle on the mobile terminal.
Fig. 2A is a flowchart illustrating a method 200 of displaying the driving state of a vehicle according to an embodiment of the present disclosure. Fig. 2B is a schematic diagram illustrating a login interface 200B showing the driving state of a vehicle according to an embodiment of the present disclosure. Fig. 2C is a schematic diagram illustrating a work interface 200C showing the driving state of a vehicle according to an embodiment of the present disclosure. Fig. 2D shows a plurality of dynamic graphs of the driving state of a vehicle according to an embodiment of the present disclosure. Fig. 2E is a schematic diagram showing still another example of a work interface showing the running state of a vehicle according to an embodiment of the present disclosure.
In step S201, the mobile terminal acquires a total lane of a road on which the vehicle is traveling and an initial lane on which the vehicle is traveling.
For example, the mobile terminal may acquire, from user input, the total lanes of the road on which the vehicle is traveling and the initial lane in which the vehicle is traveling. The mobile terminal may display an example interface 200B as shown in fig. 2B to let the user enter these values. The example interface 200B may be a user login interface on which the user may enter a number so that the mobile terminal obtains the total lanes of the road. For example, if the user inputs the number 6, the vehicle is currently traveling on a 6-lane road. The login interface may also offer the user options related to the total lanes, e.g., "two-way eight lanes", "two-way four lanes", or "1 left-turn lane - 3 straight lanes - 1 right-turn lane", from which the user may select. The user may likewise inform the mobile terminal of the initial lane by entering a number on the interface. For example, if the user inputs the number 2, the vehicle is currently traveling in the second lane counted from the left. The user may also enter text on the interface, such as "left-turn lane" or "second lane from the left", to inform the mobile terminal of the initial lane. Of course, the login interface may also present options or questions about the initial lane to prompt a selection, e.g., "Are you in the first left-turn lane?" or "Please select between lane 2 and lane 3", from which the user may choose.
The interaction mode between the mobile terminal and the user is not limited by the present disclosure, as long as the mobile terminal can obtain the total lane of the road where the vehicle runs and the initial lane where the vehicle runs from the user input.
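As an illustration of this input handling, the numeric entries described above could be parsed by a small helper like the following. This sketch is hypothetical (the patent does not specify an implementation); it only shows how "6" and "2" become a total lane count and an initial lane counted from the left:

```python
def parse_lane_input(total_text: str, initial_text: str) -> tuple[int, int]:
    """Parse the user's login-interface entries (illustrative helper).

    total_text:   e.g. "6" -> the vehicle travels on a 6-lane road
    initial_text: e.g. "2" -> the vehicle starts in lane 2, counted from the left
    """
    total = int(total_text.strip())
    initial = int(initial_text.strip())
    if not 1 <= initial <= total:
        raise ValueError("initial lane must lie within the total lane count")
    return total, initial

# The example from the text: a 6-lane road, starting in the 2nd lane from the left.
total_lanes, initial_lane = parse_lane_input("6", "2")
```

A real interface would also map textual choices such as "left-turn lane" onto a lane index, but the numeric case above is the core of the interaction.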
For example, the mobile terminal may also obtain, using a positioning component of the mobile terminal, a total lane of a road on which the vehicle is traveling and an initial lane on which the vehicle is traveling.
For example, based on the positioning data of the mobile terminal and offline map information stored in the mobile terminal, acquiring a total lane of a road on which the vehicle is traveling and an initial lane on which the vehicle is traveling; or inquiring information of the road on which the vehicle runs based on the positioning data of the mobile terminal so as to acquire a total lane of the road on which the vehicle runs and an initial lane on which the vehicle runs.
The positioning component locates the current geographic position of the mobile terminal to enable navigation or LBS (Location Based Service). The positioning component may be based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union. The positioning component can directly determine the state of the current road from the position it obtains. For example, when the positioning component determines that the vehicle is traveling on a certain road, the mobile terminal may look up the information stored for that road in an offline map to obtain its total lanes. The mobile terminal may also query a server for the road's information. Of course, the mobile terminal may also estimate the lane in which the vehicle is traveling directly from the positioning data.
In step S202, a user interface is displayed, where the user interface includes a road screen corresponding to the total lane, and the road screen corresponding to the total lane includes a road screen corresponding to the initial lane.
For example, the user interface may be presented as the work interface-1 shown in fig. 2C, which simulates the driving scene in an animated manner. Five lanes are shown in the user interface of fig. 2C, with adjacent lanes divided by dashed lane lines. The user interface may be implemented using Canvas controls, View controls, or container controls, although any known GUI (graphical user interface) control may be used to implement and draw it; the present disclosure is not limited in this respect.
In step S203, a virtual animation object corresponding to the vehicle is displayed on the road screen, the virtual animation object being displayed on the road screen corresponding to the initial lane in which the vehicle travels.
For example, the virtual animation object shown as a black car icon on the user interface corresponds to the vehicle being driven. The mobile terminal may create an attribute-animation object corresponding to the running vehicle. An attribute animation continuously changes a parameter value over a certain time interval and continuously assigns the changing value to an attribute of an object, thereby animating that attribute of the object. The attribute-animation object may be an object animator class (ObjectAnimator) or a value animator class (ValueAnimator). ObjectAnimator animates a specific attribute of an object, updating the attribute value automatically. ValueAnimator does not change an attribute directly; instead, it generates values over a period of time, and those values are then assigned to the object's attribute to produce the animation.
For example, the attribute of the virtual animation object may be set to 3 to display the object in the 3rd (i.e., middle) lane of the 5 lanes in fig. 2C. Of course, virtual animation objects in the present disclosure may also be created and drawn using any other known GUI (graphical user interface) animation object; the present disclosure is not limited in this respect.
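The role of a value-generating animator can be sketched in plain Python. This is an illustrative analogue of the ValueAnimator idea, not Android code: values are interpolated over a time interval and each generated value would be assigned to the lane attribute of the virtual object, frame by frame:

```python
def animate_value(start: float, end: float, duration_s: float, fps: int = 60):
    """Yield linearly interpolated values from start to end over duration_s,
    the way a value animator feeds an object's attribute frame by frame."""
    frames = max(1, int(duration_s * fps))
    for i in range(frames + 1):
        t = i / frames                       # normalized time in [0, 1]
        yield start + (end - start) * t

# Drive the virtual object's lane attribute from lane 3 to lane 4 in 0.5 s.
lane_track = list(animate_value(3.0, 4.0, 0.5))
```

On Android, the same effect would come from a ValueAnimator with an update listener writing each value into the drawn object's lane position; a non-linear interpolator could be substituted for a smoother start and stop.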
In step S204, positioning data of the mobile terminal is acquired.
Alternatively, the positioning data may include a longitude data set, a latitude data set, a heading-angle data set, a pitch-angle data set, a roll-angle data set, and the like, for the vehicle. The positioning data are acquired in real time while the vehicle travels, and may therefore include longitude, latitude, heading-angle, pitch-angle, and roll-angle data acquired at a plurality of times and arranged in time order. Although the positioning component of the mobile terminal collects these data, the movement of the mobile terminal is substantially consistent with that of the vehicle during travel; the positioning data of the mobile terminal therefore reflect the position and moving speed of the vehicle. Those skilled in the art will appreciate that the positioning data may include more or fewer data sets depending on the implementation of the positioning component, which is not limited by the present disclosure.
In step S205, a plurality of sensor data relating to a running state of the vehicle is acquired, the plurality of sensor data being based on a coordinate system of a mobile terminal. The Y axis of the coordinate system based on the mobile terminal points to the direction of the mobile terminal, the Z axis of the coordinate system based on the mobile terminal is perpendicular to the direction of the screen of the mobile terminal, the X axis of the coordinate system based on the mobile terminal is perpendicular to a plane formed by the Y axis of the coordinate system based on the mobile terminal and the Z axis of the coordinate system based on the mobile terminal, and the plurality of sensor data comprise acceleration data, angular velocity data, geomagnetic intensity data and geomagnetic direction data.
Optionally, the acquiring the plurality of sensor data related to the running state of the vehicle further includes: acquiring acceleration data sets of at least three axial directions of the mobile terminal at a plurality of moments in the vehicle travelling process by using the accelerometer of the mobile terminal, wherein the at least three axial directions comprise a Y axis of the coordinate system based on the mobile terminal, an X axis of the coordinate system based on the mobile terminal and a Z axis of the coordinate system based on the mobile terminal; acquiring angular velocity data sets of the mobile terminal rotating around the at least three axial directions at the plurality of moments in time during the traveling of the vehicle by using a gyroscope of the mobile terminal; acquiring a set of geomagnetic intensity and geomagnetic direction of the mobile terminal at the plurality of moments in the running process of the vehicle by utilizing a magnetometer of the mobile terminal; and determining the plurality of sensor data related to the running state of the vehicle based on the acceleration data set, the angular velocity data set, and the sets of geomagnetic intensity and geomagnetic direction.
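One way to picture the data gathered above is a single time-stamped record per sampling instant, holding the accelerometer, gyroscope, and magnetometer triplets in the terminal's own X/Y/Z axes. The structure below is a hypothetical illustration, not a format prescribed by the patent:

```python
from dataclasses import dataclass


@dataclass
class SensorSample:
    """One reading in the mobile terminal's coordinate system."""
    t: float                             # timestamp in seconds
    accel: tuple[float, float, float]    # accelerometer, m/s^2 along X, Y, Z
    gyro: tuple[float, float, float]     # gyroscope, rad/s about X, Y, Z
    mag: tuple[float, float, float]      # magnetometer, uT (intensity and direction)


# A buffer of such samples, ordered by time, forms the data sets described above.
buffer = [
    SensorSample(0.00, (0.1, 0.0, 9.8), (0.0, 0.0, 0.01), (1.0, 22.0, -40.0)),
    SensorSample(0.02, (0.2, 0.1, 9.8), (0.0, 0.0, 0.02), (1.1, 22.1, -40.2)),
]
```

Keeping the three sensor streams in one time-aligned record simplifies the later fusion step, since every conversion to the vehicle frame needs all three readings from the same instant.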
Those skilled in the art will appreciate that the plurality of sensor data associated with the driving state of the vehicle may also include more or fewer data sets depending on the types of the plurality of sensors, which is not limited by the present disclosure.
Optionally, the coordinate system of the mobile terminal refers to the coordinate system in which data are measured by the mobile terminal, whether or not the terminal is fixed to the vehicle. Since the position of the mobile terminal in the vehicle may vary, and the terminal may take different positions and orientations as the user operates it, it cannot be guaranteed that the coordinate system of the mobile terminal coincides with that of the vehicle. The plurality of sensor data therefore need further processing before they can represent the driving state in the vehicle's coordinate system.
In step S206, a running state of the vehicle on a coordinate system of the vehicle is determined based on the plurality of sensor data and the positioning data.
Optionally, as shown in fig. 2C, a dynamic text area is provided and displayed on the user interface of the mobile terminal, and the running state of the vehicle in the vehicle's coordinate system is displayed there in text form. The driving state includes at least one of: going straight, changing lanes to the left, changing lanes to the right, turning left, turning right, accelerating, and decelerating. The driving state can thus be monitored in real time through the dynamic text. The dynamic text area may be as shown in work interfaces-1 and -2 of fig. 2C, which describe the current driver's driving behavior in text.
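A simple threshold rule over the vehicle-frame signals can produce such text labels. The thresholds and sign conventions below (left taken as the positive lateral direction) are illustrative assumptions for the sketch, not values from the patent:

```python
def classify_state(lat_acc: float, lon_acc: float, yaw_rate: float) -> str:
    """Map vehicle-frame signals to a label for the dynamic text area.

    lat_acc:  lateral acceleration, m/s^2 (left positive - an assumed convention)
    lon_acc:  longitudinal acceleration, m/s^2 (forward positive)
    yaw_rate: heading angular velocity, rad/s (left turn positive)
    Thresholds are illustrative placeholders, not values from the patent.
    """
    YAW_TH, LAT_TH, LON_TH = 0.15, 1.0, 0.8
    if yaw_rate > YAW_TH:
        return "turning left"
    if yaw_rate < -YAW_TH:
        return "turning right"
    if lat_acc > LAT_TH:
        return "changing lanes to the left"
    if lat_acc < -LAT_TH:
        return "changing lanes to the right"
    if lon_acc > LON_TH:
        return "accelerating"
    if lon_acc < -LON_TH:
        return "decelerating"
    return "going straight"
```

A production classifier would smooth the signals over a window before thresholding, since raw accelerometer readings in a moving vehicle are noisy.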
Alternatively, the plurality of sensor data and the positioning data may first be converted from the coordinate system of the mobile terminal to a reference coordinate system, and then from the reference coordinate system to the coordinate system of the vehicle. Since the attitude of the mobile terminal is not fixed, the mapping from the terminal's coordinate system to the vehicle's coordinate system is not uniquely fixed. Therefore, the real-time attitude of the mobile terminal can be obtained from the plurality of sensor data, and the data can then be converted first from the terminal's coordinate system to the reference coordinate system and then to the vehicle's coordinate system. It should be noted that the reference coordinate system follows real-world directions: the direction pointing to the magnetic north pole is north, and the direction pointing to the magnetic south pole is south.
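The terminal-to-reference step can be sketched with a rotation matrix built from one gravity reading and one geomagnetic reading, in the spirit of Android's SensorManager.getRotationMatrix. The pure-Python implementation and the sample sensor values are illustrative, not the patent's algorithm:

```python
import math


def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])


def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)


def rotation_matrix(gravity, geomagnetic):
    """Rows are East, North, Up expressed in the terminal's X/Y/Z axes, so
    multiplying a device-frame vector gives its reference-frame components."""
    up = normalize(gravity)                       # at rest the accelerometer reads +g upward
    east = normalize(cross(geomagnetic, gravity)) # horizontal, perpendicular to magnetic north
    north = cross(up, east)                       # completes the right-handed frame
    return (east, north, up)


def to_reference(rot, vec):
    """Rotate a device-frame vector into the reference (world) frame."""
    return tuple(sum(row[i] * vec[i] for i in range(3)) for row in rot)


# Terminal lying flat, screen up, Y-axis pointing magnetic north:
R = rotation_matrix((0.0, 0.0, 9.81), (0.0, 22.0, -40.0))
world_accel = to_reference(R, (0.0, 0.1, 9.81))
```

For this flat, north-facing pose the matrix is the identity, so the device-frame reading passes through unchanged; for any tilted or rotated pose the same two readings yield the matrix that undoes the terminal's attitude.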
Optionally, as shown in fig. 2C, the mobile terminal displays a vehicle instant data area on the user interface; in the vehicle instant data area, the running state of the vehicle on the coordinate system of the vehicle is displayed in the form of numerical values; wherein the running state includes at least a part of a lateral acceleration of the vehicle, a longitudinal acceleration of the vehicle, a heading angular velocity of the vehicle, a heading angle of the vehicle, a roll angle of the vehicle, and a pitch angle of the vehicle.
The pitch angle of the vehicle is the angle between the X-axis of the coordinate system of the vehicle and the X-axis of the reference coordinate system (angle range -90° to +90°). The roll angle is the angle between the Z-axis of the coordinate system of the vehicle and the Z-axis of the reference coordinate system (angle range -180° to +180°). The heading angle is the angle between the Y-axis of the coordinate system of the vehicle and the Y-axis of the reference coordinate system (angle range -180° to +180°).
The vehicle instant data area may be as shown in the working interfaces-1 and -2 of fig. 2C, which numerically show the current vehicle speed. Although only the current speed of the vehicle is shown in the working interfaces-1 and -2 of fig. 2C, other running states of the vehicle may also be shown; for example, the lateral acceleration of the vehicle, the longitudinal acceleration of the vehicle, the heading angular velocity of the vehicle, the heading angle of the vehicle, the roll angle of the vehicle, the pitch angle of the vehicle, and the like may be displayed in sequence by scrolling left, right, up or down.
Optionally, as shown in fig. 2C, the mobile terminal displays a plurality of dynamic curve buttons on the user interface; when any one of the plurality of dynamic curve buttons is triggered, displaying a display interface of a dynamic curve corresponding to the triggered dynamic curve button; wherein the dynamic curve comprises: at least a portion of the acceleration profile based on the reference coordinates, the acceleration profile based on the coordinate system of the mobile terminal, the attitude angle profile based on the coordinate system of the mobile terminal, the angular velocity profile based on the coordinate system of the mobile terminal, the magnetic induction intensity profile based on the coordinate system of the mobile terminal, and the state parameter profile based on the coordinate system of the vehicle.
Optionally, the display interface of the dynamic curve may cover the road picture. For example, the user switches from the user interface of working interface-1 to the dynamic curve area of working interface-2 by clicking/pressing/touching the dynamic curve button corresponding to curve 1 in fig. 2C, where the dynamic curve area is used for displaying the display interface of the dynamic curve corresponding to the triggered dynamic curve button. For example, curve 1 in fig. 2C may correspond to the angular velocity curve based on the coordinate system of the mobile terminal.
The mobile terminal may create a canvas, view, or container corresponding to the dynamic curve over the current display page and then draw the corresponding dynamic curve in it. Alternatively, the mobile terminal may overlay the currently displayed user interface by setting the visibility attribute of the canvas, view, or container corresponding to the dynamic curve to visible, and then drawing the corresponding dynamic curve in that canvas, view, or container. Of course, those skilled in the art will appreciate that the user may trigger the display of the dynamic curve view in other ways as well, and the disclosure is not limited in this regard.
The working interface-2 in fig. 2C is by way of example only, and those skilled in the art will appreciate that other dynamic curves may also be displayed by the dynamic curve region. For example, FIG. 2D gives some examples of the variety of dynamic curves that a dynamic curve region may display. These examples correspond to an acceleration profile based on a reference coordinate, an acceleration profile based on a coordinate system of a mobile terminal, an attitude angle profile based on a coordinate system of a mobile terminal, an angular velocity profile based on a coordinate system of a mobile terminal, a magnetic induction profile based on a coordinate system of a mobile terminal, and a state parameter profile based on a coordinate system of a vehicle, respectively. Although FIG. 2C is only schematically depicted as a dynamic curve corresponding to curve 1, those skilled in the art will appreciate that other buttons included on the user interface may correspond to the respective dynamic graphs of FIG. 2D.
In fig. 2D, the horizontal axis of each dynamic graph represents time, and the vertical axis represents the value of the corresponding dynamic curve. For example, according to numbers ① to ⑥, the curves in the dynamic graphs of fig. 2D respectively indicate, at each time: the acceleration based on the reference coordinate system, the acceleration based on the coordinate system of the mobile terminal, the attitude angle based on the coordinate system of the mobile terminal, the angular velocity based on the coordinate system of the mobile terminal, the magnetic induction intensity based on the coordinate system of the mobile terminal, and the state parameter based on the coordinate system of the vehicle. The dynamic curves described above are all updated automatically over time; that is, a dynamic curve always shows the change of the vehicle state within a period (e.g., one minute or five minutes) ending at the current moment.
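The rolling-window behaviour described above (a curve that always shows roughly the last minute of data ending at the current moment) can be sketched with a simple buffer; the class name and window length are illustrative only:

```python
from collections import deque

class DynamicCurve:
    """Keeps only the samples from the most recent time window,
    mimicking a curve that always shows e.g. the last 60 seconds."""
    def __init__(self, window_s=60.0):
        self.window_s = window_s
        self.samples = deque()  # (timestamp, value) pairs

    def append(self, t, value):
        self.samples.append((t, value))
        # Drop samples that have fallen out of the window.
        while self.samples and t - self.samples[0][0] > self.window_s:
            self.samples.popleft()

curve = DynamicCurve(window_s=60.0)
for t in range(0, 120, 10):        # two minutes of samples, one per 10 s
    curve.append(float(t), t * 0.1)
```

After two minutes of appends, only the samples from the last 60 seconds remain, which is exactly what the on-screen curve would draw.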
When the dynamic graph is displayed without displaying the road picture, the text displayed in the dynamic text region may be changed accordingly to prompt the driver for the lane in which the vehicle is traveling without providing the road picture. For example, as shown in fig. 2C, the text displayed in its dynamic text region may be changed from "straight traveling" to "straight traveling on the second lane". Of course, the text displayed in the dynamic text field may also remain unchanged, which is not limiting to the present disclosure.
In step S207, the virtual animation object is dynamically updated and displayed on the road screen corresponding to the total lane based on the initial lane in which the vehicle is traveling and the determined traveling state of the vehicle.
For example, alternatively, the mobile terminal may determine a lane change in which the vehicle is traveling based on an initial lane in which the vehicle is traveling and the determined traveling state of the vehicle; and displaying the animation of dynamically moving the virtual animation object from the road picture corresponding to the initial lane to the road picture corresponding to the changed lane.
Referring to fig. 2E, for example, assume that the mobile terminal determines that the running state of the vehicle is a lane change to the left from the third lane to the second lane. The mobile terminal may then display a corresponding lane-change-to-the-left animation. Alternatively, the virtual animation object may be translated from the 3rd lane in the picture to one or more lanes to its left or right by displacing the virtual animation object along the width direction of the screen of the mobile terminal. When an attribute animation is produced, it can be realized by calling the corresponding functions specified by the attribute animation framework. For example, when displacement is applied through the attribute animation parameter translationX, the lateral displacement on the screen may be achieved by calling an animation setting function such as ObjectAnimator animator = ObjectAnimator.ofFloat(view, "translationX", 0f, 300f) (the numerical values are only schematic). Alternatively, the animation duration may be related to the current vehicle speed: the faster the speed, the shorter the animation duration. Alternatively, the virtual animation object may also be rotated clockwise/counterclockwise by a certain angle to indicate turning left (or merging left) or turning right (or merging right).
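The relationship between the lane change, the on-screen displacement, and the speed-dependent animation duration might be computed as follows. This is a hedged Python sketch rather than the Android attribute-animation API itself, and all names and constants are hypothetical:

```python
def lane_change_animation(lane_width_px, lanes_delta, speed_kmh,
                          base_duration_ms=1000.0, ref_speed_kmh=60.0):
    """Returns (translationX target in px, duration in ms) for a
    lane-change animation; faster vehicles get shorter animations."""
    dx = lane_width_px * lanes_delta                     # negative -> move left
    duration = base_duration_ms * ref_speed_kmh / max(speed_kmh, 1.0)
    return dx, duration

# One lane to the left at 120 km/h: half the duration of the 60 km/h baseline.
dx, dur = lane_change_animation(lane_width_px=200, lanes_delta=-1, speed_kmh=120.0)
```

The pair (dx, dur) would then be fed to the platform's animation call, e.g. as the translationX end value and the animator duration.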
The embodiment of the disclosure provides a method for displaying a running state of a vehicle, which can solve the problem of high cost of combined inertial navigation equipment by using a sensor arranged in a mobile terminal as a signal source and not depending on vehicle-mounted equipment. Meanwhile, the embodiment of the disclosure can accurately and efficiently identify and display various driving states of the vehicle under the condition of not depending on the strict fixed position of the mobile terminal through multi-sensor fusion and coordinate system calculation. Embodiments of the present disclosure also provide a driver with vehicle travel-related information more intuitively by displaying an animation of a travel state of a vehicle on a mobile terminal.
Fig. 3A is a schematic diagram showing a plurality of interfaces 300A provided to demonstrate a running state of a vehicle according to an embodiment of the present disclosure. Fig. 3B is a schematic diagram showing a plurality of interfaces 300B provided to demonstrate a running state of a vehicle according to an embodiment of the present disclosure. Fig. 3C is a schematic diagram illustrating a method 300C of determining a driving state of a vehicle on a coordinate system of the vehicle according to an embodiment of the present disclosure.
Optionally, the user may also configure which vehicle states are to be displayed or are currently being displayed.
Optionally, as shown in fig. 3A, the mobile terminal provides a function switching area on the display screen, where the function switching area includes a plurality of custom setting buttons; displaying a custom setting interface corresponding to the triggered custom setting button under the condition that any one of the custom setting buttons is triggered; wherein, the custom setting includes: at least one part of the acceleration resolving setting, the attitude angle resolving setting, the general setting and the driving data recording setting.
For example, as shown in fig. 3A, after the user triggers the acceleration button, the mobile terminal provides a custom settings interface associated with the acceleration resolution settings.
Optionally, providing the custom setting interface corresponding to the triggered custom setting button further includes: displaying a filter setting item on the custom setting interface, wherein the filter setting specified by the filter setting item is used for filtering the acceleration data set, the angular velocity data set and the geomagnetic intensity and geomagnetic direction set, and wherein the filter setting item comprises a complementary filter enabling button used for triggering whether the acceleration data set, the angular velocity data set and the geomagnetic intensity and geomagnetic direction set are filtered by a complementary filter; and/or wherein the filter setting entry comprises a kalman filter enable button for triggering whether to filter the acceleration data set, the angular velocity data set, and the set of geomagnetic intensity and geomagnetic direction with a kalman filter.
The complementary filter and the Kalman filter are both associated with attitude angle resolution. For example, as shown in fig. 3A, after the user triggers the attitude angle button, the mobile terminal provides a custom settings interface associated with the attitude angle resolution settings, including whether to turn on the complementary filter and the Kalman filter. Although fig. 3A only shows the complementary filter and the Kalman filter, those skilled in the art will appreciate that the mobile terminal may also provide more or fewer filters to achieve the filtering associated with attitude angle resolution.
The attitude angle is one of the parameters that determine the conversion relationship between the coordinate system of the mobile terminal and the reference coordinate system. A method 300C of how the mobile terminal determines the running state of the vehicle on the coordinate system of the vehicle based on the plurality of sensor data and the positioning data using the attitude angle is further described below with reference to fig. 3C.
In step S301, the mobile terminal may acquire attitude angle data sets at the plurality of times based on the acceleration data set, the angular velocity data set, and the sets of geomagnetic intensity and geomagnetic direction.
Attitude refers to the angular relationship of one coordinate system relative to another coordinate system. The Y axis of the coordinate system of the mobile terminal points in the advancing direction of the mobile terminal, the Z axis is perpendicular to the screen of the mobile terminal, and the X axis is perpendicular to the plane formed by the Y axis and the Z axis. The X, Y and Z axes of the reference coordinate system point north, east and toward the earth's center, respectively. The attitude angles represent the relative angular relationship between the coordinate system of the mobile terminal and the reference coordinate system, and include the pitch angle, the roll angle and the heading angle.
The pitch angle is the angle between the X-axis of the coordinate system of the mobile terminal and the X-axis of the reference coordinate system (angle range-90 ° to +90°). The roll angle is the angle between the Z-axis of the coordinate system of the mobile terminal and the Z-axis of the reference coordinate system (angle range-180 deg. to +180°). The heading angle is an angle (angle range-180 ° to +180°) between the Y-axis of the coordinate system of the mobile terminal and the Y-axis of the reference coordinate system.
The gyroscope may measure the angular velocity of the above-described pitch angle, roll angle, and heading angle rotation as the mobile terminal moves/turns/swings.
The accelerometer may be used as an inclinometer: it measures gravity, and when it is tilted, the components of gravitational acceleration on its three axes change, from which the roll angle and the pitch angle are calculated. The magnetometer is a sensor used to measure the intensity of the earth's magnetic field. The mobile terminal may use the heading angle derived from the magnetometer (the heading angle is obtained from the horizontal components of the three-axis magnetometer reading). The initial pitch angle, roll angle and heading angle can thus be obtained from the accelerometer and the magnetometer while the mobile terminal is stationary. Then, starting from the initial pitch, roll and heading angles and using the angular velocities measured by the gyroscope, the attitude angle data set comprising attitude angle data at a plurality of moments can be calculated.
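The initial attitude computation described above can be illustrated as follows. Sign conventions differ between platforms, so the formulas below are one common, assumed variant (and the heading formula is only valid for a roughly level device), not the patent's exact method:

```python
import math

def initial_attitude(accel, mag):
    """Initial pitch/roll from gravity and heading from the magnetometer
    while the phone is stationary (sketch with assumed sign conventions)."""
    ax, ay, az = accel
    mx, my, mz = mag
    pitch = math.atan2(ay, math.hypot(ax, az))   # 0 when the device is flat
    roll = math.atan2(-ax, az)                   # 0 when the device is flat
    heading = math.atan2(-mx, my)                # valid for a level device
    return pitch, roll, heading

# Phone lying flat on the dashboard, top edge pointing magnetic north:
pitch, roll, heading = initial_attitude((0.0, 0.0, 9.81), (0.0, 35.0, -40.0))
```

These three initial angles are then the starting point for the gyroscope-based attitude propagation described next.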
The specific estimation process is briefly described below.
Any complex angular relationship between two coordinate systems can be regarded as the composition of a finite number of elementary rotations in space (rotation about the X axis, rotation about the Y axis, and rotation about the Z axis), and the corresponding transformation matrix equals the product of the transformation matrices determined by those elementary rotations, multiplied from right to left in the order in which the elementary rotations are applied.
Assume that an initial pitch angle $\varphi$, roll angle $\theta$ and heading angle $\psi$ are obtained by the accelerometer and the magnetometer.
In the following representation, the subscript p denotes a value in the coordinate system of the mobile terminal, and the subscript n denotes a value in the reference coordinate system. A rotation from the reference coordinate system to the coordinate system of the mobile terminal is written with superscript p and subscript n, and a rotation from the coordinate system of the mobile terminal to the reference coordinate system with superscript n and subscript p. The X-axis, Y-axis and Z-axis elementary rotation matrices are $R_x(\varphi)$, $R_y(\theta)$ and $R_z(\psi)$, respectively. The three matrices are shown below:

$$
R_x(\varphi)=\begin{bmatrix}1&0&0\\0&\cos\varphi&\sin\varphi\\0&-\sin\varphi&\cos\varphi\end{bmatrix},\quad
R_y(\theta)=\begin{bmatrix}\cos\theta&0&-\sin\theta\\0&1&0\\\sin\theta&0&\cos\theta\end{bmatrix},\quad
R_z(\psi)=\begin{bmatrix}\cos\psi&\sin\psi&0\\-\sin\psi&\cos\psi&0\\0&0&1\end{bmatrix}
\tag{1}
$$
Multiplying the three matrices in sequence and inverting the result gives the rotation matrix $C_p^n$ from the coordinate system of the mobile terminal to the reference coordinate system:

$$
C_p^n=\left[R_y(\theta)\,R_x(\varphi)\,R_z(\psi)\right]^{-1}
\tag{2}
$$
In order to reduce the amount of computation, equation (2) may be computed using a quaternion. Quaternions and Euler angles can be converted into each other, and compared with Euler angle operations, quaternion operations reduce the amount of computation while avoiding the Euler angle deadlock (gimbal lock) phenomenon. A quaternion is a hypercomplex number, which can be regarded as a vector in four-dimensional space and can describe the fixed-point rotation of a rigid body. It consists of four elements: three imaginary parts i, j, k and one real part. The conversion relation between a quaternion and an Euler angle is shown below:

$$
q=\cos\frac{\delta}{2}+(i\,n_x+j\,n_y+k\,n_z)\sin\frac{\delta}{2}
\tag{3}
$$

where $(n_x, n_y, n_z)$ is the unit vector of the corresponding rotation axis.
Here $\delta$ is any Euler angle. Based on equation (2), the attitude matrix $C_n^p$ (the attitude matrix $C_n^p$ is the inverse of the rotation matrix $C_p^n$) can be obtained in quaternion representation using the relationship between the four elements represented by equation (3) and the Euler angles:

$$
C_n^p=\begin{bmatrix}
q_0^2+q_1^2-q_2^2-q_3^2 & 2(q_1q_2+q_0q_3) & 2(q_1q_3-q_0q_2)\\
2(q_1q_2-q_0q_3) & q_0^2-q_1^2+q_2^2-q_3^2 & 2(q_2q_3+q_0q_1)\\
2(q_1q_3+q_0q_2) & 2(q_2q_3-q_0q_1) & q_0^2-q_1^2-q_2^2+q_3^2
\end{bmatrix}
\tag{4}
$$
$q_0, q_1, q_2, q_3$ can be solved from the initial pitch angle $\varphi$, roll angle $\theta$ and heading angle $\psi$.
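As an illustration, the following sketch converts initial Euler angles to a quaternion and builds the attitude matrix from it. The rotation order and axis assignment are illustrative assumptions (the standard yaw-pitch-roll composition is used here), since the text does not fix them:

```python
import math
import numpy as np

def euler_to_quat(roll, pitch, yaw):
    """Quaternion q0..q3 from Euler angles (roll about X, pitch about Y,
    yaw about Z; an assumed, conventional ordering)."""
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    q0 = cr * cp * cy + sr * sp * sy
    q1 = sr * cp * cy - cr * sp * sy
    q2 = cr * sp * cy + sr * cp * sy
    q3 = cr * cp * sy - sr * sp * cy
    return np.array([q0, q1, q2, q3])

def quat_to_dcm(q):
    """Attitude (direction cosine) matrix from a unit quaternion."""
    q0, q1, q2, q3 = q
    return np.array([
        [q0*q0+q1*q1-q2*q2-q3*q3, 2*(q1*q2+q0*q3),         2*(q1*q3-q0*q2)],
        [2*(q1*q2-q0*q3),         q0*q0-q1*q1+q2*q2-q3*q3, 2*(q2*q3+q0*q1)],
        [2*(q1*q3+q0*q2),         2*(q2*q3-q0*q1),         q0*q0-q1*q1-q2*q2+q3*q3]])

q = euler_to_quat(-0.05, 0.1, 0.7)
C = quat_to_dcm(q)
```

A quick sanity check is that q has unit norm and C is a proper rotation (orthonormal, determinant one).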
It is assumed that the angular velocities about the x, y and z axes at time t in the coordinate system of the mobile terminal, obtained by the gyroscope, are $\omega_x$, $\omega_y$ and $\omega_z$, respectively. Equation (5) can be obtained using the differential equation of the quaternion:

$$
\begin{bmatrix}\dot q_0\\\dot q_1\\\dot q_2\\\dot q_3\end{bmatrix}
=\frac{1}{2}
\begin{bmatrix}
0&-\omega_x&-\omega_y&-\omega_z\\
\omega_x&0&\omega_z&-\omega_y\\
\omega_y&-\omega_z&0&\omega_x\\
\omega_z&\omega_y&-\omega_x&0
\end{bmatrix}
\begin{bmatrix}q_0\\q_1\\q_2\\q_3\end{bmatrix}
\tag{5}
$$
Thus, an updated attitude matrix $C_n^p$ can be obtained using the values of $q_0, q_1, q_2, q_3$ at time t, and the updated attitude angles can be derived back from the attitude matrix $C_n^p$.
The updated attitude matrix (through its inverse $C_p^n$) may then be used to further acquire the acceleration data set, the angular velocity data set, and the set of geomagnetic intensity and geomagnetic direction in the reference coordinate system at each time.
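A numeric illustration of propagating the quaternion with gyroscope rates as in equation (5), using simple Euler integration with renormalisation (the step size and rates are arbitrary choices for the sketch):

```python
import numpy as np

def quat_step(q, omega, dt):
    """One Euler-integration step of q_dot = 0.5 * Omega(w) * q,
    followed by renormalisation to keep q a unit quaternion."""
    wx, wy, wz = omega
    Omega = np.array([[0.0, -wx, -wy, -wz],
                      [wx,  0.0,  wz, -wy],
                      [wy, -wz,  0.0,  wx],
                      [wz,  wy, -wx,  0.0]])
    q = q + 0.5 * Omega @ q * dt
    return q / np.linalg.norm(q)

# Start level, rotate about the z axis at 0.5 rad/s for 2 s in 1 ms steps,
# i.e. a total rotation of 1 rad.
q = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(2000):
    q = quat_step(q, (0.0, 0.0, 0.5), 0.001)
```

After a 1 rad rotation about z, the quaternion should be close to (cos 0.5, 0, 0, sin 0.5).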
For example, assume that the acceleration data is $a_p$ in the coordinate system of the mobile terminal and $a_n$ in the reference coordinate system. The relation between $a_p$ and $a_n$ is given by equation (6):

$$
a_n=C_p^n\,a_p
\tag{6}
$$
For example, assume that the angular velocity data is $w_p$ in the coordinate system of the mobile terminal and $w_n$ in the reference coordinate system. The relation between $w_p$ and $w_n$ is given by equation (7):

$$
w_n=C_p^n\,w_p
\tag{7}
$$
For example, the user may also choose to turn on/off one or more filters associated with the acceleration resolution. After such a filter is turned on, the mobile terminal employs it to further process the acceleration data $a_p$, thereby obtaining a more accurate acceleration value. After the acceleration-related filters are turned off, the mobile terminal reduces the amount of computation spent on the vehicle's acceleration, so that other, more important applications can obtain more computing resources. Although fig. 3A shows only three filters (an average filter, a median filter and a low-pass filter), those skilled in the art will appreciate that the mobile terminal may provide more or fewer filters to achieve the filtering associated with acceleration resolution.
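For illustration, here are minimal versions of two of the filters named above; the window length and smoothing factor are arbitrary sketch values:

```python
def moving_average(samples, k=5):
    """Average filter: each output is the mean of the last k samples."""
    out = []
    for i in range(len(samples)):
        window = samples[max(0, i - k + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

def low_pass(samples, alpha=0.2):
    """First-order low-pass (exponential) filter."""
    out, y = [], samples[0]
    for x in samples:
        y = alpha * x + (1 - alpha) * y
        out.append(y)
    return out

noisy = [1.0, 1.2, 0.8, 1.1, 0.9, 1.05, 0.95]
smoothed = moving_average(noisy, k=3)
lp = low_pass(noisy)
```

A median filter would have the same shape as `moving_average` with the mean replaced by the window median.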
Since the gyroscope may have constant errors and random drift errors, the accuracy of the output angle can be ensured only over a short time; its output error accumulates as time increases. Thus, the user may also trigger calibration of the zero offset of the gyroscope by triggering the sensor calibration button shown in fig. 3A.
In addition, calibration (e.g., filtering or fusion) can also be performed by fusing the sensor data obtained from the gyroscope, the accelerometer and the magnetometer. The accelerometer can accurately calculate the pitch angle and the roll angle by measuring the gravity field in a static or uniform-motion state, without any accumulated error problem. However, since the running vehicle generally places the mobile terminal in a dynamic environment, using only the accelerometer may cause output errors. The magnetometer can accurately calculate the heading angle in the absence of magnetic interference, but magnetic interference is ubiquitous in actual use and introduces errors into its output.
For example, if a kalman filter enable signal is received, the mobile terminal may use kalman filtering for multi-sensor data fusion to achieve complementary advantages.
Optionally, the above-mentioned Kalman filter enable signal indicates whether the acceleration data set, the angular velocity data set, and the set of geomagnetic intensity and geomagnetic direction are to be filtered with a Kalman filter.
The gyroscope data has good dynamic characteristics and can provide instantaneous angle changes, so the attitude angle (or quaternion) is used as the state value, and the quaternion differential equation is used to establish the state equation:

$$
\dot q_t=\frac{1}{2}\,\Omega(\omega_t-v)\,q_t
\tag{8}
$$

where $\Omega(\cdot)$ denotes the skew-symmetric matrix of equation (5).
where v is the error of the gyroscope. This value is obtained after the user triggers the sensor calibration button shown in fig. 3A to trigger calibration of the zero offset of the gyroscope. If calibration of the zero offset has not been triggered, v may be zero.
On the other hand, an observation equation is established using the magnetometer data $u_t=[u_x\ u_y\ u_z]^T$ and the triaxial acceleration data $g_t=[g_x\ g_y\ g_z]^T$. In particular, since the gravitational acceleration points vertically downwards while the magnetic field force points north and obliquely into the ground, the vector product of the magnetic field force and the gravitational acceleration determines an east-west vector, i.e. $e_t=u_t\times g_t$. A north-south vector can then be determined from the gravitational acceleration and the east-west vector, i.e. $n_t=g_t\times e_t$. Combining the three vectors determines the observation matrix of the mobile terminal at that moment, $H_t=[e_t\ \ n_t\ \ g_t]$.
Using the state equation (8) and the observation matrix $H_t$, the recursion can be iterated with the Kalman filter algorithm. The specific iterative method is consistent with the conventional Kalman iterative algorithm and is not described here.
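The predict/update recursion referred to here is the standard linear Kalman iteration. The following generic sketch shows its shape, driven by a toy one-dimensional constant-estimation example rather than the quaternion state of equation (8):

```python
import numpy as np

def kalman_step(x, P, F, Q, z, H, R):
    """One predict/update iteration of a linear Kalman filter."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy example: estimate a constant (true value 1.0) from noisy measurements.
x = np.array([0.0]); P = np.eye(1)
F = np.eye(1); Q = 1e-5 * np.eye(1)
H = np.eye(1); R = 0.1 * np.eye(1)
for z in [1.02, 0.98, 1.05, 0.99, 1.01]:
    x, P = kalman_step(x, P, F, Q, np.array([z]), H, R)
```

In the attitude filter the state x would be the quaternion, F the transition implied by (8), and H the observation matrix built from the gravity and magnetic vectors.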
In step S302, the mobile terminal may determine a conversion relationship between a coordinate system of the mobile terminal and a reference coordinate system based on the attitude angle data set.
As described above, the attitude angle at each time t corresponds to the conversion relationship between the coordinate system of the mobile terminal and the reference coordinate system at time t, so the conversion relationship between the coordinate system of the mobile terminal and the reference coordinate system is determined based on the attitude angle data set.
In step S303, the mobile terminal acquires a set of heading angle data of the vehicle based on the positioning data. At this time, a heading angle provided by a positioning component (GPS) of the mobile terminal may be used as a set of heading angle data of the vehicle.
In step S304, the mobile terminal determines a conversion relationship between the reference coordinate system and the coordinate system of the vehicle based on the set of heading angle data of the vehicle.
The z-axis of the coordinate system of the vehicle approximately coincides with that of the reference coordinate system, so the heading angle $\gamma$ provided by the GPS of the mobile terminal corresponds to the attitude of the coordinate system of the vehicle relative to the reference coordinate system (the heading angle is the GPS value $\gamma$, and the pitch angle and the roll angle are 0). The rotation matrix between the coordinate system of the vehicle and the reference coordinate system is then calculated as:

$$
C_n^v=\begin{bmatrix}\cos\gamma&\sin\gamma&0\\-\sin\gamma&\cos\gamma&0\\0&0&1\end{bmatrix}
$$
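A quick check of this heading-only rotation (pitch and roll zero) in code; the function name and the assumed axis convention (X north, Y east, Z down, as given for the reference frame earlier) are illustrative:

```python
import math
import numpy as np

def ref_to_vehicle(gamma):
    """Rotation from the reference frame to the vehicle frame when only
    the GPS heading gamma is non-zero (pitch = roll = 0)."""
    c, s = math.cos(gamma), math.sin(gamma)
    return np.array([[ c,   s,   0.0],
                     [-s,   c,   0.0],
                     [0.0, 0.0, 1.0]])

# Vehicle heading 90 degrees (east): express the reference "north" unit
# vector in the vehicle frame.
R = ref_to_vehicle(math.pi / 2)
north_in_vehicle = R @ np.array([1.0, 0.0, 0.0])
```

For an east-heading vehicle, north ends up along the vehicle's negative second axis, and the matrix stays a proper rotation.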
In step S305, the mobile terminal determines a conversion relationship between the coordinate system of the mobile terminal and the coordinate system of the vehicle based on the conversion relationship between the coordinate system of the mobile terminal and the reference coordinate system, and the conversion relationship between the reference coordinate system and the coordinate system of the vehicle.
In step S306, the mobile terminal determines a running state of the vehicle on the coordinate system of the vehicle based on a conversion relationship between the coordinate system of the mobile terminal and the coordinate system of the vehicle.
Optionally, step S306 includes steps S3061 to S3063.
In step S3061, the mobile terminal converts the plurality of sensor data and the positioning data into a set of lateral acceleration, a set of longitudinal acceleration, and a set of heading angular velocity based on the coordinate system of the vehicle using a conversion relationship between the coordinate system of the mobile terminal and the coordinate system of the vehicle.
In step S3062, the mobile terminal extracts a time-behavior sequence of the vehicle from the set of lateral accelerations, the set of longitudinal accelerations, and the set of heading angular velocities.
In step S3063, the mobile terminal acquires a template time-behavior sequence and performs feature matching on the template time-behavior sequence and the time-behavior sequence of the vehicle to determine a running state of the vehicle on a coordinate system of the vehicle.
For example, the mobile terminal may collect driving state data of different category standards in advance as a template time-behavior sequence and then mark as straight driving, lane-changing to the left, lane-changing to the right, turning to the left, turning to the right, accelerating, decelerating, etc., respectively.
While the application is running, it collects the real-time trajectory sequence and calculates its similarity with the template time-behavior sequences, so as to carry out the feature matching step shown in fig. 3C.
For example, when the user touches the record button in fig. 3A, the mobile terminal provides a driving data record setting interface to record different trajectories. Optionally, the user may also trigger the mode button, so that the mobile terminal provides a general settings interface, on which the data augmentation and free placement functions of the mobile terminal can be further configured.
For example, suppose the user wishes the system to later recognize further driving behaviors, such as "changing lanes to the left twice in succession" or "changing lanes to the right twice in succession". The user may enable the data augmentation function through the general settings interface. After the data augmentation function is enabled, the mobile terminal may provide a dialog box prompting the user to name the new driving behavior. Suppose the user wishes to recognize the behavior of changing lanes to the left twice in succession and names it accordingly. The user may then activate the record button in fig. 3A. At this time, as shown in fig. 3A, the name of the driving behavior the user wishes to recognize may be displayed in the dialog box in the driving data record settings. The mobile terminal will then record the data related to this maneuver as template data. The template data may be further extracted into the template time-behavior sequence described above for later comparison.
Optionally, as shown in the general settings interface in fig. 3A, the user may also choose whether to turn on the free placement function. If the free placement function is on, the mobile terminal performs the above-described conversion from the coordinate system of the mobile terminal to the coordinate system of the vehicle, so as to recognize the driving behavior of the vehicle without fixing the orientation of the mobile terminal. If the free placement function is off, the mobile terminal does not perform this conversion; in that case the mobile terminal should be fixed and placed facing forward, so that its coordinate system coincides with the coordinate system of the vehicle. Therefore, when the mobile terminal is fixed in place, the amount of computation used for the coordinate system conversion is reduced and the calculation accuracy is improved.
For example, a dynamic time warping (DTW) algorithm may be employed as the similarity measure. DTW can calculate the similarity of two time series and is particularly suitable for time series of different durations and different rhythms: it automatically warps the time series (i.e., scales them locally on the time axis) so that the shapes of the two sequences match as closely as possible, yielding the highest possible similarity. The feature sequences compared by the mobile terminal are the lateral acceleration, the longitudinal acceleration and the heading angular velocity of the vehicle; different weights are given to the DTW similarities of the three sequences for different driving behaviors, and the template sequence closest to the current sequence is judged to be the current driving state.
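A minimal DTW implementation makes the warping idea concrete; the template and observed sequences below are invented toy data, not real lane-change recordings:

```python
def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# A toy lane-change template vs. the same manoeuvre driven more slowly
# (the observed sequence repeats each phase of the template):
template = [0.0, 0.5, 1.0, 0.5, 0.0]
observed = [0.0, 0.0, 0.5, 0.5, 1.0, 1.0, 0.5, 0.0]
score = dtw_distance(template, observed)
```

Despite the different lengths and rhythms, the warped distance to the matching template is zero, while a flat "straight running" template scores much worse; the real system would compute such scores for the three feature sequences and combine them with behavior-specific weights.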
As shown in fig. 4, an embodiment of the present disclosure also provides a mobile terminal 400 for displaying a driving state of a vehicle. Which includes a plurality of sensors 403, a positioning component 404, a display screen 402, and a processor 401.
The plurality of sensors 403 are configured to: a plurality of sensor data related to a running state of the vehicle is acquired, the plurality of sensor data being based on a coordinate system of a mobile terminal, the plurality of sensor data including acceleration data, angular velocity data, geomagnetic intensity data, and geomagnetic direction data.
The positioning component 404 is configured to: and acquiring positioning data of the mobile terminal.
The processor 401 is configured to: acquiring a total lane of a road on which the vehicle runs and an initial lane on which the vehicle runs; acquiring positioning data of the mobile terminal from the positioning component; acquiring the plurality of sensor data from the plurality of sensors; determining a driving state of the vehicle on a coordinate system of the vehicle based on the plurality of sensor data and the positioning data; a lane change of the vehicle is determined based on the initial lane in which the vehicle is traveling and the determined traveling state of the vehicle.
The display screen 402 is configured to: displaying a user interface, wherein the user interface comprises a road picture corresponding to the total lane, and the road picture corresponding to the total lane comprises a road picture corresponding to the initial lane; displaying a virtual animation object corresponding to the vehicle on the road picture, wherein the virtual animation object is displayed on the road picture corresponding to an initial lane where the vehicle runs; and displaying the animation of dynamically moving the virtual animation object from the road picture corresponding to the initial lane to the road picture corresponding to the changed lane.
The display screen 402 is further configured to display a dynamic text area on the user interface and to display, in text form in the dynamic text area, the driving state of the vehicle on the coordinate system of the vehicle, wherein the driving state includes at least one of straight running, changing lanes to the left, changing lanes to the right, turning left, turning right, accelerating, and decelerating.
The display screen 402 is further configured to display a vehicle instant data area on the user interface and to display, in numerical form in the vehicle instant data area, the running state of the vehicle on the coordinate system of the vehicle, wherein the running state includes at least some of the lateral acceleration of the vehicle, the longitudinal acceleration of the vehicle, the heading angular velocity of the vehicle, the heading angle of the vehicle, the roll angle of the vehicle, and the pitch angle of the vehicle.
The display screen 402 is further configured to display a plurality of dynamic curve buttons on the user interface and, when a first dynamic curve button among the plurality of dynamic curve buttons is triggered, to display a display interface of the dynamic curve corresponding to the first dynamic curve button, wherein the dynamic curve includes at least one of an acceleration curve based on a reference coordinate system, an acceleration curve based on the coordinate system of the mobile terminal, an attitude angle curve based on the coordinate system of the mobile terminal, an angular velocity curve based on the coordinate system of the mobile terminal, a magnetic induction intensity curve based on the coordinate system of the mobile terminal, and a state parameter curve based on the coordinate system of the vehicle.
The display screen 402 is further configured to display a function switching region on the user interface, the function switching region including a plurality of custom setting buttons, and to display, in a case that any one of the plurality of custom setting buttons is triggered, a custom setting interface corresponding to the triggered custom setting button, wherein the custom settings include at least one of an acceleration calculation setting, an attitude angle calculation setting, a general setting, and a driving data recording setting.
The plurality of sensors 403 include an accelerometer, a gyroscope, and a magnetometer. Acquiring the plurality of sensor data related to the running state of the vehicle further includes: acquiring, with the accelerometer of the mobile terminal, acceleration data sets in at least three axial directions of the mobile terminal at a plurality of moments during the traveling of the vehicle, wherein the at least three axial directions include the Y axis, the X axis, and the Z axis of the coordinate system of the mobile terminal; acquiring, with the gyroscope of the mobile terminal, angular velocity data sets of the mobile terminal rotating about the at least three axial directions at the plurality of moments during the traveling of the vehicle; acquiring, with the magnetometer of the mobile terminal, sets of the geomagnetic intensity and geomagnetic direction of the mobile terminal at the plurality of moments during the traveling of the vehicle; and determining the plurality of sensor data related to the running state of the vehicle based on the acceleration data sets, the angular velocity data sets, and the sets of geomagnetic intensity and geomagnetic direction.
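As an illustrative sketch (not part of the patent text), the multi-axis acquisition described above can be modeled as follows. The `SensorLog` container and the simulated 50 Hz sample stream are hypothetical stand-ins for the platform's accelerometer, gyroscope, and magnetometer callbacks:

```python
from dataclasses import dataclass, field

@dataclass
class SensorLog:
    """Time-stamped three-axis samples in the mobile terminal's own frame."""
    accel: list = field(default_factory=list)  # (t, ax, ay, az) in m/s^2
    gyro: list = field(default_factory=list)   # (t, wx, wy, wz) in rad/s
    mag: list = field(default_factory=list)    # (t, mx, my, mz) in uT

    def record(self, t, accel_xyz, gyro_xyz, mag_xyz):
        # One synchronized sample of all three sensors at moment t.
        self.accel.append((t, *accel_xyz))
        self.gyro.append((t, *gyro_xyz))
        self.mag.append((t, *mag_xyz))

# Simulated 50 Hz sampling over one second; a real app would instead
# register listeners on the platform's sensor framework.
log = SensorLog()
for i in range(50):
    t = i / 50.0
    log.record(t, (0.0, 0.0, 9.81), (0.0, 0.0, 0.02), (30.0, 0.0, -20.0))
```

The resulting acceleration, angular velocity, and geomagnetic sets are exactly the inputs that the attitude solution and filtering steps consume.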
Displaying the custom setting interface corresponding to the triggered custom setting button further includes displaying a filter setting item on the custom setting interface, wherein the filter setting specified by the filter setting item is used to filter the acceleration data sets, the angular velocity data sets, and the sets of geomagnetic intensity and geomagnetic direction. The filter setting item includes a complementary filter enable button for triggering whether to filter the acceleration data sets, the angular velocity data sets, and the sets of geomagnetic intensity and geomagnetic direction with a complementary filter; and/or the filter setting item includes a Kalman filter enable button for triggering whether to filter the same data sets with a Kalman filter.
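For illustration only, one complementary-filter step for a single attitude angle might look like the following sketch; the blend weight `alpha`, the axis conventions, and the function name are assumptions, not the patent's implementation:

```python
import math

def complementary_pitch(gyro_rate, accel, dt, pitch_prev, alpha=0.98):
    """One complementary-filter update of the pitch angle, in radians.

    The gyroscope integrates smoothly but drifts over time, while the
    accelerometer's gravity reference is drift-free but noisy; blending
    the two with weight alpha keeps the best of both.
    """
    ax, ay, az = accel
    pitch_acc = math.atan2(-ax, math.hypot(ay, az))  # gravity-referenced pitch
    pitch_gyro = pitch_prev + gyro_rate * dt         # integrated gyro pitch
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_acc
```

With the phone at rest and level, repeated updates pull an initially wrong pitch estimate back toward zero; a Kalman filter plays the same smoothing role with a statistically derived, time-varying gain instead of the fixed `alpha`.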
Determining the driving state of the vehicle on the coordinate system of the vehicle based on the plurality of sensor data and the positioning data further includes: acquiring attitude angle data sets at a plurality of moments based on the acceleration data sets, the angular velocity data sets, and the sets of geomagnetic intensity and geomagnetic direction; determining a conversion relationship between the coordinate system of the mobile terminal and a reference coordinate system based on the attitude angle data sets; acquiring a set of heading angle data of the vehicle based on the positioning data; determining a conversion relationship between the reference coordinate system and the coordinate system of the vehicle based on the set of heading angle data of the vehicle; determining a conversion relationship between the coordinate system of the mobile terminal and the coordinate system of the vehicle based on the two conversion relationships above; and determining the driving state of the vehicle on the coordinate system of the vehicle based on the conversion relationship between the coordinate system of the mobile terminal and the coordinate system of the vehicle.
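Restricted to the horizontal plane, the chained conversion above can be sketched as two rotations composed together. This is an illustration under assumed yaw/heading conventions, not the patent's full three-dimensional attitude solution:

```python
import math

def rot2(theta):
    """2x2 counter-clockwise rotation matrix as nested lists."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def matvec(m, v):
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def phone_to_vehicle(acc_phone_xy, phone_yaw, vehicle_heading):
    """Rotate a horizontal phone-frame acceleration into the vehicle frame.

    phone_yaw:       phone heading w.r.t. the reference frame, from the
                     attitude angle data (rad)
    vehicle_heading: vehicle heading w.r.t. the same reference frame,
                     from the positioning data (rad)
    The composition realizes
        R(vehicle <- phone) = R(vehicle <- reference) * R(reference <- phone).
    """
    to_reference = rot2(phone_yaw)        # phone frame -> reference frame
    to_vehicle = rot2(-vehicle_heading)   # reference frame -> vehicle frame
    return matvec(to_vehicle, matvec(to_reference, acc_phone_xy))
```

Once in the vehicle frame, the two components can be read directly as lateral and longitudinal acceleration regardless of how the phone is oriented in the cabin.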
The processes described above may also be implemented as computer software programs according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program comprising program code for performing the methods described above.
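As a concrete, hedged illustration of such a program, the feature matching against a template time-behavior sequence (elaborated in the claims) could use normalized cross-correlation over a lateral-acceleration window; the S-shaped template, the 0.8 threshold, and the sign convention (positive lateral acceleration pointing left) are all assumptions:

```python
def ncc(a, b):
    """Normalized cross-correlation of two equal-length sequences, in [-1, 1]."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    da = [x - mean_a for x in a]
    db = [x - mean_b for x in b]
    num = sum(x * y for x, y in zip(da, db))
    den = (sum(x * x for x in da) * sum(y * y for y in db)) ** 0.5
    return num / den if den else 0.0

def classify_lateral(window, template_left, threshold=0.8):
    """Match a lateral-acceleration window against a left-lane-change
    template; the mirrored template covers a change to the right."""
    score_left = ncc(window, template_left)
    score_right = ncc(window, [-x for x in template_left])
    if score_left >= threshold and score_left >= score_right:
        return "lane change left"
    if score_right >= threshold:
        return "lane change right"
    return "straight"

# Hypothetical S-shaped lateral-acceleration template (m/s^2) for a
# left lane change: push left, then settle back into the new lane.
TEMPLATE_LEFT = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5, 0.0]
```

A production system would match all three channels (lateral and longitudinal acceleration plus heading angular velocity) rather than the single channel shown here.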
The embodiments of the disclosure provide a method for displaying a running state of a vehicle, a mobile terminal, an electronic device, and a computer storage medium. By using the sensors built into the mobile terminal as the signal source, without depending on vehicle-mounted equipment, they avoid the high cost of combined inertial navigation equipment. Meanwhile, through multi-sensor fusion and coordinate system calculation, the embodiments of the disclosure can accurately and efficiently identify and display various driving states of the vehicle without requiring the mobile terminal to be fixed in a strict position. By displaying an animation of the driving state of the vehicle on the mobile terminal, embodiments of the present disclosure also provide the driver with driving-related information more intuitively.
According to yet another aspect of the present disclosure, there is also provided an apparatus for displaying a driving state of a vehicle. Fig. 5 shows a schematic diagram of an apparatus 2000 according to an embodiment of the present disclosure.
As shown in fig. 5, the apparatus 2000 may include one or more processors 2010 and one or more memories 2020, wherein the memory 2020 stores computer readable code which, when executed by the one or more processors 2010, can perform the method of displaying a driving state of a vehicle as described above.
The processor in embodiments of the present disclosure may be an integrated circuit chip having signal processing capabilities. The processor may be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components, and can implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like, and may be of the X86 architecture or the ARM architecture.
In general, the various example embodiments of the disclosure may be implemented in hardware or special purpose circuits, software, firmware, logic, or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software executed by a controller, microprocessor, or other computing device. While aspects of the embodiments of the present disclosure are illustrated or described as block diagrams, flowcharts, or some other pictorial representation, it should be understood that the blocks, apparatus, systems, techniques, or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controllers or other computing devices, or some combination thereof.
For example, a method or apparatus according to embodiments of the present disclosure may also be implemented by means of the architecture of computing device 3000 shown in fig. 6. As shown in fig. 6, computing device 3000 may include a bus 3010, one or more CPUs 3020, a read only memory (ROM) 3030, a random access memory (RAM) 3040, a communication port 3050 connected to a network, an input/output component 3060, a hard disk 3070, and the like. A storage device in the computing device 3000, such as the ROM 3030 or the hard disk 3070, may store various data or files used in the processing and/or communication of the method for displaying a running state of a vehicle provided by the present disclosure, as well as program instructions executed by the CPU. The computing device 3000 may also include a user interface 3080. Of course, the architecture shown in fig. 6 is merely exemplary, and one or more components of the computing device shown in fig. 6 may be omitted as needed when implementing different devices.
According to yet another aspect of the present disclosure, a computer-readable storage medium is also provided. Fig. 7 shows a schematic diagram 4000 of a storage medium according to the present disclosure.
As shown in fig. 7, the computer storage medium 4020 has computer readable instructions 4010 stored thereon. When the computer readable instructions 4010 are executed by a processor, a method of displaying a driving state of a vehicle according to an embodiment of the present disclosure described with reference to the above drawings may be performed. The computer readable storage medium in embodiments of the present disclosure may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), or flash memory. Volatile memory can be random access memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synchronous link dynamic random access memory (SLDRAM), and direct Rambus random access memory (DR RAM). It should be noted that the memory of the methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
Embodiments of the present disclosure also provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. A processor of a computer device reads the computer instructions from the computer readable storage medium and executes them, so that the computer device performs a method of displaying a driving state of a vehicle according to an embodiment of the present disclosure.
It is noted that the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The exemplary embodiments of the present disclosure described in detail above are illustrative only and are not limiting. Those skilled in the art will understand that various modifications and combinations of these embodiments or features thereof may be made without departing from the principles and spirit of the disclosure, and such modifications should fall within the scope of the disclosure.
Claims (14)
1. A method of displaying a driving state of a vehicle, the method being performed by a mobile terminal located in a vehicle that is being driven, the method comprising:
acquiring a total lane of a road on which the vehicle runs and an initial lane in which the vehicle runs;
displaying a user interface, wherein the user interface comprises a road picture corresponding to the total lane, and the road picture corresponding to the total lane comprises a road picture corresponding to the initial lane;
displaying a virtual animation object corresponding to the vehicle on the road picture, wherein the virtual animation object is displayed on the road picture corresponding to the initial lane;
acquiring positioning data of the mobile terminal;
acquiring a plurality of sensor data related to a running state of the vehicle, the plurality of sensor data being based on a coordinate system of the mobile terminal, the plurality of sensor data including acceleration data, angular velocity data, geomagnetic intensity data, and geomagnetic direction data;
determining a driving state of the vehicle on a coordinate system of the vehicle based on the plurality of sensor data and the positioning data;
dynamically updating and displaying the virtual animation object on the road picture corresponding to the total lane based on the initial lane in which the vehicle runs and the determined driving state of the vehicle;
displaying a function switching area on the user interface, wherein the function switching area comprises a plurality of custom setting buttons; and
displaying, in a case that any one of the plurality of custom setting buttons is triggered, a custom setting interface corresponding to the triggered custom setting button;
wherein displaying the custom setting interface corresponding to the triggered custom setting button comprises:
displaying a filter setting item on the custom setting interface, wherein the filter setting specified by the filter setting item is used to filter an acceleration data set, an angular velocity data set, and a set of geomagnetic intensity and geomagnetic direction,
wherein the filter setting item includes a complementary filter enable button for triggering whether to filter the acceleration data set, the angular velocity data set, and the set of geomagnetic intensity and geomagnetic direction with a complementary filter; and/or
the filter setting item includes a Kalman filter enable button for triggering whether to filter the acceleration data set, the angular velocity data set, and the set of geomagnetic intensity and geomagnetic direction with a Kalman filter.
2. The method of claim 1, wherein dynamically updating and displaying the virtual animation object on the road picture based on the initial lane in which the vehicle runs and the determined driving state of the vehicle further comprises:
determining a lane change of the vehicle based on the initial lane in which the vehicle runs and the determined driving state of the vehicle; and
displaying an animation in which the virtual animation object dynamically moves from the road picture corresponding to the initial lane to the road picture corresponding to the changed lane.
3. The method of claim 1, wherein acquiring the total lane of the road on which the vehicle runs and the initial lane in which the vehicle runs further comprises:
acquiring a total lane of the road on which the vehicle runs and an initial lane in which the vehicle runs, as input by a user;
acquiring the total lane of the road on which the vehicle runs and the initial lane in which the vehicle runs based on the positioning data of the mobile terminal and offline map information stored in the mobile terminal; or
querying information of the road on which the vehicle runs based on the positioning data of the mobile terminal, so as to acquire the total lane of the road on which the vehicle runs and the initial lane in which the vehicle runs.
4. The method of any of claims 1-3, further comprising:
displaying a dynamic text area on the user interface; and
displaying, in text form in the dynamic text area, the driving state of the vehicle on the coordinate system of the vehicle;
wherein the driving state includes at least one of straight running, changing lanes to the left, changing lanes to the right, turning left, turning right, accelerating, and decelerating.
5. The method of any of claims 1-3, further comprising:
displaying a vehicle instant data area on the user interface; and
displaying, in numerical form in the vehicle instant data area, the running state of the vehicle on the coordinate system of the vehicle;
wherein the running state includes at least some of the lateral acceleration of the vehicle, the longitudinal acceleration of the vehicle, the heading angular velocity of the vehicle, the heading angle of the vehicle, the roll angle of the vehicle, and the pitch angle of the vehicle.
6. The method of claim 1, further comprising:
displaying a plurality of dynamic curve buttons on the user interface;
displaying, when a first dynamic curve button among the plurality of dynamic curve buttons is triggered, a display interface of the dynamic curve corresponding to the first dynamic curve button;
wherein the dynamic curve includes at least one of an acceleration curve based on a reference coordinate system, an acceleration curve based on the coordinate system of the mobile terminal, an attitude angle curve based on the coordinate system of the mobile terminal, an angular velocity curve based on the coordinate system of the mobile terminal, a magnetic induction intensity curve based on the coordinate system of the mobile terminal, and a state parameter curve based on the coordinate system of the vehicle.
7. The method of any of claims 1-3, wherein the custom settings comprise at least one of an acceleration calculation setting, an attitude angle calculation setting, a general setting, and a driving data recording setting.
8. The method of claim 7, wherein acquiring the plurality of sensor data related to the running state of the vehicle further comprises:
acquiring, with an accelerometer of the mobile terminal, acceleration data sets in at least three axial directions of the mobile terminal at a plurality of moments during the traveling of the vehicle, wherein the at least three axial directions include a Y axis, an X axis, and a Z axis of the coordinate system of the mobile terminal;
acquiring, with a gyroscope of the mobile terminal, angular velocity data sets of the mobile terminal rotating about the at least three axial directions at the plurality of moments during the traveling of the vehicle;
acquiring, with a magnetometer of the mobile terminal, sets of geomagnetic intensity and geomagnetic direction of the mobile terminal at the plurality of moments during the traveling of the vehicle; and
determining the plurality of sensor data related to the running state of the vehicle based on the acceleration data sets, the angular velocity data sets, and the sets of geomagnetic intensity and geomagnetic direction.
9. The method of claim 8, wherein determining the driving state of the vehicle on the coordinate system of the vehicle based on the plurality of sensor data and the positioning data further comprises:
acquiring attitude angle data sets at a plurality of moments based on the acceleration data sets, the angular velocity data sets, and the sets of geomagnetic intensity and geomagnetic direction;
determining a conversion relationship between the coordinate system of the mobile terminal and a reference coordinate system based on the attitude angle data sets at the plurality of moments;
acquiring a set of heading angle data of the vehicle based on the positioning data;
determining a conversion relationship between the reference coordinate system and the coordinate system of the vehicle based on the set of heading angle data of the vehicle;
determining a conversion relationship between the coordinate system of the mobile terminal and the coordinate system of the vehicle based on the conversion relationship between the coordinate system of the mobile terminal and the reference coordinate system and the conversion relationship between the reference coordinate system and the coordinate system of the vehicle; and
determining the driving state of the vehicle on the coordinate system of the vehicle based on the conversion relationship between the coordinate system of the mobile terminal and the coordinate system of the vehicle.
10. The method of claim 9, wherein determining the driving state of the vehicle on the coordinate system of the vehicle based on the conversion relationship between the coordinate system of the mobile terminal and the coordinate system of the vehicle further comprises:
converting the plurality of sensor data and the positioning data into a set of lateral accelerations, a set of longitudinal accelerations, and a set of heading angular velocities based on the coordinate system of the vehicle, using the conversion relationship between the coordinate system of the mobile terminal and the coordinate system of the vehicle;
extracting a time-behavior sequence of the vehicle from the set of lateral accelerations, the set of longitudinal accelerations, and the set of heading angular velocities; and
acquiring a template time-behavior sequence, and performing feature matching between the template time-behavior sequence and the time-behavior sequence of the vehicle to determine the driving state of the vehicle on the coordinate system of the vehicle.
11. A mobile terminal for displaying a driving state of a vehicle, comprising a display screen, a plurality of sensors, a positioning component, and a processor, wherein
the plurality of sensors are configured to:
acquire a plurality of sensor data related to the running state of the vehicle, the plurality of sensor data being based on a coordinate system of the mobile terminal, the plurality of sensor data including acceleration data, angular velocity data, geomagnetic intensity data, and geomagnetic direction data;
the positioning component is configured to:
acquire positioning data of the mobile terminal;
the processor is configured to:
acquire a total lane of a road on which the vehicle runs and an initial lane in which the vehicle runs;
acquire the positioning data of the mobile terminal from the positioning component;
acquire the plurality of sensor data from the plurality of sensors;
determine a driving state of the vehicle on a coordinate system of the vehicle based on the plurality of sensor data and the positioning data; and
determine a lane change of the vehicle based on the initial lane in which the vehicle runs and the determined driving state of the vehicle;
the display screen is configured to:
display a user interface, wherein the user interface comprises a road picture corresponding to the total lane, and the road picture corresponding to the total lane comprises a road picture corresponding to the initial lane;
display a virtual animation object corresponding to the vehicle on the road picture, wherein the virtual animation object is displayed on the road picture corresponding to the initial lane in which the vehicle runs;
display an animation in which the virtual animation object dynamically moves from the road picture corresponding to the initial lane to the road picture corresponding to the changed lane;
display a function switching area, wherein the function switching area comprises a plurality of custom setting buttons; and
display, in a case that any one of the plurality of custom setting buttons is triggered, a custom setting interface corresponding to the triggered custom setting button;
wherein displaying the custom setting interface corresponding to the triggered custom setting button comprises:
displaying a filter setting item on the custom setting interface, wherein the filter setting specified by the filter setting item is used to filter an acceleration data set, an angular velocity data set, and a set of geomagnetic intensity and geomagnetic direction,
wherein the filter setting item includes a complementary filter enable button for triggering whether to filter the acceleration data set, the angular velocity data set, and the set of geomagnetic intensity and geomagnetic direction with a complementary filter; and/or
the filter setting item includes a Kalman filter enable button for triggering whether to filter the acceleration data set, the angular velocity data set, and the set of geomagnetic intensity and geomagnetic direction with a Kalman filter.
12. The mobile terminal of claim 11, wherein the display screen is further configured to:
displaying a dynamic text region on the user interface, and displaying, in text form in the dynamic text region, the driving state of the vehicle on the coordinate system of the vehicle, wherein the driving state comprises at least one of straight running, changing lanes to the left, changing lanes to the right, turning left, turning right, accelerating, and decelerating; and/or
displaying a vehicle instant data area on the user interface, and displaying, in numerical form in the vehicle instant data area, the running state of the vehicle on the coordinate system of the vehicle, wherein the running state comprises at least some of the lateral acceleration of the vehicle, the longitudinal acceleration of the vehicle, the heading angular velocity of the vehicle, the heading angle of the vehicle, the roll angle of the vehicle, and the pitch angle of the vehicle.
13. An electronic device, comprising:
One or more processors; and
One or more memories, wherein the memories have stored therein computer readable code, which when executed by the one or more processors, causes the electronic device to perform the method of any of claims 1-9.
14. A computer readable storage medium having stored thereon computer executable instructions which, when executed by a processor of an electronic device, cause the electronic device to implement the method of any of claims 1-9.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011180455.8A CN112304330B (en) | 2020-10-29 | 2020-10-29 | Method for displaying running state of vehicle and electronic equipment |
PCT/CN2021/116170 WO2022088973A1 (en) | 2020-10-29 | 2021-09-02 | Method for displaying vehicle driving state, and electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011180455.8A CN112304330B (en) | 2020-10-29 | 2020-10-29 | Method for displaying running state of vehicle and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112304330A CN112304330A (en) | 2021-02-02 |
CN112304330B true CN112304330B (en) | 2024-05-24 |
Family
ID=74331437
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011180455.8A Active CN112304330B (en) | 2020-10-29 | 2020-10-29 | Method for displaying running state of vehicle and electronic equipment |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112304330B (en) |
WO (1) | WO2022088973A1 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112304330B (en) * | 2020-10-29 | 2024-05-24 | 腾讯科技(深圳)有限公司 | Method for displaying running state of vehicle and electronic equipment |
CN113619589B (en) * | 2021-07-22 | 2022-11-15 | 中汽创智科技有限公司 | Method and device for determining driving behavior information, electronic equipment and storage medium |
CN113790732B (en) * | 2021-08-06 | 2023-09-01 | 荣耀终端有限公司 | Method and device for generating position information |
CN114279446B (en) * | 2021-12-22 | 2023-11-03 | 广东汇天航空航天科技有限公司 | Aerocar navigation attitude measurement method and device and aerocar |
CN114291106A (en) * | 2021-12-30 | 2022-04-08 | 阿波罗智联(北京)科技有限公司 | Information display method and device for vehicle, electronic equipment and storage medium |
CN115293301B (en) * | 2022-10-09 | 2023-01-31 | 腾讯科技(深圳)有限公司 | Estimation method and device for lane change direction of vehicle and storage medium |
CN115607955A (en) * | 2022-10-13 | 2023-01-17 | 蔚来汽车科技(安徽)有限公司 | Vehicle machine system, method for realizing extended display and storage medium |
CN116105747B (en) * | 2023-04-07 | 2023-07-04 | 江苏泽景汽车电子股份有限公司 | Dynamic display method for navigation path, storage medium and electronic equipment |
CN118331677B (en) * | 2024-06-12 | 2024-09-20 | 华域视觉科技(上海)有限公司 | Animation object display method, electronic device, storage medium and vehicle |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5061264B1 (en) * | 2012-03-23 | 2012-10-31 | National University Corporation Chiba University | Compact attitude sensor |
CN103818333A (en) * | 2012-11-16 | 2014-05-28 | Xi'an Zhongzhi Huize Photoelectric Technology Co., Ltd. | Vehicle rollover-prevention early-warning system |
CN106205175A (en) * | 2015-05-28 | 2016-12-07 | LG Electronics Inc. | Display device for vehicle and vehicle |
CN110320891A (en) * | 2019-07-09 | 2019-10-11 | CRRC Qingdao Sifang Rolling Stock Research Institute Co., Ltd. | CAN bus message maintenance monitoring system and monitoring method for rail vehicle braking systems |
CN110332979A (en) * | 2019-06-12 | 2019-10-15 | Nanjing Guoke Software Co., Ltd. | Vehicle vibration monitoring alarm |
CN110466516A (en) * | 2019-07-11 | 2019-11-19 | Beijing Jiaotong University | Curved-road automatic vehicle lane-change trajectory planning method based on nonlinear programming |
CN111034164A (en) * | 2017-06-04 | 2020-04-17 | Apple Inc. | User interface camera effects |
CN111551174A (en) * | 2019-12-18 | 2020-08-18 | Wuxi Bewis Sensing Technology Co., Ltd. | High-dynamic vehicle attitude calculation method and system based on a multi-sensor inertial navigation system |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3018030B1 (en) * | 2014-11-10 | 2019-04-24 | Magneti Marelli S.p.A. | Device for the detection of the attitude of motor vehicles |
KR101942793B1 (en) * | 2015-07-03 | 2019-01-28 | LG Electronics Inc. | Driver assistance apparatus and vehicle having the same |
KR20180116574A (en) * | 2017-04-17 | 2018-10-25 | LG Electronics Inc. | Mobile terminal |
KR20190023550A (en) * | 2017-08-29 | 2019-03-08 | Hyundai Motor Company | Driving assist system using navigation information and operating method thereof |
CN107843255A (en) * | 2017-10-24 | 2018-03-27 | Yanshan University | Engineering-vehicle driving attitude measurement system and method for motion reproduction |
US10921135B2 (en) * | 2018-10-12 | 2021-02-16 | Baidu Usa Llc | Real-time map generation scheme for autonomous vehicles based on prior driving trajectories |
CN110455300B (en) * | 2019-09-03 | 2021-02-19 | Guangzhou Xiaopeng Motors Technology Co., Ltd. | Navigation method, navigation display device, vehicle and machine-readable medium |
CN111137298B (en) * | 2020-01-02 | 2021-11-16 | CRRC Zhuzhou Locomotive Co., Ltd. | Vehicle automatic driving method, device, system and storage medium |
CN111623795B (en) * | 2020-05-28 | 2022-04-15 | Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. | Live-action navigation icon display method, device, equipment and medium |
CN112304330B (en) * | 2020-10-29 | 2024-05-24 | Tencent Technology (Shenzhen) Co., Ltd. | Method for displaying running state of vehicle and electronic equipment |
- 2020-10-29 CN CN202011180455.8A patent/CN112304330B/en active Active
- 2021-09-02 WO PCT/CN2021/116170 patent/WO2022088973A1/en active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5061264B1 (en) * | 2012-03-23 | 2012-10-31 | National University Corporation Chiba University | Compact attitude sensor |
CN103818333A (en) * | 2012-11-16 | 2014-05-28 | Xi'an Zhongzhi Huize Photoelectric Technology Co., Ltd. | Vehicle rollover-prevention early-warning system |
CN106205175A (en) * | 2015-05-28 | 2016-12-07 | LG Electronics Inc. | Display device for vehicle and vehicle |
CN111034164A (en) * | 2017-06-04 | 2020-04-17 | Apple Inc. | User interface camera effects |
CN110332979A (en) * | 2019-06-12 | 2019-10-15 | Nanjing Guoke Software Co., Ltd. | Vehicle vibration monitoring alarm |
CN110320891A (en) * | 2019-07-09 | 2019-10-11 | CRRC Qingdao Sifang Rolling Stock Research Institute Co., Ltd. | CAN bus message maintenance monitoring system and monitoring method for rail vehicle braking systems |
CN110466516A (en) * | 2019-07-11 | 2019-11-19 | Beijing Jiaotong University | Curved-road automatic vehicle lane-change trajectory planning method based on nonlinear programming |
CN111551174A (en) * | 2019-12-18 | 2020-08-18 | Wuxi Bewis Sensing Technology Co., Ltd. | High-dynamic vehicle attitude calculation method and system based on a multi-sensor inertial navigation system |
Non-Patent Citations (4)
Title |
---|
Hyun-Eui Kim; Jeho Nam; Keehoon Hong; Jinwoong Kim. Noise-filtering method for large-scale holographic 3D display. 2018 IEEE International Conference on Imaging Systems and Techniques, 2018, full text. *
Yang Lan; Hao Ruru; Wang Runmin; Qi Xiuzhen. A real-time measurement method for vehicle motion attitude parameters. China Sciencepaper, 2016-10-31, 11(19), full text. *
Fu Lei; Zhang Zheng; Yu Yi. MEMS attitude calculation based on improved explicit complementary filtering. Automation & Instrumentation, 2018, (No. 11), full text. *
Liu Jun; Nie Fei; Cai Junyu; Xiong Minglu; Tao Changling. Research on an Android-based vehicle state measurement and prediction system. Automobile Technology, 2015, (02), full text. *
Also Published As
Publication number | Publication date |
---|---|
CN112304330A (en) | 2021-02-02 |
WO2022088973A1 (en) | 2022-05-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112304330B (en) | Method for displaying running state of vehicle and electronic equipment | |
CN110160542B (en) | Method and device for positioning lane line, storage medium and electronic device | |
CN110556012B (en) | Lane positioning method and vehicle positioning system | |
CN113554698B (en) | Vehicle pose information generation method and device, electronic equipment and storage medium | |
CN111959495B (en) | Vehicle control method and device and vehicle | |
CN106814753B (en) | Target position correction method, device and system | |
US20210370970A1 (en) | Positioning Method And Apparatus, Autonomous Driving Vehicle, Electronic Device And Storage Medium | |
JP6335556B2 (en) | Information query by pointing | |
CN108801276A (en) | High-precision map generation method and device | |
CN111562603B (en) | Navigation positioning method, equipment and storage medium based on dead reckoning | |
CN106705959A (en) | Method and device for detecting course of mobile terminal | |
KR20230008000A (en) | Positioning method and apparatus based on lane line and feature point, electronic device, storage medium, computer program and autonomous vehicle | |
CN108235809A (en) | Device-cloud combined positioning method and device, electronic equipment and computer program product | |
US20230392938A1 (en) | Navigation information processing method, electronic device, and storage medium | |
WO2024139716A1 (en) | Data processing method and apparatus, and device, computer-readable storage medium and computer program product | |
Zhou et al. | DeepVIP: Deep learning-based vehicle indoor positioning using smartphones | |
CN107942090A (en) | Spacecraft attitude angular velocity estimation method based on optical flow information extracted from blurred star maps | |
CN112363196B (en) | Vehicle attribute determining method, device, storage medium and electronic equipment | |
CN110780325A (en) | Method and device for positioning moving object and electronic equipment | |
CN115560744A (en) | Robot, multi-sensor-based three-dimensional mapping method and storage medium | |
CN110068341A (en) | Automobile inertial navigation application method and device | |
CN104699987A (en) | Inertial arm action capture data fusion method | |
CN116718196B (en) | Navigation method, device, equipment and computer readable storage medium | |
Trojnacki et al. | Determination of motion parameters with inertial measurement units – Part 1: mathematical formulation of the algorithm |
TWI811733B (en) | Attitude measurement method, navigation method and system of transportation vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
REG | Reference to a national code | Ref country code: HK; Ref legal event code: DE; Ref document number: 40038332; Country of ref document: HK |
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||