CN114882579A - Control method and device of vehicle-mounted screen and vehicle - Google Patents

Control method and device of vehicle-mounted screen and vehicle Download PDF

Info

Publication number
CN114882579A
Authority
CN
China
Prior art keywords
vehicle
screen
human eyes
seat
mounted screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202110078963.3A
Other languages
Chinese (zh)
Inventor
唐帅 (Tang Shuai)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Audi AG
Original Assignee
Audi AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi AG
Priority to CN202110078963.3A
Publication of CN114882579A
Legal status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Remote Sensing (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

The invention discloses a control method and device for a vehicle-mounted screen, and a vehicle. The method comprises the following steps: acquiring three-dimensional coordinates of human eyes in the vehicle; determining the in-vehicle seat associated with the three-dimensional coordinates of the human eyes; determining the vehicle-mounted screen controlled by the human eyes according to the seat and the association relationship between seats and vehicle-mounted screens; determining the sight-line features of the human eyes; and sending a specific control instruction to the vehicle-mounted screen controlled by the human eyes according to the determined sight-line features. With the present disclosure, a screen can be controlled only by the passenger in the seat associated with that screen, which improves the accuracy of sight-line control of vehicle-mounted screens. Deciding whether to turn a screen on based on whether the passenger is watching it saves battery power and reduces battery drain. It also avoids the light stimulation and discomfort caused by a screen that stays on in dim light while the passenger is not watching it, improving the riding experience.

Description

Control method and device of vehicle-mounted screen and vehicle
Technical Field
The invention relates to the technical field of vehicle control, in particular to a control method and device for a vehicle-mounted screen and a vehicle.
Background
With the development of communication technology and the great abundance of audio and video content, and to make riding in a vehicle more comfortable, modern vehicle designs usually place a network-connected vehicle-mounted screen in front of each seat so that occupants can enjoy audio and video entertainment while using the vehicle. The power for the vehicle-mounted screens is mainly supplied by the vehicle battery. When a passenger is not watching or using a vehicle-mounted screen that keeps playing, battery power is easily wasted and wear on the vehicle battery increases. Moreover, in a dark environment, such as at night, the light of a screen that stays on may cause discomfort to a passenger who is not watching it. A technical solution is therefore needed that allows vehicle occupants to control the vehicle-mounted screens they use conveniently, accurately and flexibly.
Disclosure of Invention
An object of the present invention is to provide a new solution for on-board screen control.
According to a first aspect of the present disclosure, there is provided an in-vehicle screen control method including:
acquiring three-dimensional coordinates of human eyes in the vehicle;
determining an in-vehicle seat associated with the three-dimensional coordinates of the human eye;
determining the vehicle-mounted screen controlled by the human eyes according to the seat and the association relationship between the seat and the vehicle-mounted screen;
determining the sight line characteristics of the human eyes;
and sending a specific control instruction to the vehicle-mounted screen controlled by the human eyes according to the determined sight characteristics of the human eyes.
Preferably, the method further comprises the step of establishing the association relationship between the seat and the vehicle-mounted screen:
acquiring a three-dimensional coordinate range of each seat in the vehicle and a three-dimensional coordinate range of each vehicle-mounted screen;
and establishing the association relationship between the seats and the vehicle-mounted screen according to a preset rule, the three-dimensional coordinate range of each seat in the vehicle and the three-dimensional coordinate range of each vehicle-mounted screen.
Preferably, the method further comprises:
and the three-dimensional coordinate range of each seat and the three-dimensional coordinate range of each vehicle-mounted screen are adjusted in real time according to the adjustment of the position of the seat.
Preferably, in the method, in the association relationship between the seat and the vehicle-mounted screen: each seat is associated with at least one in-vehicle screen.
Preferably, the sight line feature of the human eye comprises:
the vehicle-mounted screen has no sight line, the sight line does not stay in the three-dimensional coordinate range of the vehicle-mounted screen, and the sight line stays in the three-dimensional coordinate range of the vehicle-mounted screen.
Preferably, the control instruction includes: an open command and a close command.
Preferably, in the method, determining the sight line characteristics of the human eyes and sending a specific control instruction to the vehicle-mounted screen controlled by the human eyes according to the determined sight line characteristics of the human eyes include:
determining that there is no line of sight for the human eye;
and sending a closing instruction to the vehicle-mounted screen controlled by the human eyes.
Preferably, in the method, determining the sight line characteristics of the human eyes and sending a specific control instruction to the vehicle-mounted screen controlled by the human eyes according to the determined sight line characteristics of the human eyes include:
determining that the sight of the human eyes stays in the three-dimensional coordinate range of the vehicle-mounted screen;
determining that the duration of the sight line of the human eyes staying in the three-dimensional coordinate range of the vehicle-mounted screen exceeds a preset threshold;
and sending an opening instruction to the vehicle-mounted screen controlled by the human eyes.
Preferably, in the method, determining the sight line characteristics of the human eyes and sending a specific control instruction to the vehicle-mounted screen controlled by the human eyes according to the determined sight line characteristics of the human eyes include:
determining that the sight of the human eyes does not stay in the three-dimensional coordinate range of the vehicle-mounted screen;
determining that the duration that the sight of the human eyes does not stay in the three-dimensional coordinate range of the vehicle-mounted screen exceeds a preset threshold;
and sending a closing instruction to the vehicle-mounted screen controlled by the human eyes.
According to a second aspect of the present disclosure, there is also provided an in-vehicle screen control device, characterized by comprising:
the coordinate acquisition module is used for acquiring three-dimensional coordinates of human eyes in the vehicle;
a seat determination module for determining an in-vehicle seat associated with the three-dimensional coordinates of the human eye;
the screen determining module is used for determining the vehicle-mounted screen controlled by the human eyes according to the seat and the association relationship between the seat and the vehicle-mounted screen;
the characteristic determining module is used for determining the sight line characteristic of the human eyes; and
and the communication module is used for sending a specific control instruction to the vehicle-mounted screen controlled by the human eyes according to the determined sight characteristics of the human eyes.
According to a third aspect of the present disclosure, there is also provided a vehicle, including the on-board screen control device disclosed in the second aspect, or including a camera, an on-board screen, a memory and a processor, where the memory stores computer instructions, and the computer instructions are executed by the processor to implement the method for controlling the on-board screen in any one of the first aspect of the present disclosure.
The beneficial effects of the method are as follows: the three-dimensional coordinates of human eyes in the vehicle are acquired; the in-vehicle seat associated with the three-dimensional coordinates of the human eyes is determined; the vehicle-mounted screen controlled by the human eyes is determined according to the seat and the association relationship between the seat and the vehicle-mounted screen; the sight-line features of the human eyes are determined; and a specific control instruction is sent to the vehicle-mounted screen controlled by the human eyes according to the determined sight-line features. In this way, a screen can be controlled only by the passenger in the seat associated with that screen, which improves the accuracy of sight-line control of vehicle-mounted screens. Deciding whether to turn a screen on based on whether the passenger is watching it saves battery power and reduces battery drain. At the same time, the light stimulation and discomfort caused by a screen that stays on in dim light while the passenger is not watching it can be avoided, improving the riding experience.
Other features of the present invention and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a block diagram showing an example of a hardware configuration that can be used to implement an in-vehicle screen control method of an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a control method for a vehicle screen according to an embodiment of the present disclosure;
FIG. 3 is a diagram illustrating a method for controlling an in-vehicle screen according to an embodiment of the present invention;
FIG. 4 is a block schematic diagram of a control device of an in-vehicle screen according to one embodiment;
FIG. 5 is a schematic illustration of a vehicle according to one embodiment.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
< implementation Environment and hardware configuration >
Fig. 1 is an exemplary block diagram showing a hardware configuration of an in-vehicle screen control system 100 that can be used to implement an embodiment of the present invention.
As shown in fig. 1, the in-vehicle screen control system 100 includes a vehicle 1000, a server 2000, and a network 3000. It will be appreciated that the overall architecture, arrangement, and operation, as well as the individual components, of the system illustrated here are well known in the art. The following paragraphs therefore merely provide an overview of one such exemplary environment; other systems that include or employ the system architecture shown in this implementation environment, or that provide the associated functionality described herein, may also serve as an implementation environment for the present subject matter.
The vehicle 1000 may be, for example, any of various types of automobiles, Multi-Purpose Vehicles (MPVs), Sport Utility Vehicles (SUVs), Cross Utility Vehicles (CUVs), Recreational Vehicles (RVs), Autonomous Vehicles (AVs), trucks, other mobile machines for transporting persons or goods, and the like. In many cases, the vehicle 1000 may be powered by, for example, an internal combustion engine. The vehicle 1000 may also be a Hybrid Electric Vehicle (HEV) powered by both an internal combustion engine and one or more electric motors, such as a Series Hybrid Electric Vehicle (SHEV), a Parallel Hybrid Electric Vehicle (PHEV), a Parallel-Series Hybrid Electric Vehicle (PSHEV), and so forth. The type of the vehicle 1000, the manner in which it is powered, and so on may take any form; the foregoing examples are not intended to be limiting.
The vehicle 1000 may be provided with an electronic system including, for example: a processor 1100, a memory 1200, an interface device 1300, a communication device 1400, an output device 1500, an input device 1600, a navigation device 1700, and the like. The processor 1100 may be a microprocessor MCU or the like. The memory 1200 includes, for example, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. The interface device 1300 includes, for example, a USB interface, a headphone interface, and the like. The communication device 1400 is capable of wired or wireless communication, for example, and also capable of short-range and long-range communication, for example.
The output device 1500 may be, for example, a device that outputs a signal, and may be a display device such as a liquid crystal display, a touch display, a speaker, or the like. The input device 1600 may include, for example, a touch screen, buttons, knobs, a keyboard, a microphone, a camera, and the like. The input device 1600 may acquire the status of the vehicle occupant in real time when it is a camera, and it may be matched to an embedded voice processing unit using Human Machine Interface (HMI) technology known in the art, or may be an independent component.
The navigation device 1700 includes, for example, a receiver for a Global Navigation Satellite System (GNSS) such as the Global Positioning System (GPS) or the BeiDou Navigation Satellite System (BDS), a navigation HMI (Human Machine Interface), and a route determination unit. The navigation device 1700 stores map information in a storage device such as an HDD (Hard Disk Drive) or flash memory. The receiver determines the position of the vehicle 1000 based on signals received from GNSS satellites. The position of the vehicle 1000 may also be determined or supplemented by an INS (Inertial Navigation System) using the outputs of vehicle sensors and the like. The route determination unit determines, for example with reference to the map information, a route from the position of the vehicle 1000 specified by the GNSS receiver (or an arbitrary input position) to the destination entered by the passenger using the navigation HMI. The map information is, for example, information on road shapes expressed by links representing roads and nodes connected by the links. The map information may include road curvature, POIs (Points Of Interest), geographical coordinate information of positions along the road, driving road information, and the like. The navigation device 1700 may also be implemented by the functions of a terminal device held by a passenger, such as a smartphone or tablet. The navigation device 1700 may transmit the current position and the destination to a navigation server via the communication device 1400 and acquire an equivalent route on the map from the navigation server. Through the navigation device, the vehicle's electronic system can determine the vehicle's geographic position, plan a driving route, and upload geographic position data to the server.
Although a plurality of devices of the vehicle 1000 are shown in fig. 1, the present technical solution may use only some of the devices therein, for example, the vehicle 1000 only involves the input device 1600 and the output device 1500. Alternatively, a light system not shown in fig. 1 controlled by the processor 1100, a sensor device for detecting the surroundings of the vehicle, and the like may be further included.
The server 2000 provides a service point for processes, databases, communication facilities, and the like. The server 2000 may comprise a unitary server or a distributed server across multiple computers or computer data centers. The server may be of various types, such as, but not limited to, a web server, a news server, a mail server, a message server, an advertisement server, a file server, an application server, an interaction server, a database server, or a proxy server. In some embodiments, each server may include hardware, software, or embedded logic components or a combination of two or more such components for performing the appropriate functions supported or implemented by the server. For example, a server, such as a blade server, a cloud server, etc., or may be a server group consisting of a plurality of servers, which may include one or more of the above types of servers, etc.
In one embodiment, the server 2000 may be as shown in fig. 1, including a processor 2100, a memory 2200, an interface device 2300, a communication device 2400, a display device 2500, an input device 2600. In other embodiments, the server 2000 may further include a speaker, a microphone, and the like, which are not limited herein.
The processor 2100 may be a dedicated server processor, or may be a desktop processor, a mobile version processor, or the like that meets performance requirements, and is not limited herein. The memory 2200 includes, for example, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. The interface device 2300 includes, for example, various bus interfaces, such as a serial bus interface (including a USB interface), a parallel bus interface, and the like. Communication device 2400 is capable of wired or wireless communication, for example. The display device 2500 is, for example, a liquid crystal display panel, a touch panel, or the like. The input device 2600 may include, for example, a touch screen, a keyboard, and the like. Although a plurality of devices of the server 2000 are illustrated in fig. 1, the present invention may relate to only some of the devices, for example, the server 2000 relates to only the memory 2200 and the processor 2100.
The network 3000 may include not only wireless and wired communication networks but, more generally, any communication mode capable of carrying the required communication, for example one or more of optical fiber communication, microwave communication, power line carrier communication, wired audio cable communication, ultrahigh-frequency radio communication, wireless spread-spectrum communication, infrared, Bluetooth, Radio Frequency Identification (RFID), keyless entry, smart key, and the like. In the in-vehicle screen control system 100 shown in fig. 1, the communication between the vehicle 1000 and the server 2000 may be wireless communication via the network 3000, and the communication between the vehicle 1000 and other vehicles may be wireless communication, Bluetooth, or the like. The network through which the vehicle 1000 communicates with the server 2000 and the network through which it communicates with other vehicles may be the same or different.
It should be understood that although fig. 1 shows only one vehicle 1000, server 2000, network 3000, it is not meant to limit the number of each, and multiple vehicles 1000, multiple servers 2000, multiple networks 3000 may be included in the on-board screen control system 100.
In the above description, the skilled person can design the instructions according to the solutions provided in the present disclosure. How the instructions control the operation of the processor is well known in the art and will not be described in detail herein.
The computing system shown in FIG. 1 is illustrative only and is not intended to limit the invention, its application, or uses in any way.
< method examples >
FIG. 2 is a flow diagram of an in-vehicle screen control method according to one embodiment. In this embodiment, a vehicle 1000 is taken as an example to describe the on-vehicle screen control method of the present embodiment.
As shown in fig. 2, the in-vehicle screen control method of the present embodiment may include the following steps S210 to S250:
in step S210, three-dimensional coordinates of human eyes in the vehicle are acquired.
In this embodiment, to meet occupants' audio, video and entertainment needs while riding, modern vehicle designs usually provide, in addition to the vehicle-mounted screen operated by the driver, a network-connected vehicle-mounted screen in front of each seat for passengers to use during the ride. The common states of a vehicle-mounted screen are an on state and an off state. Because the screens are usually powered by the vehicle battery, turning a screen off in time when no passenger occupies the corresponding seat, or when the passenger is not watching it, saves battery power and reduces waste; turning it back on in time when the passenger wants to watch again lets the passenger continue using it and improves the riding experience. The vehicle-mounted screen here may also include display means such as augmented reality or projection.
In this embodiment, so that each vehicle-mounted screen is controlled only by the corresponding user and control accuracy is improved, it must first be confirmed that a passenger is present in the vehicle before the screen is controlled. Because the method uses sight-line control, the three-dimensional coordinates of human eyes in the vehicle are acquired first, both to determine reliably that passengers are present in the vehicle and to further determine which seats they occupy.
In this embodiment, the three-dimensional coordinates of human eyes may be acquired, for example, by taking the center point of the vehicle's rear axle as the origin of a three-dimensional coordinate system, with the X-axis pointing toward the front of the vehicle, the Y-axis pointing toward the left side of the vehicle, and the Z-axis pointing toward the roof, perpendicular to the X- and Y-axes. The three-dimensional coordinates of human eyes in the vehicle are then acquired through one or more cameras installed in the vehicle.
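For illustration only, the following is a minimal Python sketch of step S210 under the coordinate convention just described. The camera interface (`detect_eyes()`), the data classes and all names are assumptions made for this example, not an implementation taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Point3D:
    """A point in the vehicle frame described above: origin at the rear-axle
    center, X toward the vehicle front, Y toward the left side, Z toward the roof."""
    x: float
    y: float
    z: float

@dataclass
class EyeObservation:
    """One detected pair of eyes, already expressed in the vehicle frame.
    gaze_direction is a direction vector; None means no usable line of sight
    (empty seat or closed eyes)."""
    position: Point3D
    gaze_direction: Optional[Point3D]

def acquire_eye_observations(cameras) -> List[EyeObservation]:
    """Step S210: collect eye observations from all in-vehicle cameras.
    Each camera is assumed to expose a detect_eyes() method returning
    observations in the vehicle frame."""
    observations: List[EyeObservation] = []
    for camera in cameras:
        observations.extend(camera.detect_eyes())
    return observations
```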
Step S220, determining the in-vehicle seat associated with the three-dimensional coordinates of the human eye.
In this embodiment, using the in-vehicle three-dimensional coordinate system set up in the foregoing example, the three-dimensional coordinate range of each seat in the vehicle can be acquired. The three-dimensional coordinates of the human eyes acquired in step S210 are matched against the coordinate ranges of the in-vehicle seats, and the seat whose three-dimensional coordinate range contains the eye coordinates is taken as the in-vehicle seat associated with the three-dimensional coordinates of the human eyes.
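Continuing the sketch above (and reusing its `Point3D` type), step S220 can be expressed as a simple containment test of the eye coordinates against each seat's coordinate range; the names are again illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class Box3D:
    """An axis-aligned three-dimensional coordinate range in the vehicle frame."""
    min_corner: Point3D
    max_corner: Point3D

    def contains(self, p: Point3D) -> bool:
        return (self.min_corner.x <= p.x <= self.max_corner.x
                and self.min_corner.y <= p.y <= self.max_corner.y
                and self.min_corner.z <= p.z <= self.max_corner.z)

def seat_for_eyes(eye_position: Point3D,
                  seat_ranges: Dict[str, Box3D]) -> Optional[str]:
    """Step S220: return the seat whose coordinate range contains the eye
    position, or None when the eyes fall outside every seat range."""
    for seat_id, box in seat_ranges.items():
        if box.contains(eye_position):
            return seat_id
    return None
```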
And step S230, determining the vehicle-mounted screen controlled by the human eyes according to the seat and the association relationship between the seat and the vehicle-mounted screen.
The seat-to-screen association relationship in this embodiment identifies which screens each seat is associated with; it may, for example, be a look-up table. Through this association relationship, the vehicle-mounted screen that the eyes in each seat can control is determined: a vehicle-mounted screen can be controlled only by the eyes in a seat with which it is associated in the relationship, and a screen that has no association with a seat cannot be controlled by the eyes in that seat. This improves the accuracy of screen control.
In one embodiment, the on-board screen control method may further include, for example, the step of establishing the relationship between the seat and the on-board screen. According to the vehicle interior three-dimensional coordinate system set by the foregoing example, the three-dimensional coordinate range of each seat and the three-dimensional coordinate range of each on-vehicle screen in the vehicle can be acquired. And establishing the association relationship between the seats and the vehicle-mounted screen according to a preset rule, the three-dimensional coordinate range of each seat in the vehicle and the three-dimensional coordinate range of each vehicle-mounted screen.
The preset rule may, for example, associate each seat with the vehicle-mounted screen closest to the front of that seat, based on the positions of the seats and screens in the vehicle. In general, the rule is preset so that passengers can conveniently view and use a vehicle-mounted screen while riding.
In this embodiment, in the established association relationship between a seat and a vehicle-mounted screen, the three-dimensional coordinate range of the seat and that of the screen may form a spatial correspondence with no occlusion between them.
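As an illustration of one possible preset rule (associate each seat with the nearest screen located ahead of it along the X-axis), the association table could be built as below, reusing the `Point3D` and `Box3D` helpers from the earlier sketches. The rule and all names are assumptions; the patent leaves the preset rule open.

```python
from typing import Dict, List

def box_center(box: Box3D) -> Point3D:
    return Point3D((box.min_corner.x + box.max_corner.x) / 2.0,
                   (box.min_corner.y + box.max_corner.y) / 2.0,
                   (box.min_corner.z + box.max_corner.z) / 2.0)

def build_seat_screen_association(seat_ranges: Dict[str, Box3D],
                                  screen_ranges: Dict[str, Box3D]) -> Dict[str, List[str]]:
    """Build a seat-to-screen look-up table: each seat gets the closest screen
    that lies in front of it (larger X in the vehicle frame). Additional
    screens (e.g. the center screen) can be appended per configuration so that
    each seat is associated with at least one screen."""
    association: Dict[str, List[str]] = {}
    for seat_id, seat_box in seat_ranges.items():
        seat_c = box_center(seat_box)
        candidates = []
        for screen_id, screen_box in screen_ranges.items():
            screen_c = box_center(screen_box)
            if screen_c.x > seat_c.x:  # the screen is ahead of the seat
                distance = ((screen_c.x - seat_c.x) ** 2
                            + (screen_c.y - seat_c.y) ** 2
                            + (screen_c.z - seat_c.z) ** 2) ** 0.5
                candidates.append((distance, screen_id))
        association[seat_id] = [screen_id for _, screen_id in sorted(candidates)[:1]]
    return association
```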
In one embodiment, because the on-board screen for a rear passenger is generally mounted on the back of a front seat, it moves together with that seat. To keep the seat-to-screen association valid, the on-board screen control method may therefore further include adjusting the three-dimensional coordinate range of each seat and of each on-board screen in real time as the seat position is adjusted, so that the screen associated with a seat can still be controlled accurately while passengers are riding.
In this embodiment, the real-time adjustment may be, for example, to obtain mechanical parameters of the seat and the vehicle-mounted screen, and adjust the three-dimensional coordinate ranges of the seat and the vehicle-mounted screen according to the change of the mechanical parameters. For example, the three-dimensional coordinate ranges of the adjusted seat and the vehicle-mounted screen can be obtained through the camera and the in-vehicle three-dimensional coordinate system, and the association relationship between the seat and the vehicle-mounted screen can be established according to a preset rule, the three-dimensional coordinate range of the adjusted seat and the three-dimensional coordinate range of the vehicle-mounted screen.
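A minimal sketch of the real-time adjustment, assuming the seat-adjustment mechanism reports a translation offset and that the screens mounted on a seat (listed in a hypothetical `attached_screens` mapping) move together with it:

```python
from typing import Dict, List

def shift_box(box: Box3D, dx: float, dy: float, dz: float) -> Box3D:
    """Translate a coordinate range by a seat-adjustment offset."""
    return Box3D(Point3D(box.min_corner.x + dx, box.min_corner.y + dy, box.min_corner.z + dz),
                 Point3D(box.max_corner.x + dx, box.max_corner.y + dy, box.max_corner.z + dz))

def on_seat_adjusted(seat_id: str, dx: float, dy: float, dz: float,
                     seat_ranges: Dict[str, Box3D],
                     screen_ranges: Dict[str, Box3D],
                     attached_screens: Dict[str, List[str]]) -> None:
    """Shift the adjusted seat's range and the ranges of any screens mounted on
    that seat (e.g. a rear-row screen on its back) by the same offset; the
    association table can then be rebuilt if needed."""
    seat_ranges[seat_id] = shift_box(seat_ranges[seat_id], dx, dy, dz)
    for screen_id in attached_screens.get(seat_id, []):
        screen_ranges[screen_id] = shift_box(screen_ranges[screen_id], dx, dy, dz)
```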
Since there may be multiple on-board screens in the vehicle that a passenger can conveniently view directly, and to ensure convenient use, in one embodiment each seat in the vehicle may be associated with at least one on-board screen. For example, the front passenger seat may be associated with the vehicle-mounted screen directly in front of it and also with the center vehicle-mounted screen used by the driver.
In this embodiment, when the front passenger seat is associated with the center vehicle-mounted screen used by the driver, a priority may be set, with the driver seat's association to the center screen given higher priority than the passenger seat's. This guarantees that the driver's use takes precedence and helps ensure driving safety.
Step S240, determining the sight line characteristics of the human eyes.
In this embodiment, to distinguish how the human eyes control the vehicle-mounted screen, the sight-line features of the eyes are first classified, and a different control operation on the screen is assigned to each class of feature. This improves the accuracy of the passenger's control of the screen and makes operation easier.
In one embodiment, when no occupant is in the seat or the occupant's eyes are closed, the sight-line feature is "no sight line", so the sight-line features of the human eyes include no sight line. Because the control method relates sight-line features to the vehicle-mounted screen, the features further include the sight line staying within the three-dimensional coordinate range of the screen and the sight line not staying within that range, so that the screen can be conveniently controlled according to the sight-line features of the eyes.
In addition, the sight-line features of the human eyes may also include, for example, interruptions of the sight line caused by blinking, or the sight line blinking one or more times within the three-dimensional coordinate range of the vehicle-mounted screen.
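The sight-line features named above can be represented, purely for illustration, as a small enumeration (blink-based features could be added as further members):

```python
from enum import Enum, auto

class GazeFeature(Enum):
    """Sight-line features distinguished in step S240 (illustrative names)."""
    NO_GAZE = auto()               # no line of sight: empty seat or closed eyes
    OUTSIDE_SCREEN_RANGE = auto()  # sight line does not stay within the screen's range
    ON_SCREEN = auto()             # sight line stays within the screen's range
```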
And step S250, sending a specific control instruction to the vehicle-mounted screen controlled by the human eyes according to the determined sight characteristics of the human eyes.
In this embodiment, the specific control instruction refers to a control instruction corresponding to a sight line characteristic of human eyes, where different sight line characteristics of human eyes may correspond to different vehicle-mounted screen control instructions, so that a passenger can perform different control operations on a vehicle-mounted screen based on the sight line.
In one embodiment, the control instruction in the on-board screen control method is used for controlling the on-board screen, so the control instruction can comprise an opening instruction for controlling the on-board screen to be opened and can comprise a closing instruction for controlling the on-board screen to be closed.
In addition, the control instructions may also include instructions that control the vehicle-mounted screen while it is on, such as fast-forward, rewind, pause, volume adjustment and picture brightness adjustment instructions. The correspondence between control instructions and sight-line features can be combined in various ways, which enriches the operations that the human eyes can perform on the vehicle-mounted screen.
In one embodiment, because the vehicle-mounted screen does not need to be on when there is no sight line, a closing instruction is sent to the screen controlled by the human eyes when the sight-line feature is determined to be no sight line, saving vehicle battery power and reducing waste.
In one embodiment, when the passenger's line of sight remains on the vehicle-mounted screen, it is determined that the passenger intends to view it. The corresponding sight-line feature is therefore that the sight line stays within the three-dimensional coordinate range of the screen, and the corresponding control instruction is an opening instruction.
Because a passenger's glance may sweep across the vehicle-mounted screen even when the passenger does not intend to watch it, a preset threshold can be used to avoid sending control instructions by mistake: an instruction is sent to the screen according to the sight-line feature only when the time exceeds the preset threshold, which ensures that the passenger is controlling the screen deliberately. For example, the preset threshold may be set to 3 s; when the passenger's sight line has stayed on the screen controlled by the passenger's eyes for more than 3 s, an opening instruction is sent to that screen.
In this embodiment, the screen is controlled promptly and accurately according to the passenger's sight line by determining that the sight line stays within the three-dimensional coordinate range of the vehicle-mounted screen, determining that the dwell time exceeds the predetermined threshold, and then sending an opening instruction to the screen controlled by the human eyes.
In one embodiment, when the passenger's line of sight does not stay on the vehicle-mounted screen, it is determined that the passenger intends to stop viewing it. The corresponding sight-line feature is therefore that the sight line does not stay within the three-dimensional coordinate range of the screen, and the corresponding control instruction is a closing instruction.
Because a passenger's sight line may also leave the vehicle-mounted screen briefly while watching it, a preset threshold can likewise be used to avoid sending control instructions by mistake: an instruction is sent to the screen according to the sight-line feature only when the time exceeds the preset threshold, which ensures that the passenger is controlling the screen deliberately. For example, the preset threshold may be set to 3 s; when the passenger's sight line has stayed off the screen controlled by the passenger's eyes for more than 3 s, a closing instruction is sent to that screen.
In this embodiment, the screen is likewise controlled promptly and accurately by determining that the sight line does not stay within the three-dimensional coordinate range of the vehicle-mounted screen, determining that the time for which it stays away exceeds the predetermined threshold, and then sending a closing instruction to the screen controlled by the human eyes.
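The opening and closing behaviour with the preset threshold can be sketched as a small dwell-time controller. This is an illustrative assumption: the screen object is assumed to expose `open()`/`close()` commands, the `gaze_on_screen` flag can be produced, for example, by the ray test sketched after the next paragraph, and the no-sight-line case is folded into `gaze_on_screen=False` here.

```python
import time

class DwellController:
    """Send an opening command only after the gaze has stayed on the screen
    longer than threshold_s, and a closing command only after it has stayed
    off the screen (or been absent) longer than threshold_s (3 s in the
    example above)."""

    def __init__(self, screen, threshold_s: float = 3.0):
        self.screen = screen              # assumed to expose open() / close()
        self.threshold_s = threshold_s
        self._gazing = False
        self._state_since = time.monotonic()
        self._command_sent = False

    def update(self, gaze_on_screen: bool) -> None:
        now = time.monotonic()
        if gaze_on_screen != self._gazing:
            self._gazing = gaze_on_screen
            self._state_since = now       # the gaze state changed: restart the timer
            self._command_sent = False
            return
        if self._command_sent or now - self._state_since < self.threshold_s:
            return                        # already acted, or not persisted long enough
        if self._gazing:
            self.screen.open()            # dwell on the screen exceeded the threshold
        else:
            self.screen.close()           # time away from the screen exceeded the threshold
        self._command_sent = True
```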
In the above embodiments, whether the sight line stays within the three-dimensional coordinate range of the vehicle-mounted screen may be determined by first obtaining the gaze direction through an existing sight-line detection technique, for example gaze-direction detection based on video image processing, and then checking whether the gaze ray intersects the vehicle-mounted screen, that is, whether the ray passes through the screen's three-dimensional coordinate range.
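Whether the gaze ray passes through a screen's coordinate range can be checked with a standard ray/axis-aligned-box ("slab") test, reusing the `Point3D` and `Box3D` helpers from the earlier sketches; this is a generic geometric routine, not taken from the patent.

```python
def gaze_hits_screen(eye: Point3D, direction: Point3D, screen_box: Box3D) -> bool:
    """Return True when the ray from `eye` along `direction` (non-zero, not
    necessarily normalised) intersects the screen's coordinate range."""
    t_min, t_max = 0.0, float("inf")
    for axis in ("x", "y", "z"):
        origin = getattr(eye, axis)
        d = getattr(direction, axis)
        lo = getattr(screen_box.min_corner, axis)
        hi = getattr(screen_box.max_corner, axis)
        if abs(d) < 1e-9:
            if origin < lo or origin > hi:
                return False              # ray parallel to this slab and outside it
        else:
            t1, t2 = (lo - origin) / d, (hi - origin) / d
            t_min = max(t_min, min(t1, t2))
            t_max = min(t_max, max(t1, t2))
            if t_min > t_max:
                return False              # the slabs do not overlap along the ray
    return True
```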
Through the disclosed embodiments, a screen can be controlled only by the passenger in the seat associated with that screen, which improves the accuracy of sight-line control of vehicle-mounted screens. Deciding whether to turn a screen on based on whether the passenger is watching it saves battery power and reduces battery drain. At the same time, the light stimulation and discomfort caused by a screen that stays on in dim light while the passenger is not watching it can be avoided, improving the riding experience.
< example >
Fig. 3 illustrates an example of on-board screen control in the implementation of the method, using a vehicle 300 as an example.
The vehicle 300 in this example includes four seats, c1, c2, c3, c4; the vehicle 300 comprises four vehicle-mounted screens, namely p1, p2, p3 and p4, wherein the p4 screen is in an open state, and the p1, p2 and p3 screens are in a closed state.
The seats in vehicle 300 each have a passenger seated therein, e1, e2, e3, e4, respectively, wherein the e1 passenger is in a driving position, looking forward, not looking at any on-board screen; e2 passenger is in eye-closed state, not looking at the screen; e3 passenger is looking at p2 screen; the e4 passenger is looking at the p4 screen.
In this example, three-dimensional coordinates of human eyes in the vehicle are obtained first; the in-vehicle seat associated with the three-dimensional coordinates of the human eyes is determined; and the vehicle-mounted screen controlled by the human eyes is determined according to the seat and the association relationship between the seat and the vehicle-mounted screen.
The three-dimensional coordinate range of each seat in the vehicle and the three-dimensional coordinate range of each vehicle-mounted screen are acquired, and the association relationship between seats and screens is established according to the preset rule and these coordinate ranges. In this example the association is: seat c1 is associated with screen p1; seat c2 with screens p1 and p2; seat c3 with screen p3; and seat c4 with screen p4.
It is determined that the sight-line feature of passenger e1 in seat c1 is that the sight line does not stay within the three-dimensional coordinate range of vehicle-mounted screen p1, and that the length of time for which it does not stay there exceeds the predetermined threshold. A closing instruction is therefore sent to the vehicle-mounted screen p1 associated with seat c1 according to the sight-line feature of passenger e1.
It is determined that the sight-line feature of passenger e2 in seat c2 is no sight line, and closing instructions are sent to the vehicle-mounted screens p1 and p2 associated with seat c2 according to the sight-line feature of passenger e2.
It is determined that the sight line of passenger e3 in seat c3 does not stay within the three-dimensional coordinate range of vehicle-mounted screen p3, and that the time for which it does not stay there exceeds the predetermined threshold, so a closing instruction is sent to the vehicle-mounted screen p3 associated with seat c3 according to the sight-line feature of passenger e3. Although the sight line of passenger e3 stays within the three-dimensional coordinate range of vehicle-mounted screen p2, seat c3 has no association with screen p2, so the sight-line feature of passenger e3 in seat c3 cannot turn on screen p2.
It is determined that the sight line of passenger e4 in seat c4 stays within the three-dimensional coordinate range of vehicle-mounted screen p4, and that the dwell time exceeds the predetermined threshold. An opening instruction is therefore sent to the vehicle-mounted screen p4 associated with seat c4 according to the sight-line feature of passenger e4.
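For illustration, the decisions of this example can be walked through with the helpers sketched earlier (dwell thresholds omitted for brevity); the association table is hard-coded exactly as stated above.

```python
association = {
    "c1": ["p1"],
    "c2": ["p1", "p2"],
    "c3": ["p3"],
    "c4": ["p4"],
}

# Sight-line features per seat, as determined in the example; e3 looks at p2,
# but p2 is not associated with seat c3, so that gaze cannot open p2.
features = {
    "c1": GazeFeature.OUTSIDE_SCREEN_RANGE,  # e1 looks at the road ahead
    "c2": GazeFeature.NO_GAZE,               # e2 has closed eyes
    "c3": GazeFeature.OUTSIDE_SCREEN_RANGE,  # e3 looks at p2, not at p3
    "c4": GazeFeature.ON_SCREEN,             # e4 looks at p4
}

for seat, screens in association.items():
    command = "open" if features[seat] is GazeFeature.ON_SCREEN else "close"
    for screen in screens:
        print(f"seat {seat}: send {command} instruction to screen {screen}")
# Output: close p1 (for c1 and again for c2), close p2, close p3, open p4.
```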
< apparatus embodiment >
Fig. 4 shows an in-vehicle screen control apparatus 400, which apparatus 400 may include a coordinate acquisition module 410, a seat determination module 420, a screen determination module 430, a feature determination module 440, and a communication module 450.
The coordinate obtaining module 410 is configured to obtain three-dimensional coordinates of human eyes in the vehicle.
The seat determination module 420 is configured to determine an in-vehicle seat associated with the three-dimensional coordinates of the human eye.
The screen determining module 430 is configured to determine the vehicle-mounted screen controlled by the human eye according to the seat and the association relationship between the seat and the vehicle-mounted screen.
The feature determination module 440 is configured to determine a gaze feature of the human eye.
The communication module 450 is configured to send a specific control instruction to the vehicle-mounted screen controlled by the human eyes according to the determined sight characteristics of the human eyes.
In one embodiment, the car screen control device 400 further comprises an establishing module for establishing the relationship between the seat and the car screen. The establishing module is used for acquiring the three-dimensional coordinate range of each seat and the three-dimensional coordinate range of each vehicle-mounted screen in the vehicle and establishing the association relationship between the seat and the vehicle-mounted screen according to a preset rule and the three-dimensional coordinate range of each seat and the three-dimensional coordinate range of each vehicle-mounted screen in the vehicle.
In one embodiment, the on-board screen control device 400 further comprises a coordinate adjusting module, wherein the coordinate adjusting module is used for adjusting the three-dimensional coordinate range of each seat and the three-dimensional coordinate range of each on-board screen in real time according to the adjustment of the seat position.
In one embodiment, in the in-vehicle screen control apparatus 400, in the seat-in-vehicle screen association relationship: each seat is associated with at least one on-board screen.
In one embodiment, the sight-line features of the human eyes comprise: no sight line; the sight line not staying within the three-dimensional coordinate range of the vehicle-mounted screen; and the sight line staying within the three-dimensional coordinate range of the vehicle-mounted screen.
In one embodiment, the control instructions include an open instruction and a close instruction.
In one embodiment, the feature determination module 440 is further configured to determine that there is no line of sight for the human eye; the communication module 450 is used for sending a closing instruction to the vehicle-mounted screen controlled by the human eyes.
In one embodiment, the feature determination module 440 is further configured to determine that the line of sight of the human eyes stays within the three-dimensional coordinate range of the on-board screen and that the length of time for which it stays there exceeds a preset threshold; the communication module 450 is then used to send an opening instruction to the vehicle-mounted screen controlled by the human eyes.
In one embodiment, the feature determination module 440 is further configured to determine that the line of sight of the human eyes does not stay within the three-dimensional coordinate range of the on-board screen and that the length of time for which it does not stay there exceeds a preset threshold; the communication module 450 is then used to send a closing instruction to the vehicle-mounted screen controlled by the human eyes.
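Purely as an illustration of how the five core modules might be composed, the following sketch assumes each module is an object exposing a single method; none of the method names come from the patent.

```python
class OnBoardScreenControlDevice:
    """Minimal composition of the modules of apparatus 400 (illustrative)."""

    def __init__(self, coord_module, seat_module, screen_module,
                 feature_module, comm_module):
        self.coord_module = coord_module
        self.seat_module = seat_module
        self.screen_module = screen_module
        self.feature_module = feature_module
        self.comm_module = comm_module

    def run_once(self) -> None:
        eyes = self.coord_module.acquire()                        # step S210
        for eye in eyes:
            seat = self.seat_module.determine(eye)                # step S220
            if seat is None:
                continue
            screens = self.screen_module.lookup(seat)             # step S230
            feature = self.feature_module.classify(eye, screens)  # step S240
            self.comm_module.send(screens, feature)               # step S250
```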
< vehicle embodiment >
FIG. 5 is a schematic illustration of a vehicle according to one embodiment. In the present embodiment, a vehicle 500 is provided. The vehicle 500 may be the vehicle 1000 shown in fig. 1, or may be a vehicle of another structure, which is not limited herein.
As shown in fig. 5, the vehicle 500 may comprise, for example, the on-board screen control device 400 of fig. 4, the on-board screen control device 400 performing the on-board screen control method according to any of the method embodiments.
In another embodiment, the vehicle 500 may also include a camera, an on-board screen, a memory storing a computer program, and a processor for executing the on-board screen control method according to any of the method embodiments under the control of the computer program.
It will be appreciated by those skilled in the art that the coordinate acquisition module 410, the seat determination module 420, the screen determination module 430, the feature determination module 440, and the communication module 450 may be implemented in a variety of ways. For example, the coordinate acquisition module 410 and/or the seat determination module 420 and/or the screen determination module 430 and/or the feature determination module 440 and/or the communication module 450 may be implemented by an instruction configuration processor. For example, instructions may be stored in ROM and read from ROM into a programmable device to implement the coordinate acquisition module 410 and/or the seat determination module 420 and/or the screen determination module 430 and/or the feature determination module 440 and/or the communication module 450 when the device is started. For example, the coordinate acquisition module 410 and/or the seat determination module 420 and/or the screen determination module 430 and/or the feature determination module 440 and/or the communication module 450 may be incorporated into a dedicated device (e.g., an ASIC). The coordinate acquisition module 410 and/or the seat determination module 420 and/or the screen determination module 430 and/or the feature determination module 440 and/or the communication module 450 may be separated into separate units or may be implemented by being combined together. The coordinate acquisition module 410 and/or the seat determination module 420 and/or the screen determination module 430 and/or the feature determination module 440 and/or the communication module 450 may be implemented by one of the various implementations described above, or may be implemented by a combination of two or more of the various implementations described above.
The present invention may be a system, method and/or computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied therewith for causing a processor to implement various aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, by software, and by a combination of software and hardware are equivalent.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terms used herein were chosen in order to best explain the principles of the embodiments, the practical application, or technical improvements to the techniques in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.

Claims (11)

1. A vehicle-mounted screen control method, characterized by comprising the following steps:
acquiring three-dimensional coordinates of human eyes in the vehicle;
determining an in-vehicle seat associated with the three-dimensional coordinates of the human eyes;
determining the vehicle-mounted screen controlled by the human eyes according to the seat and the association relationship between the seat and the vehicle-mounted screen;
determining the sight line characteristics of the human eyes; and
sending a specific control instruction to the vehicle-mounted screen controlled by the human eyes according to the determined sight line characteristics of the human eyes.
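Purely as an illustration of the flow recited in claim 1 (acquiring eye coordinates, mapping them to a seat, looking up the associated screen, and acting on the sight-line characteristic), the following Python sketch shows one possible arrangement; the coordinate values, seat and screen identifiers, and the source of the gaze data are hypothetical assumptions and form no part of the claims.

```python
# Minimal sketch of the claim 1 flow; all coordinate values, identifiers and
# the source of the gaze data are hypothetical assumptions.

def in_range(point, box):
    """True if a 3D point lies inside an axis-aligned ((min_xyz), (max_xyz)) box."""
    lo, hi = box
    return all(l <= p <= h for p, l, h in zip(point, lo, hi))

# Assumed seat coordinate ranges (vehicle frame) and seat -> screen association.
SEAT_BOXES = {
    "rear_left":  ((-0.9, 1.0, 0.7), (-0.3, 1.8, 1.4)),
    "rear_right": (( 0.3, 1.0, 0.7), ( 0.9, 1.8, 1.4)),
}
SEAT_TO_SCREEN = {"rear_left": "screen_rear_left", "rear_right": "screen_rear_right"}

def control_screen(eye_xyz, sight_feature, send_command):
    """Eye coordinates -> seat -> associated screen -> control instruction."""
    seat = next((s for s, box in SEAT_BOXES.items() if in_range(eye_xyz, box)), None)
    if seat is None:
        return  # the eyes are not within any seat's coordinate range
    screen = SEAT_TO_SCREEN[seat]
    if sight_feature == "gaze_on_screen":
        send_command(screen, "open")
    elif sight_feature in ("no_gaze", "gaze_off_screen"):
        send_command(screen, "close")
```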
2. The method of claim 1, further comprising the step of establishing the association relationship between the seat and the vehicle-mounted screen:
acquiring a three-dimensional coordinate range of each seat in the vehicle and a three-dimensional coordinate range of each vehicle-mounted screen; and
establishing the association relationship between the seats and the vehicle-mounted screens according to a preset rule, the three-dimensional coordinate range of each seat in the vehicle and the three-dimensional coordinate range of each vehicle-mounted screen.
3. The method of claim 2, further comprising:
adjusting, in real time, the three-dimensional coordinate range of each seat and the three-dimensional coordinate range of each vehicle-mounted screen according to the adjustment of the seat position.
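As a non-limiting sketch of how the association of claim 2 might be established and how the real-time adjustment of claim 3 might be realized, the fragment below associates each seat with the screen whose coordinate range is nearest and translates a seat's range when the seat position changes; the nearest-screen rule and all helper names are assumptions rather than the claimed preset rule.

```python
# Hypothetical "preset rule": associate each seat with the screen whose
# coordinate-range centre is closest to the seat's coordinate-range centre.

def centre(box):
    lo, hi = box
    return tuple((l + h) / 2.0 for l, h in zip(lo, hi))

def build_association(seat_boxes, screen_boxes):
    """Return a seat -> screen mapping derived from the two sets of coordinate ranges."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return {
        seat: min(screen_boxes,
                  key=lambda scr: dist2(centre(box), centre(screen_boxes[scr])))
        for seat, box in seat_boxes.items()
    }

def shift_box(box, delta):
    """Claim-3-style adjustment: translate a coordinate range when the seat moves by `delta`."""
    lo, hi = box
    return (tuple(l + d for l, d in zip(lo, delta)),
            tuple(h + d for h, d in zip(hi, delta)))
```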
4. The method of claim 1, wherein, in the association relationship between the seat and the vehicle-mounted screen, each seat is associated with at least one vehicle-mounted screen.
5. The method of claim 2, wherein the sight line characteristics of the human eyes comprise:
the human eyes having no line of sight, the line of sight not staying within the three-dimensional coordinate range of the vehicle-mounted screen, and the line of sight staying within the three-dimensional coordinate range of the vehicle-mounted screen.
6. The method of claim 1, wherein the control instruction comprises:
an opening instruction and a closing instruction.
7. The method of claim 1, wherein determining the sight line characteristics of the human eyes and sending a specific control instruction to the vehicle-mounted screen controlled by the human eyes according to the determined sight line characteristics of the human eyes comprise:
determining that there is no line of sight of the human eyes; and
sending a closing instruction to the vehicle-mounted screen controlled by the human eyes.
8. The method of claim 2, wherein determining the sight line characteristics of the human eyes and sending a specific control instruction to the vehicle-mounted screen controlled by the human eyes according to the determined sight line characteristics of the human eyes comprise:
determining that the line of sight of the human eyes stays within the three-dimensional coordinate range of the vehicle-mounted screen;
determining that the duration for which the line of sight of the human eyes stays within the three-dimensional coordinate range of the vehicle-mounted screen exceeds a preset threshold; and
sending an opening instruction to the vehicle-mounted screen controlled by the human eyes.
9. The method of claim 2, wherein determining the sight line characteristics of the human eyes and sending a specific control instruction to the vehicle-mounted screen controlled by the human eyes according to the determined sight line characteristics of the human eyes comprise:
determining that the line of sight of the human eyes does not stay within the three-dimensional coordinate range of the vehicle-mounted screen;
determining that the duration for which the line of sight of the human eyes does not stay within the three-dimensional coordinate range of the vehicle-mounted screen exceeds a preset threshold; and
sending a closing instruction to the vehicle-mounted screen controlled by the human eyes.
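The decision logic of claims 7-9 could be sketched as follows, with an immediate closing instruction when no line of sight is detected and dwell-time thresholds for opening and closing; the threshold values, command strings, and timing source are illustrative assumptions, since the claims only require "a preset threshold".

```python
import time

OPEN_DWELL_S = 1.0    # assumed thresholds; the claims only require "a preset threshold"
CLOSE_DWELL_S = 5.0

class DwellController:
    """Turns a stream of sight-line features into opening/closing instructions."""

    def __init__(self, send_command, screen_id):
        self.send_command = send_command
        self.screen_id = screen_id
        self.last_feature = None
        self.state_since = time.monotonic()

    def update(self, feature):
        now = time.monotonic()
        if feature != self.last_feature:      # feature changed: restart the dwell timer
            self.last_feature = feature
            self.state_since = now
            return
        dwell = now - self.state_since
        if feature == "no_gaze":                                       # claim 7: close
            self.send_command(self.screen_id, "close")
        elif feature == "gaze_on_screen" and dwell >= OPEN_DWELL_S:    # claim 8: open
            self.send_command(self.screen_id, "open")
        elif feature == "gaze_off_screen" and dwell >= CLOSE_DWELL_S:  # claim 9: close
            self.send_command(self.screen_id, "close")
```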
10. A vehicle-mounted screen control device, characterized by comprising:
the coordinate acquisition module is used for acquiring three-dimensional coordinates of human eyes in the vehicle;
the seat determining module is used for determining an in-vehicle seat associated with the three-dimensional coordinates of the human eyes;
the screen determining module is used for determining the vehicle-mounted screen controlled by the human eyes according to the seat and the association relationship between the seat and the vehicle-mounted screen;
the characteristic determining module is used for determining the sight characteristics of the human eyes; and
the communication module is used for sending a specific control instruction to the vehicle-mounted screen controlled by the human eyes according to the determined sight line characteristics of the human eyes.
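One possible decomposition mirroring the four modules of claim 10 is sketched below; the injected callables standing in for a camera-based eye tracker, the gaze classifier, and a vehicle command bus are hypothetical placeholders rather than the claimed implementation.

```python
# A possible decomposition mirroring the four modules of claim 10; the injected
# callables stand in for a camera-based eye tracker and a vehicle command bus.

class ScreenControlDevice:
    def __init__(self, sample_eyes, locate_seat, screen_for_seat, classify_sight, send_command):
        self.sample_eyes = sample_eyes          # coordinate acquisition module
        self.locate_seat = locate_seat          # seat determination module
        self.screen_for_seat = screen_for_seat  # screen determination module
        self.classify_sight = classify_sight    # sight-line feature determination module
        self.send_command = send_command        # communication module

    def step(self):
        eye_xyz, gaze_point = self.sample_eyes()
        seat = self.locate_seat(eye_xyz)
        if seat is None:
            return
        screen = self.screen_for_seat(seat)
        feature = self.classify_sight(gaze_point, screen)
        if feature == "gaze_on_screen":
            self.send_command(screen, "open")
        elif feature in ("no_gaze", "gaze_off_screen"):
            self.send_command(screen, "close")
```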
11. A vehicle, comprising the vehicle-mounted screen control device of claim 10, or comprising a camera, a vehicle-mounted screen, a memory and a processor, the memory storing computer instructions which, when executed by the processor, implement the vehicle-mounted screen control method of any one of claims 1-9.
CN202110078963.3A 2021-01-21 2021-01-21 Control method and device of vehicle-mounted screen and vehicle Withdrawn CN114882579A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110078963.3A CN114882579A (en) 2021-01-21 2021-01-21 Control method and device of vehicle-mounted screen and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110078963.3A CN114882579A (en) 2021-01-21 2021-01-21 Control method and device of vehicle-mounted screen and vehicle

Publications (1)

Publication Number Publication Date
CN114882579A true CN114882579A (en) 2022-08-09

Family

ID=82667541

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110078963.3A Withdrawn CN114882579A (en) 2021-01-21 2021-01-21 Control method and device of vehicle-mounted screen and vehicle

Country Status (1)

Country Link
CN (1) CN114882579A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024032131A1 (en) * 2022-08-12 2024-02-15 中兴通讯股份有限公司 Screen control method, vehicle-mounted device, computer storage medium, and vehicle
CN118358487A (en) * 2024-04-29 2024-07-19 镁佳(武汉)科技有限公司 Vehicle central control system, central control screen connection method, device and equipment
CN118665283A (en) * 2024-08-21 2024-09-20 比亚迪股份有限公司 Control method and system of vehicle, storage medium, program product and vehicle

Similar Documents

Publication Publication Date Title
US11417121B1 (en) Apparatus, systems and methods for classifying digital images
KR102204250B1 (en) Method for calculating an augmented reality-overlaying for displaying a navigation route on ar-display unit, device for carrying out the method, as well as motor vehicle and computer program
US20200293041A1 (en) Method and system for executing a composite behavior policy for an autonomous vehicle
CN114882579A (en) Control method and device of vehicle-mounted screen and vehicle
US20180208209A1 (en) Comfort profiles
US20200018976A1 (en) Passenger heads-up displays for vehicles
US11054818B2 (en) Vehicle control arbitration
KR20190076731A (en) Method for Outputting Contents via Checking Passenger Terminal and Distraction
US11285974B2 (en) Vehicle control system and vehicle
US10562539B2 (en) Systems and methods for control of vehicle functions via driver and passenger HUDs
US20160023602A1 (en) System and method for controling the operation of a wearable computing device based on one or more transmission modes of a vehicle
KR20180026316A (en) System and method for vehicular and mobile communication device connectivity
US20200394923A1 (en) Vehicle to vehicle navigation syncing system
CN114148341A (en) Control device and method for vehicle and vehicle
CN112977460A (en) Method and apparatus for preventing motion sickness when viewing image content in a moving vehicle
US20220277556A1 (en) Information processing device, information processing method, and program
WO2024000391A1 (en) Control method and device, and vehicle
CN114148342A (en) Automatic driving judgment system, automatic driving control system and vehicle
US11491993B2 (en) Information processing system, program, and control method
US20240217338A1 (en) Dynamically displaying driver vehicle information for vehicles
US12097890B2 (en) Middleware software layer for vehicle autonomy subsystems
US11853232B2 (en) Device, method and computer program
US20240025437A1 (en) Driver assistance system for vehicle
US20240118098A1 (en) High-definition energy consumption map for vehicles
US20240239265A1 (en) Rear display enhancements

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20220809