CN117055734A - Control method and device of vehicle-mounted screen, computer equipment and storage medium - Google Patents


Info

Publication number
CN117055734A
CN117055734A
Authority
CN
China
Prior art keywords
vehicle
eye movement
glasses
movement data
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311057078.2A
Other languages
Chinese (zh)
Inventor
张高然
周骏
樊文昕
杨大成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Geely Holding Group Co Ltd
Zhejiang Zeekr Intelligent Technology Co Ltd
Original Assignee
Zhejiang Geely Holding Group Co Ltd
Zhejiang Zeekr Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Geely Holding Group Co Ltd, Zhejiang Zeekr Intelligent Technology Co Ltd filed Critical Zhejiang Geely Holding Group Co Ltd
Priority to CN202311057078.2A
Publication of CN117055734A
Legal status: Pending


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 - Eye tracking input arrangements
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 - Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 - Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B60R11/0229 - Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof for displays, e.g. cathodic tubes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 - Arrangements for holding or mounting articles, not otherwise provided for
    • B60R2011/0042 - Arrangements for holding or mounting articles, not otherwise provided for, characterised by mounting means
    • B60R2011/008 - Adjustable or movable supports
    • B60R2011/0092 - Adjustable or movable supports with motorization

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to a control method and device for a vehicle-mounted screen, a computer device, and a storage medium. The method comprises the following steps: receiving a first interaction instruction sent by AR (augmented reality) glasses, and extracting first eye movement data of the wearer from the first interaction instruction; determining a gaze target from the first eye movement data, judging whether the gaze target is directly facing one of the target areas of the vehicle-mounted screen, and outputting a judgment result; when the judgment result indicates that the gaze target is directly facing one of the target areas, receiving a second interaction instruction sent by the AR glasses, and extracting second eye movement data of the wearer from the second interaction instruction; and when the second eye movement data matches a preset eye movement pattern, controlling a driving motor to start so as to expand or contract the vehicle-mounted screen. With this method, an occupant wearing the AR glasses can control the vehicle-mounted screen to expand or contract, meeting the occupant's size requirements for the screen.

Description

Control method and device of vehicle-mounted screen, computer equipment and storage medium
Technical Field
The present application relates to the field of automotive technologies, and in particular, to a method and apparatus for controlling a vehicle-mounted screen, a computer device, and a storage medium.
Background
With the development and popularization of intelligent automobiles, vehicle-mounted screens play an increasingly important role in the in-vehicle scene, and their sizes and numbers keep growing. When a vehicle is equipped with a single vehicle-mounted screen that the driver does not use while driving, other passengers in the vehicle cannot use it effectively, and passengers in different seat positions cannot see it well. How to adjust the size of the vehicle-mounted screen to meet the needs of passengers at various positions in the vehicle is therefore a problem to be solved.
Disclosure of Invention
Based on this, it is necessary to provide a control method, apparatus, computer device, and storage medium for a vehicle-mounted screen that can adjust the screen size through AR glasses worn by a passenger, so as to meet the passenger's size requirements for the vehicle-mounted screen.
In a first aspect, the present application provides a control method for a vehicle-mounted screen, applied to a vehicle connected to AR glasses and to a driving motor, the driving motor being used to adjust the size of the vehicle-mounted screen. The method comprises:
receiving a first interaction instruction sent by the AR (augmented reality) glasses, and extracting first eye movement data of the wearer from the first interaction instruction;
determining a gaze target from the first eye movement data, judging whether the gaze target is directly facing one of the target areas of the vehicle-mounted screen, and outputting a judgment result;
when the judgment result indicates that the gaze target is directly facing one of the target areas, receiving a second interaction instruction sent by the AR glasses, and extracting second eye movement data of the wearer from the second interaction instruction;
and when the second eye movement data matches a preset eye movement pattern, controlling the driving motor to start so as to expand or contract the vehicle-mounted screen.
In one embodiment, the step of determining the gaze target from the first eye movement data comprises:
determining the gaze target corresponding to the first eye movement data using an eye-tracking technique;
determining the seat position of the current wearer according to the gaze target;
and, when the seat position is the driving position, deciding whether to execute the pre-configured process according to the current running state of the vehicle; when it is not the driving position, executing the pre-configured process directly.
In one embodiment, when the seat position is the driving position, the step of deciding whether to execute the pre-configured process according to the current running state of the vehicle comprises:
judging whether the seat position is the driving position, and outputting a judgment result;
if the judgment result indicates that the seat position is the driving position, acquiring current state data of the vehicle;
if the state data indicates that the vehicle is in a driving state, interrupting interaction with the AR glasses;
and if the state data indicates that the vehicle is in a parked state, extracting the gaze target from the first eye movement data.
In one embodiment, the step of determining the gaze target from the first eye movement data, judging whether the gaze target is directly facing one of the target areas of the vehicle-mounted screen, and outputting the judgment result comprises:
dividing the vehicle-mounted screen into a plurality of target areas in advance, different target areas being matched to different screen adjustment positions;
matching the currently selected area fed back by the AR glasses against each pre-divided target area; if the currently selected area matches one of the target areas, this indicates that the current gaze target is directly facing one of the target areas of the vehicle-mounted screen;
and, before the AR glasses feed back the currently selected area, moving a virtual cursor of the AR glasses within the field of view using the eye-tracking technique, and marking the cursor's movement area in real time to generate the first eye movement data.
In one embodiment, when the judgment result indicates that the gaze target is directly facing one of the target areas, the step of receiving the second interaction instruction sent by the AR glasses and extracting the second eye movement data of the wearer from the second interaction instruction comprises:
in response to the gaze target directly facing one of the target areas, feeding back a selection-confirmation instruction to the AR glasses, so that the wearer performs a preset eye movement according to the selection-confirmation instruction, and the AR glasses output the second interaction instruction after generating the second eye movement data from that eye movement;
and receiving the second interaction instruction sent by the AR glasses, and extracting the second eye movement data from the second interaction instruction.
In one embodiment, when the second eye movement data matches the preset eye movement pattern, the step of controlling the driving motor to start so as to expand or contract the vehicle-mounted screen comprises:
when the gaze target is directly facing one of the target areas, determining the screen adjustment position of the vehicle-mounted screen;
and when the second eye movement data matches the preset eye movement pattern, generating a start signal and, after controlling the driving motor to start, controlling the vehicle-mounted screen to expand or contract.
In one embodiment, when the judgment result indicates that the gaze target is directly facing one of the target areas, the step of receiving the second interaction instruction sent by the AR glasses and extracting the second eye movement data of the wearer further comprises:
in response to the gaze target directly facing one of the target areas, feeding back the expandable range of the vehicle-mounted screen to the AR glasses, so that the wearer adjusts the gaze target within the expandable range and then performs the preset eye movement, whereupon the AR glasses generate the second eye movement data from that eye movement.
In a second aspect, the present application provides a control apparatus for a vehicle-mounted screen, the apparatus comprising a first extraction module, a gaze judgment module, a second extraction module, and a screen adjustment module, wherein:
the first extraction module is configured to receive a first interaction instruction sent by the AR glasses and extract first eye movement data of the wearer from the first interaction instruction;
the gaze judgment module is configured to determine a gaze target from the first eye movement data, judge whether the gaze target is directly facing one of the target areas of the vehicle-mounted screen, and output a judgment result;
the second extraction module is configured to receive, when the judgment result indicates that the gaze target is directly facing one of the target areas, a second interaction instruction sent by the AR glasses, and extract second eye movement data of the wearer from the second interaction instruction;
and the screen adjustment module is configured to control the driving motor to start when the second eye movement data matches a preset eye movement pattern, so as to expand or contract the vehicle-mounted screen.
In a third aspect, the present application provides a computer device comprising a memory storing a computer program and a processor that, when executing the computer program, implements the following steps:
receiving a first interaction instruction sent by the AR (augmented reality) glasses, and extracting first eye movement data of the wearer from the first interaction instruction;
determining a gaze target from the first eye movement data, judging whether the gaze target is directly facing one of the target areas of the vehicle-mounted screen, and outputting a judgment result;
when the judgment result indicates that the gaze target is directly facing one of the target areas, receiving a second interaction instruction sent by the AR glasses, and extracting second eye movement data of the wearer from the second interaction instruction;
and when the second eye movement data matches a preset eye movement pattern, controlling the driving motor to start so as to expand or contract the vehicle-mounted screen.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the following steps:
receiving a first interaction instruction sent by the AR (augmented reality) glasses, and extracting first eye movement data of the wearer from the first interaction instruction;
determining a gaze target from the first eye movement data, judging whether the gaze target is directly facing one of the target areas of the vehicle-mounted screen, and outputting a judgment result;
when the judgment result indicates that the gaze target is directly facing one of the target areas, receiving a second interaction instruction sent by the AR glasses, and extracting second eye movement data of the wearer from the second interaction instruction;
and when the second eye movement data matches a preset eye movement pattern, controlling the driving motor to start so as to expand or contract the vehicle-mounted screen.
The control method, apparatus, computer device, and storage medium for the vehicle-mounted screen described above have the following technical effects:
A first interaction instruction sent by the AR glasses is received, and first eye movement data of the wearer is extracted from it; a gaze target is determined from the first eye movement data, it is judged whether the gaze target is directly facing one of the target areas of the vehicle-mounted screen, and a judgment result is output; when the judgment result indicates that the gaze target is directly facing one of the target areas, a second interaction instruction sent by the AR glasses is received, and second eye movement data of the wearer is extracted from it; and when the second eye movement data matches the preset eye movement pattern, the driving motor is controlled to start so as to expand or contract the vehicle-mounted screen. In this application, the AR glasses collect the wearer's first eye movement data and send it to the vehicle for real-time monitoring; the expansion of the vehicle-mounted screen depends on the first and second eye movement data from the AR glasses, so that an occupant wearing the AR glasses can control the screen to expand or contract, meeting the occupant's size requirements for the vehicle-mounted screen.
Drawings
FIG. 1 is an application environment diagram of a control method of an in-vehicle screen in one embodiment;
FIG. 2 is a flow chart of a method for controlling an on-vehicle screen in one embodiment;
FIG. 3 is a schematic view of a position direction of each seat position relative to a vehicle screen in one embodiment;
FIG. 4 is a schematic diagram of a target area location in a vehicle screen according to one embodiment;
FIG. 5 is a schematic view of an installation scenario of an on-board screen in one embodiment;
FIG. 6 is a schematic diagram of an extended status change of an on-board screen in one embodiment;
FIG. 7 is a block diagram showing a control apparatus of an in-vehicle screen in one embodiment;
fig. 8 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The control method of the vehicle-mounted screen provided by the application can be applied to the application environment shown in fig. 1. Any occupant in the car wears the AR glasses 20; once started, the AR glasses 20 are communicatively connected to the head unit 10, the head unit 10 is connected to the driving motor 30, and the driving motor 30 is used to adjust the size of the vehicle-mounted screen 40. The AR glasses 20 generate eye movement data of the wearer using an eye-tracking technique, and thereby generate interaction instructions. In this embodiment, the AR glasses 20 are used together with the head unit 10: an interaction instruction is generated from the eyes of an occupant in the vehicle and sent to the head unit 10, which controls the driving motor to start so as to control the vehicle-mounted screen 40 accordingly. Further, the vehicle is connected to a vehicle-mounted screen, which is an expandable screen. In this embodiment, the expansion or contraction of the vehicle-mounted screen is controlled by the eyes of the wearer of the AR glasses.
In one embodiment, as shown in fig. 2, a control method of a vehicle-mounted screen is provided. The method is described as applied to the vehicle in fig. 1, and comprises the following steps:
Step S101: receiving a first interaction instruction sent by the AR glasses, and extracting first eye movement data of the wearer from the first interaction instruction;
Step S102: determining a gaze target from the first eye movement data, judging whether the gaze target is directly facing one of the target areas of the vehicle-mounted screen, and outputting a judgment result;
Step S103: when the judgment result indicates that the gaze target is directly facing one of the target areas, receiving a second interaction instruction sent by the AR glasses, and extracting second eye movement data of the wearer from the second interaction instruction;
Step S104: when the second eye movement data matches a preset eye movement pattern, controlling the driving motor to start so as to expand or contract the vehicle-mounted screen.
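The four steps above can be sketched as a minimal control flow. The data shapes (`gaze_point`, `pattern`), the `TargetArea` and `DriveMotor` classes, and the pattern name `"dwell_1500ms"` are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class TargetArea:
    """A pre-divided region of the screen (assumed normalized coordinates)."""
    x0: float
    y0: float
    x1: float
    y1: float
    adjust_position: str = "right_edge"  # which screen edge this area adjusts

    def contains(self, point):
        x, y = point
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

class DriveMotor:
    """Stand-in for the screen's driving motor; records the start call."""
    def __init__(self):
        self.started_with = None

    def start(self, position):
        self.started_with = position

def handle_interaction(first_instruction, receive_second_instruction,
                       motor, target_areas, preset_pattern="dwell_1500ms"):
    # S101: extract first eye movement data from the first interaction instruction
    gaze = first_instruction["eye_movement_data"]["gaze_point"]
    # S102: judge whether the gaze target is directly facing a target area
    hit = next((a for a in target_areas if a.contains(gaze)), None)
    if hit is None:
        return "no_target"
    # S103: on a hit, receive the second instruction and extract its eye data
    second = receive_second_instruction()["eye_movement_data"]
    # S104: on the preset pattern, start the motor to expand/contract the screen
    if second["pattern"] == preset_pattern:
        motor.start(hit.adjust_position)
        return "adjusted"
    return "pattern_mismatch"
```

A non-matching gaze point short-circuits at S102, so the second instruction is only requested once a target area is actually selected, mirroring the two-stage interaction in the claims.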
In one embodiment, before the step of receiving the first interaction instruction sent by the AR glasses in step S101, the method comprises: in response to an occupant in the car wearing the AR glasses, communicatively connecting the head unit to the started AR glasses using a preset connection mode. For example, both the head unit and the AR glasses enable Bluetooth; the head unit searches for the Bluetooth-enabled AR glasses and connects to them via Bluetooth to establish the connection between the two. The head unit and the AR glasses in this embodiment are not limited to communicating via Bluetooth.
Once the head unit and the AR glasses are connected, the first interaction instruction sent by the AR glasses is received in real time, with the AR glasses serving as the instruction-triggering mechanism. When the first interaction instruction sent by the AR glasses meets certain requirements, the next interaction is performed, and after the pre-configured interaction flow is completed, the head-unit function corresponding to those requirements is executed.
The AR glasses in this embodiment receive the wearer's instruction operations using an eye-tracking technique. A typical approach tracks the Purkinje image, a bright spot on the outer surface of the cornea produced by corneal reflection of light entering the eye; because this reflection stays relatively fixed while the eyeball rotates, it can serve as a reference for computing the gaze direction.
In one embodiment, before the step of determining the gaze target from the first eye movement data, step S101 comprises: determining the gaze target corresponding to the first eye movement data using the eye-tracking technique; determining the seat position of the current wearer according to the gaze target; and, when the seat position is the driving position, deciding whether to execute the pre-configured process according to the current running state of the vehicle, or executing the pre-configured process directly when it is not the driving position.
Further, referring to fig. 3, the in-vehicle scene is a relatively fixed environment: the vehicle-mounted screen is fixedly mounted on the center console, and the direction from each seat position to the screen differs, so the line of sight with which a passenger in each seat watches the vehicle-mounted screen also differs.
When the wearer is the driver, operating the screen while the vehicle is moving would distract the driver and compromise safe driving. To ensure safe driving, to avoid screen adjustment interfering with driving, and to ensure that using the AR glasses does not affect vehicle travel, the interaction between the head unit and the AR glasses is therefore interrupted whenever the vehicle is in a driving state and the wearer of the AR glasses is the driver.
In this embodiment, the seat position of the current wearer is determined from the gaze target; seat positions include the driving position and non-driving positions, the latter comprising the front passenger seat and the rear seats. The wearer of the AR glasses is not limited to the driver: if a wearer in a non-driving position needs to use the vehicle-mounted screen, the head unit interacts with the AR glasses normally once it recognizes from the seat position that the current wearer is not the driver; conversely, when the head unit recognizes from the seat position that the current wearer is the driver and the vehicle is in a driving state, it interrupts interactive communication with the AR glasses.
Further, when the seat position is the driving position, deciding whether to execute the pre-configured process of the interaction instruction according to the current running state of the vehicle comprises: judging whether the seat position is the driving position, and outputting a judgment result; if the judgment result indicates the driving position, acquiring current state data of the vehicle; if the state data indicates that the vehicle is driving, interrupting interaction with the AR glasses; and if the state data indicates that the vehicle is parked, extracting the gaze target from the first eye movement data. Further, if the judgment result indicates a non-driving position, the head unit interacts with the AR glasses normally and extracts the gaze target from the interaction instruction according to the pre-configured flow.
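The safety gate described above reduces to a single predicate. The string encodings for seat position and vehicle state are assumptions for illustration:

```python
# Sketch of the driving-position safety gate; "driver", "driving", and
# "parked" are assumed encodings of seat position and vehicle state.
def should_continue_interaction(seat_position: str, vehicle_state: str) -> bool:
    """Return False (interrupt AR-glasses interaction) only when the wearer
    sits in the driving position while the vehicle is in a driving state."""
    if seat_position != "driver":
        return True                   # non-driver occupants interact normally
    return vehicle_state == "parked"  # the driver may interact only when parked
```

Any non-driver seat passes unconditionally, matching the embodiment in which only the combination of driving position and driving state interrupts the interaction.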
In one embodiment, the vehicle-mounted screen comprises a plurality of target areas, with different target areas matched to different screen adjustment positions. Referring to fig. 4, target areas at different positions indicate screen extensions at different positions.
Step S102, determining the gaze target from the first eye movement data, judging whether the gaze target is directly facing one of the target areas of the vehicle-mounted screen, and outputting the judgment result, comprises the following steps:
dividing the vehicle-mounted screen into a plurality of target areas in advance, different target areas being matched to different screen adjustment positions; and matching the currently selected area fed back by the AR glasses against each pre-divided target area. If the currently selected area matches one of the target areas, the current gaze target is directly facing one of the target areas of the vehicle-mounted screen.
Before the AR glasses feed back the currently selected area, the virtual cursor of the AR glasses is moved within the field of view using the eye-tracking technique, and the cursor's movement area is marked in real time to generate the first eye movement data.
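The real-time marking of the virtual cursor can be sketched as follows. The sample format (timestamped normalized coordinates) and the rectangle encoding of target areas are assumptions, not specified in the disclosure:

```python
def in_area(point, area):
    """area is (x0, y0, x1, y1) in assumed normalized screen coordinates."""
    x, y = point
    x0, y0, x1, y1 = area
    return x0 <= x <= x1 and y0 <= y <= y1

def mark_cursor_trail(samples, target_areas):
    """samples: list of (timestamp_ms, (x, y)) virtual-cursor positions.
    Marks each sample with the index of the target area it falls in (if any)
    and remembers the last area entered as the currently selected area."""
    trail, selected = [], None
    for t, point in samples:
        idx = next((i for i, a in enumerate(target_areas)
                    if in_area(point, a)), None)
        trail.append({"t": t, "point": point, "area": idx})
        if idx is not None:
            selected = idx  # latest target area the cursor entered
    return {"trail": trail, "selected_area": selected}
```

The returned structure plays the role of the "first eye movement data": the marked trail plus the currently selected area that the head unit later matches against the pre-divided target areas.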
In one embodiment, when the wearer wants to expand the screen, the line of sight falls on one of the target areas pre-divided on the vehicle-mounted screen. For example, an expandable edge of the vehicle-mounted screen is set as a target area; the eye-tracking system associated with the AR glasses moves the virtual cursor to the expandable edge, which is then highlighted in the AR glasses in another color, for example green. At this moment, the AR glasses have in effect sent the head unit a first interaction instruction for controlling screen expansion.
In one embodiment, step S103, receiving the second interaction instruction sent by the AR glasses when the judgment result indicates that the gaze target is directly facing one of the target areas, and extracting the second eye movement data of the wearer from it, comprises:
in response to the gaze target directly facing one of the target areas, feeding back a selection-confirmation instruction to the AR glasses, so that the wearer performs a preset eye movement according to the selection-confirmation instruction, and the AR glasses output the second interaction instruction after generating the second eye movement data from that eye movement; and receiving the second interaction instruction sent by the AR glasses and extracting the second eye movement data from it. In one embodiment, the preset eye movement may be a gaze dwell of 1.5 seconds; this embodiment does not limit it, and it may be set according to the user's habit, for example confirmation by blinking twice.
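Detecting the 1.5-second dwell from timestamped gaze samples can be sketched as below. The threshold comes from the text; the sample format and rectangle encoding are assumptions:

```python
# Sketch of the preset confirmation gesture: True if the gaze stays inside
# `area` (x0, y0, x1, y1) continuously for at least `dwell_ms` milliseconds.
def is_dwell_confirmation(samples, area, dwell_ms=1500):
    """samples: list of (timestamp_ms, (x, y)) gaze points, in time order."""
    start = None
    for t, (x, y) in samples:
        inside = area[0] <= x <= area[2] and area[1] <= y <= area[3]
        if inside:
            if start is None:
                start = t               # dwell begins when the gaze enters
            if t - start >= dwell_ms:
                return True             # continuous dwell reached the threshold
        else:
            start = None                # leaving the area resets the dwell
    return False
```

A blink-twice confirmation would replace this predicate with one over blink events; the head unit only needs a boolean "preset pattern matched" signal either way.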
In one embodiment, step S104, controlling the driving motor to start when the second eye movement data matches the preset eye movement pattern, so as to drive the vehicle-mounted screen to expand to the predetermined size corresponding to the target area, comprises:
when the gaze target is directly facing one of the target areas, determining the screen adjustment position of the vehicle-mounted screen; and when the second eye movement data matches the preset eye movement pattern, generating a start signal and, after the driving motor is started, controlling the vehicle-mounted screen to expand or contract to the required size.
Further, when the judgment result indicates that the gaze target is directly facing one of the target areas, the step of receiving the second interaction instruction sent by the AR glasses and extracting the second eye movement data of the wearer further comprises: in response to the gaze target directly facing one of the target areas, feeding back the expandable range of the vehicle-mounted screen to the AR glasses, so that the wearer adjusts the gaze target within the expandable range and then performs the preset eye movement, whereupon the AR glasses generate the second eye movement data from that eye movement.
In one embodiment, the vehicle-mounted screen is a scroll-type flexible screen configured with a driving motor. The scroll-type flexible screen comprises a screen body and at least one screen extension of a given size. Referring to fig. 5, the screen body and the driving motor of the vehicle-mounted screen are fixedly mounted on the center console; fig. 6 is a schematic diagram of the change in the screen's expansion state. The screen body 41 is movably connected to the screen extension 42, and when the driving motor starts, it drives the screen extension 42 and the screen body 41 to move relative to each other. Further, the screen body 41 is divided into a plurality of target areas in advance (a single target area may also be set); when the wearer needs to adjust the screen size, the gaze target only needs to be aligned with a target area, so that passengers in different seat positions can select different target areas.
When the wearer expands the vehicle-mounted screen, the line of sight falls on a target area of the screen, and the AR glasses display feedback for the selected target area, for example showing it in green. The wearer confirms the selection with the preset eye movement (for example, a dwell of 1.5 seconds); the AR glasses then give feedback by displaying the expandable range of the current screen adjustment position, for example by changing the color of the selected area again. The wearer's gaze then moves within the expandable range and, once the desired size is reached, dwells for another 1.5 seconds to confirm. The AR glasses generate the second eye movement data from these eye movements and feed it back to the head unit, which controls the driving motor to start and executes the expansion of the vehicle-mounted screen.
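The final motor step, clamping the wearer's requested size to the fed-back expandable range and commanding the motor, can be sketched as follows. The millimetre units, the sign convention, and the command callback are assumptions for illustration:

```python
# Sketch: clamp the requested extension to [0, max_extension_mm] and command
# the motor by the signed difference; positive expands, negative contracts.
def expand_screen(requested_mm, current_mm, max_extension_mm, send_motor_cmd):
    """Returns the target extension actually commanded after clamping."""
    target = max(0.0, min(requested_mm, max_extension_mm))
    delta = target - current_mm
    if delta:
        send_motor_cmd(delta)  # drive the motor by the signed difference
    return target
```

Clamping enforces the expandable range fed back to the AR glasses, so a gaze target beyond the screen's mechanical limit simply expands to the limit rather than over-driving the motor.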
In the above control method of the vehicle-mounted screen, upon receiving the first interaction instruction sent by the AR glasses, the first eye movement data of the wearing object is extracted from the first interaction instruction; a sight-line target is determined from the first eye movement data, it is judged whether the sight-line target directly faces one of the target areas of the vehicle-mounted screen, and the judgment result is output; when the sight-line target in the judgment result directly faces one of the target areas, a second interaction instruction sent by the AR glasses is received, and the second eye movement data of the wearing object is extracted from it; and when the second eye movement data matches the preset eye movement pattern, the drive motor is controlled to start so as to expand or contract the vehicle-mounted screen. According to the application, the AR glasses collect the first eye movement data of the wearing object and send it to the vehicle machine for real-time monitoring, and the expansion or contraction of the vehicle-mounted screen is controlled on the basis of the first and second eye movement data from the AR glasses, so that the occupant's requirement on the size of the vehicle-mounted screen is met.
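The two-step flow summarized in this paragraph — the first instruction selects a target area, and a preset eye-movement pattern in the second instruction starts the motor — can be sketched as a small state machine. All names, and modeling the sight-line target and eye movement as plain strings, are assumptions for illustration:

```python
from enum import Enum, auto


class State(Enum):
    IDLE = auto()
    AREA_SELECTED = auto()
    MOTOR_STARTED = auto()


class ScreenController:
    """Minimal sketch of the vehicle-machine side of the two-step flow."""

    def __init__(self, target_areas, preset_pattern):
        self.target_areas = target_areas      # pre-divided areas of the screen body
        self.preset_pattern = preset_pattern  # e.g. a 1.5 s dwell (assumed encoding)
        self.state = State.IDLE

    def on_first_instruction(self, sight_target):
        # First interaction instruction: accept only a gaze that directly
        # faces one of the target areas; otherwise stay idle.
        if sight_target in self.target_areas:
            self.state = State.AREA_SELECTED
        return self.state

    def on_second_instruction(self, eye_movement):
        # Second interaction instruction: the preset eye-movement pattern
        # stands in for generating the start signal to the drive motor.
        if self.state is State.AREA_SELECTED and eye_movement == self.preset_pattern:
            self.state = State.MOTOR_STARTED
        return self.state


ctrl = ScreenController({"area_1", "area_2"}, "dwell_1_5s")
```

Any other eye movement in the second step leaves the controller in `AREA_SELECTED`, matching the method's requirement that only the preset pattern triggers the motor.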
It should be understood that, although the steps in the flowcharts of fig. 2-6 are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution of these steps is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in fig. 2-6 may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different moments, and these sub-steps or stages are not necessarily performed sequentially but may be performed in turn or alternately with at least a portion of other steps or of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 7, there is provided a control device of an in-vehicle screen, including: a first extraction module 201, a line-of-sight judgment module 202, a second extraction module 203, and a screen adjustment module 204, wherein:
a first extraction module 201, configured to receive a first interaction instruction sent by the AR glasses, and extract first eye movement data of a wearing object in the first interaction instruction;
a sight-line judging module 202, configured to determine a sight-line target according to the first eye movement data, judge whether the sight-line target directly faces one of the target areas of the vehicle-mounted screen, and output a judgment result;
a second extraction module 203, configured to receive a second interaction instruction sent by the AR glasses when the sight-line target in the judgment result directly faces one of the target areas, and extract second eye movement data of the wearing object from the second interaction instruction;
and the screen adjusting module 204 is configured to control the driving motor to start when the second eye movement data is in the preset eye movement mode, so as to adjust expansion or contraction of the vehicle-mounted screen.
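As an illustrative guess at what the screen adjustment module 204 might compute before starting the drive motor — clamping the requested size to the expandable range and choosing a motor direction — consider the following; the function name, units, and tuple return format are assumed, not taken from the patent:

```python
def motor_command(current_width, requested_width, min_width, max_width):
    """Clamp the requested screen width to the expandable range
    [min_width, max_width] and decide the drive-motor direction."""
    target = max(min_width, min(max_width, requested_width))
    if target > current_width:
        return ("expand", target - current_width)
    if target < current_width:
        return ("contract", current_width - target)
    return ("hold", 0.0)
```

Clamping first means a sight-line target outside the expandable range fed back by the AR glasses can never drive the screen past its mechanical limits.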
For specific limitations of the control device of the vehicle-mounted screen, reference may be made to the above limitations of the control method of the vehicle-mounted screen, which are not repeated here. The respective modules in the control device of the vehicle-mounted screen described above may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware in, or independent of, a processor in the computer device, or may be stored in software form in a memory of the computer device, so that the processor can invoke and execute the operations corresponding to each module.
In one embodiment, a computer device is provided, which may be a server, and the internal structure of which may be as shown in fig. 8. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer device is for storing eye movement data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by a processor, implements a method of controlling an on-board screen.
It will be appreciated by those skilled in the art that the structure shown in FIG. 8 is merely a block diagram of some of the structures associated with the present inventive arrangements and is not limiting of the computer device to which the present inventive arrangements may be applied, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In one embodiment, a computer device is provided comprising a memory and a processor, the memory having stored therein a computer program, the processor when executing the computer program performing the steps of:
receiving a first interaction instruction sent by AR (augmented reality) glasses, and extracting first eye movement data of a wearing object in the first interaction instruction; determining a sight-line target according to the first eye movement data, judging whether the sight-line target is right opposite to one of target areas of the vehicle-mounted screen, and outputting a judging result; when the sight line target in the judging result is right opposite to one of the target areas, receiving a second interaction instruction sent by the AR glasses, and extracting second eye movement data of the wearing object in the second interaction instruction; and when the second eye movement data is in the preset eye movement mode, controlling the driving motor to start so as to adjust the expansion or contraction of the vehicle-mounted screen.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
receiving a first interaction instruction sent by AR (augmented reality) glasses, and extracting first eye movement data of a wearing object in the first interaction instruction; determining a sight-line target according to the first eye movement data, judging whether the sight-line target is right opposite to one of target areas of the vehicle-mounted screen, and outputting a judging result; when the sight line target in the judging result is right opposite to one of the target areas, receiving a second interaction instruction sent by the AR glasses, and extracting second eye movement data of the wearing object in the second interaction instruction; and when the second eye movement data is in the preset eye movement mode, controlling the driving motor to start so as to adjust the expansion or contraction of the vehicle-mounted screen.
Those skilled in the art will appreciate that all or part of the above-described methods may be implemented by a computer program instructing related hardware; the computer program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory may include Read-Only Memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory may include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM), among others.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above examples illustrate only a few embodiments of the application, which are described in detail and are not to be construed as limiting the scope of the application. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the application, which are all within the scope of the application. Accordingly, the scope of protection of the present application is to be determined by the appended claims.

Claims (10)

1. A control method of a vehicle-mounted screen, applied to a vehicle machine, the vehicle machine being connected with AR glasses and a drive motor, and the drive motor being used for adjusting the size of the vehicle-mounted screen, characterized in that the method comprises the following steps:
receiving a first interaction instruction sent by AR (augmented reality) glasses, and extracting first eye movement data of a wearing object in the first interaction instruction;
determining a sight-line target according to the first eye movement data, judging whether the sight-line target is right opposite to one of target areas of the vehicle-mounted screen, and outputting a judging result;
when the sight line target in the judging result is right opposite to one of the target areas, receiving a second interaction instruction sent by the AR glasses, and extracting second eye movement data of the wearing object in the second interaction instruction;
and when the second eye movement data is in the preset eye movement mode, controlling the driving motor to start so as to adjust the expansion or contraction of the vehicle-mounted screen.
2. The control method of an in-vehicle screen according to claim 1, wherein the step of determining a sight-line target from the first eye movement data includes:
determining a sight line target corresponding to the first eye movement data by utilizing an eye movement tracking technology;
determining the seat position of the current wearing object according to the sight line target;
and when the seat position is the driving position, selecting whether to execute the pre-configured flow according to the current running state of the vehicle; when the seat position is not the driving position, executing the pre-configured flow directly.
3. The control method of the in-vehicle screen according to claim 2, characterized in that the step of selecting whether to execute the pre-configuration flow according to the current running state of the vehicle when the seat position is the driving position, includes:
judging whether the seat position is a driving position or not, and outputting a judging result;
if the seat position in the judgment result is the driving position, acquiring current state data of the vehicle;
if the state data of the vehicle is in a driving state, interrupting interaction with the AR glasses;
and if the state data of the vehicle is in a parking state, extracting a sight line target in the first eye movement data.
4. The method according to claim 1, wherein the step of determining a sight line target based on the first eye movement data, determining whether the sight line target is facing one of the target areas of the vehicle-mounted screen, and outputting the determination result comprises:
dividing a vehicle-mounted screen into a plurality of target areas in advance, wherein different target areas are used for matching different screen adjusting positions;
matching the current selected area fed back by the AR glasses with each target area divided in advance, and if the current selected area is matched with one of the target areas, indicating that the current sight target is opposite to one of the target areas of the vehicle-mounted screen;
before the AR glasses feed back the currently selected area, the virtual cursor of the AR glasses is moved within the field of view by means of eye tracking, and the real-time movement area of the virtual cursor is marked to generate the first eye movement data.
5. The method for controlling a vehicle-mounted screen according to claim 1, wherein the step of receiving a second interaction instruction sent by the AR glasses when the sight-line target is facing one of the target areas in the judgment result, and extracting second eye movement data of the wearing object in the second interaction instruction comprises:
responding to the fact that the sight line target is right opposite to one of the target areas, feeding back a determining selection instruction to the AR glasses, so that the wearing object executes preset eye movements according to the determining selection instruction, and outputting a second interaction instruction after the AR glasses generate second eye movement data according to the eye movements;
and receiving a second interaction instruction sent by the AR glasses, and extracting second eye movement data in the second interaction instruction.
6. The method for controlling an on-vehicle screen according to claim 1, wherein the step of controlling the driving motor to be started to adjust the expansion or contraction of the on-vehicle screen when the second eye movement data is a preset eye movement pattern comprises:
when the sight line target is opposite to one of the target areas, determining a screen adjusting position of the vehicle-mounted screen;
when the second eye movement data is in the preset eye movement mode, a starting signal is generated, and after the driving motor is controlled to start, the vehicle-mounted screen is controlled to expand or contract.
7. The method for controlling a vehicle-mounted screen according to claim 5, wherein when the sight-line target is opposite to one of the target areas in the judgment result, the step of receiving a second interaction instruction sent by the AR glasses, and extracting second eye movement data of the wearing object in the second interaction instruction further comprises:
responding to the fact that the sight line target is right opposite to one of the target areas, feeding back the expandable range of the vehicle-mounted screen to the AR glasses, so that the wearing object can execute preset eye movements after adjusting the sight line target according to the expandable range, and the AR glasses can generate second eye movement data according to the eye movements.
8. A control device of an in-vehicle screen, characterized by comprising:
the first extraction module is used for receiving a first interaction instruction sent by the AR glasses and extracting first eye movement data of a wearing object in the first interaction instruction;
the sight line judging module is used for determining a sight line target according to the first eye movement data, judging whether the sight line target is right opposite to one of target areas of the vehicle-mounted screen, and outputting a judging result;
the second extraction module is used for receiving a second interaction instruction sent by the AR glasses when the sight line target is right opposite to one of the target areas in the judgment result, and extracting second eye movement data of the wearing object in the second interaction instruction;
and the screen adjusting module is used for controlling the driving motor to start when the second eye movement data is in a preset eye movement mode so as to adjust the expansion or contraction of the vehicle-mounted screen.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 7 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method according to any one of claims 1 to 7.
CN202311057078.2A 2023-08-18 2023-08-18 Control method and device of vehicle-mounted screen, computer equipment and storage medium Pending CN117055734A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311057078.2A CN117055734A (en) 2023-08-18 2023-08-18 Control method and device of vehicle-mounted screen, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117055734A true CN117055734A (en) 2023-11-14

Family

ID=88664152

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311057078.2A Pending CN117055734A (en) 2023-08-18 2023-08-18 Control method and device of vehicle-mounted screen, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117055734A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117755076A (en) * 2023-12-26 2024-03-26 广州车全影电子科技有限公司 Automobile instrument panel control method and device, storage medium and electronic equipment
CN117755076B (en) * 2023-12-26 2024-05-31 广州车全影电子科技有限公司 Automobile instrument panel control method and device, storage medium and electronic equipment


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination