CN114115656A - Vehicle-mounted augmented reality vehicle window entertainment system and operation method thereof - Google Patents


Info

Publication number
CN114115656A
CN114115656A
Authority
CN
China
Prior art keywords
augmented reality
vehicle
window
entertainment
state data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111386673.1A
Other languages
Chinese (zh)
Inventor
卜烨雯
吴文斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SAIC Volkswagen Automotive Co Ltd
Original Assignee
SAIC Volkswagen Automotive Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SAIC Volkswagen Automotive Co Ltd filed Critical SAIC Volkswagen Automotive Co Ltd
Priority to CN202111386673.1A priority Critical patent/CN114115656A/en
Publication of CN114115656A publication Critical patent/CN114115656A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R2011/0001 Arrangements for holding or mounting articles, not otherwise provided for, characterised by position
    • B60R2011/0003 Arrangements for holding or mounting articles, not otherwise provided for, characterised by position inside the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a vehicle-mounted augmented reality window entertainment system and an operation method thereof, wherein the system comprises: a face detection and analysis module, used for detecting and analyzing real-time image data of passengers to generate face information and state data; an outside-vehicle real-scene analysis module, used for collecting and analyzing real-scene image data outside the vehicle and generating a current attention area image based on the face state data; an entertainment content generation and operation module, used for generating an entertainment content image and corresponding function logic according to the face information and state data and the current attention area image, and for receiving the user's operation instructions on the function logic; and an augmented reality module, used for generating augmented reality display content from the entertainment content image and projecting it onto the vehicle window.

Description

Vehicle-mounted augmented reality vehicle window entertainment system and operation method thereof
Technical Field
The invention mainly relates to the field of internet automobiles, in particular to a vehicle-mounted augmented reality window entertainment system and an operation method thereof.
Background
With the increasing popularity of household vehicles, more and more families choose self-driving as their mode of transportation for long- or short-distance travel, and passengers need a variety of entertainment to pass the otherwise tedious time on board. This is especially true when children ride in the vehicle: if a child cannot be soothed and begins crying and screaming, the driver's mood is likely to be affected, which in turn seriously compromises driving safety.
At present, people usually choose among a few in-vehicle entertainment options: sleeping, chatting, reading, or watching a mobile phone or an entertainment screen in the cabin. Sleeping and chatting can fill the time for a while, but cannot remain effective over a long period; reading or watching a screen in a swaying cabin is very harmful to eyesight. For passengers with weak balance tolerance, the resulting vestibular stimulation can cause dizziness, vomiting, and other discomfort, degrading the riding experience.
Disclosure of Invention
The invention aims to provide a vehicle-mounted augmented reality window entertainment system and an operation method thereof, which meet the leisure requirements of passengers in the driving process of a vehicle and improve the riding experience.
In order to solve this technical problem, the invention provides a vehicle-mounted augmented reality window entertainment system, which comprises: a face detection and analysis module, used for detecting and analyzing real-time image data of passengers to generate face information and state data; an outside-vehicle real-scene analysis module, used for collecting and analyzing real-scene image data outside the vehicle and generating a current attention area image based on the face state data; an entertainment content generation and operation module, used for generating an entertainment content image and corresponding function logic according to the face information and state data and the current attention area image, and for receiving the user's operation instructions on the function logic; and an augmented reality module, used for generating augmented reality display content from the entertainment content image and projecting it onto the vehicle window.
In an embodiment of the present invention, the face information and the state data include face basic information, head orientation, gaze direction and field of view range.
In an embodiment of the present invention, the face detection is implemented by an in-vehicle camera device, and the real-scene data outside the vehicle is collected by an exterior camera device.
In an embodiment of the invention, the projection is realized by an optical glass projector.
In an embodiment of the present invention, the user's operation control instructions are obtained through a point-touch sensing device.
In one embodiment of the invention, the point touch sensing device comprises a window point touch sensor.
In an embodiment of the invention, the vehicle-mounted augmented reality window entertainment system further includes a seat control device, used for receiving a control signal sent by the entertainment content generation and operation module and performing a corresponding action according to that control signal.
In an embodiment of the invention, the control signal includes a position and direction adjustment command.
In an embodiment of the invention, the entertainment content image includes annotation and description information of the scenery in the current region of interest image.
The invention also provides an operation method of the vehicle-mounted augmented reality window entertainment system, which comprises the following steps:
detecting and analyzing real-time image data of passengers to generate face information and state data; collecting and analyzing real-scene image data outside the vehicle, and generating a current attention area image based on the face state data; generating an entertainment content image and corresponding function logic according to the face information and state data and the current attention area image, and receiving the user's operation instructions on the function logic; and generating augmented reality display content from the entertainment content image, and projecting it onto a vehicle window.
Compared with the prior art, the invention has the following advantages: while the vehicle is running, the scenery outside the vehicle is combined with the attention area of the passengers inside to generate an entertainment content image, together with corresponding operation function logic, thereby meeting the passengers' leisure needs during the journey and improving the riding experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the principle of the application. In the drawings:
fig. 1 is a schematic composition diagram of an in-vehicle augmented reality window entertainment system according to an embodiment of the present application.
Fig. 2 is a flowchart of an operation method of the vehicle-mounted augmented reality window entertainment system according to an embodiment of the present application.
FIG. 3 is a schematic layout diagram of components of an in-vehicle augmented reality window entertainment system according to an embodiment of the present application.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. The drawings in the following description are merely examples or embodiments of the application, based on which a person of ordinary skill in the art could, without inventive effort, apply the application to other similar scenarios. Unless otherwise apparent from the context or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; these steps and elements do not form an exclusive list, and a method or apparatus may include other steps or elements.
The relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present application unless specifically stated otherwise. Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description. Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate. In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
In the description of the present application, it is to be understood that the orientations or positional relationships indicated by directional terms such as "front, rear, upper, lower, left, right," "lateral, vertical, horizontal," and "top, bottom" are generally based on the orientations or positional relationships shown in the drawings and are used only for convenience and simplicity of description. Unless stated to the contrary, these directional terms do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore should not be considered as limiting the scope of the present application; the terms "inner" and "outer" refer to the inside and outside relative to the profile of the respective component itself.
Spatially relative terms, such as "above," "over," "on top of," and the like, may be used herein for ease of description to describe one device or feature's spatial relationship to another device or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is turned over, devices described as "above" or "on" other devices or configurations would then be oriented "below" or "under" the other devices or configurations. Thus, the exemplary term "above" can encompass both an orientation of "above" and an orientation of "below." The device may be otherwise oriented (rotated 90 degrees or at other orientations), and the spatially relative descriptors used herein should be interpreted accordingly.
It should be noted that terms such as "first" and "second" are used only for convenience in distinguishing the corresponding components; unless otherwise stated, these terms have no special meaning and therefore should not be construed as limiting the scope of protection of the present application. Further, although the terms used in the present application are selected from publicly known and commonly used terms, some of them may have been selected at the applicant's discretion, and their detailed meanings are described in the relevant parts of the description herein. The application should therefore be understood not only through the actual terms used but also through the meaning each term carries.
It will be understood that when an element is referred to as being "on," "connected to," "coupled to" or "contacting" another element, it can be directly on, connected or coupled to, or contacting the other element or intervening elements may be present. In contrast, when an element is referred to as being "directly on," "directly connected to," "directly coupled to" or "directly contacting" another element, there are no intervening elements present. Similarly, when a first component is said to be "in electrical contact with" or "electrically coupled to" a second component, there is an electrical path between the first component and the second component that allows current to flow. The electrical path may include capacitors, coupled inductors, and/or other components that allow current to flow even without direct contact between the conductive components.
Flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the operations are not necessarily performed in the exact order shown. Rather, various steps may be processed in reverse order or simultaneously, and other operations may be added to or removed from these processes.
Embodiments of the present application describe an in-vehicle augmented reality window entertainment system.
Fig. 1 is a schematic composition diagram of an in-vehicle augmented reality window entertainment system according to an embodiment of the present application.
Fig. 2 is a flowchart of an operation method of the vehicle-mounted augmented reality window entertainment system according to an embodiment of the present application.
Referring to fig. 1, an in-vehicle augmented reality window entertainment system 100 includes a face detection and analysis module 102, an out-of-vehicle real-world analysis module 104, an entertainment content generation and operation module 106, and an augmented reality module 108.
In some embodiments, the face detection and analysis module 102 is configured to detect and analyze the real-time image data of the passenger to generate face information and status data.
In one embodiment, the face information and state data include basic face information, head orientation, gaze direction, and field-of-view range. The basic face information includes, for example, the gender and the age group or age corresponding to the face, obtained from detection and analysis of the face. In practical applications, this function may be implemented by specialized service providers, for example products of enterprises in the artificial intelligence industry. Data such as the head orientation, gaze direction, and field-of-view range can be obtained by analyzing and computing the face data.
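The relationship between head-pose angles and the derived gaze data can be sketched as follows. This is a minimal geometric illustration, not the algorithm specified by the patent (which leaves the computation to the implementer); every function and parameter name here is an assumption.

```python
import math

def gaze_state(yaw_deg, pitch_deg, fov_half_angle_deg=30.0):
    """Derive a unit gaze-direction vector and a field-of-view cone from
    head-pose angles. Hypothetical sketch: the patent does not specify
    how gaze direction or field-of-view range are computed."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Vehicle coordinates: x forward, y left, z up. Yaw turns the head
    # left/right, pitch tilts it up/down.
    direction = (math.cos(pitch) * math.cos(yaw),
                 math.cos(pitch) * math.sin(yaw),
                 math.sin(pitch))
    return {"direction": direction, "fov_half_angle_deg": fov_half_angle_deg}

# A passenger looking 45 degrees to the left, level with the horizon.
state = gaze_state(yaw_deg=45.0, pitch_deg=0.0)
```

The cone half-angle stands in for the "field-of-view range"; a real system would estimate these angles from facial landmarks.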
In some embodiments, the off-board live-action analysis module 104 is configured to collect and analyze the off-board live-action image data and generate the current region-of-interest image based on the face state data.
The face detection is implemented through an in-vehicle camera device, and the real-scene data outside the vehicle is collected through an exterior camera device. The in-vehicle camera device is located, for example, in front of a single passenger, or centered in front of the passengers on both sides. The exterior camera devices can be positioned at different locations on the vehicle body and arranged and installed as required.
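One way to turn the face state data into a "current attention area image" is to map the gaze angle onto a crop window of the exterior camera frame. The linear angle-to-pixel mapping below and all parameter values are illustrative assumptions, not the patent's method:

```python
def attention_roi(gaze_yaw_deg, frame_width, frame_height,
                  camera_fov_deg=120.0, roi_width=640, roi_height=480):
    """Map the passenger's horizontal gaze angle onto a crop window of
    the exterior camera frame, approximating the current attention
    area. Hypothetical sketch; names and values are assumptions."""
    # 0 degrees (looking straight out the window) maps to frame centre;
    # clamp so the crop always stays inside the frame.
    frac = 0.5 + gaze_yaw_deg / camera_fov_deg
    frac = min(max(frac, 0.0), 1.0)
    centre_x = int(frac * frame_width)
    left = min(max(centre_x - roi_width // 2, 0), frame_width - roi_width)
    top = (frame_height - roi_height) // 2
    return left, top, roi_width, roi_height

# Centre crop of a 1920x1080 exterior frame for a level, forward gaze.
roi = attention_roi(0.0, 1920, 1080)
```

A production system would instead project the 3D gaze ray through a calibrated camera model, but the crop-window output is the same kind of artifact.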
In some embodiments, the entertainment content generation and operation module 106 is configured to generate an entertainment content image and corresponding function logic according to the face information and state data and the current attention area image, and to receive the user's operation instructions on the function logic. The user's operation control instructions are acquired through a point-touch sensing device. The point-touch sensing device includes, for example, a window point-touch sensor, and may also be a point-touch sensor located elsewhere in the vehicle passenger compartment.
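Routing a touch on the window glass to the function logic of the displayed content can be sketched as a simple hit test. The region list and handler scheme are assumptions; the patent states only that operation instructions are obtained through a point-touch sensing device:

```python
def dispatch_touch(x, y, interactive_regions):
    """Route a point-touch at (x, y) on the window to the function logic
    bound to the touched region. Hypothetical region/handler schema."""
    for region in interactive_regions:
        left, top, width, height = region["box"]
        if left <= x < left + width and top <= y < top + height:
            return region["on_touch"]()  # invoke the bound function logic
    return None  # touch fell outside every interactive region

# One interactive caption region; touching it expands the caption.
regions = [{"box": (100, 200, 300, 150),
            "on_touch": lambda: "caption_expanded"}]
result = dispatch_touch(150, 250, regions)
```

In practice the window sensor's coordinates would first be calibrated against the projector's image coordinates so the hit test and the projected content line up.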
In some embodiments, an augmented reality (AR) module 108 is configured to generate augmented reality display content from the entertainment content imagery and project it onto a vehicle window. The projection is realized, for example, by an optical glass projector.
The window includes, for example, a side window of a vehicle. The optical glass projector is located, for example, below a window of a vehicle and inside a door of the vehicle.
In some embodiments, the vehicle-mounted augmented reality window entertainment system of the present application further comprises a seat control device. The seat control device is used for receiving the control signal sent by the entertainment content generation and operation module and carrying out corresponding action according to the control signal. The control signal includes, for example, a position and direction adjustment command for the seat.
In some embodiments, the entertainment content imagery includes annotation and description information for the scene in the current region of interest image. The caption information includes, for example, an animated video stream corresponding to the scene.
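Generating the annotation and description overlays can be thought of as joining detected landmarks against a content database. The detection and knowledge-base schemas below are hypothetical placeholders for whatever recognition service and content source an implementation uses:

```python
def annotate_scene(detections, knowledge_base):
    """Attach title/caption entries to landmarks detected in the current
    region-of-interest image. Schemas are illustrative assumptions."""
    overlays = []
    for det in detections:
        info = knowledge_base.get(det["label"])
        if info is None:
            continue  # nothing to say about this object; skip it
        overlays.append({"box": det["box"],  # where to draw on the window
                         "title": info["title"],
                         "caption": info["caption"]})
    return overlays

# Only objects present in the knowledge base receive an overlay.
detections = [{"label": "pagoda", "box": (40, 60, 200, 320)},
              {"label": "truck", "box": (0, 0, 50, 50)}]
knowledge_base = {"pagoda": {"title": "Riverside Pagoda",
                             "caption": "A historic tower; tap for an animated introduction."}}
overlays = annotate_scene(detections, knowledge_base)
```

The caption field could equally reference an animated video stream, as the embodiment above describes.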
The in-vehicle camera device, the exterior camera device, the point-touch sensing device, the optical glass projector, and the like are coupled to the vehicle's head unit or central control system, so that data is processed centrally and the corresponding display effect is output on the vehicle window under its control.
In some application scenarios, the vehicle-mounted augmented reality window entertainment system of the present application can provide multiple kinds of scene display and interaction to passengers in the vehicle. For example, in an augmented reality game, game content is projected into the real scene in the window area and changes as the vehicle travels. The system can also be used for children's science education: it marks and explains objects in the real scene outside the window, and can be combined with animated projections for visual teaching. In addition, it can provide scenic guidance along the route, marking and explaining buildings, places of interest, historic sites, and scenic spots that the vehicle passes, accompanied by images or video clips for display and introduction.
The application also provides an operation method of the vehicle-mounted augmented reality vehicle window entertainment system.
As mentioned above, fig. 2 is a flowchart of an operation method of the vehicle-mounted augmented reality window entertainment system according to an embodiment of the present application.
As shown in fig. 2, the operation method of the vehicle-mounted augmented reality window entertainment system includes: step 101, detecting and analyzing real-time image data of passengers to generate face information and state data; step 102, collecting and analyzing real-scene image data outside the vehicle, and generating a current attention area image based on the face state data; step 103, generating an entertainment content image and corresponding function logic according to the face information and state data and the current attention area image, and receiving the user's operation instructions on the function logic; and step 104, generating augmented reality display content from the entertainment content image, and projecting it onto a vehicle window.
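One iteration of the four-step method can be sketched as a pipeline over the four modules of fig. 1. The module interfaces are placeholders: the patent defines the functional split, not concrete signatures, so every name below is an assumption.

```python
def run_entertainment_cycle(cabin_frame, exterior_frame,
                            face_module, scene_module,
                            content_module, ar_module):
    """Run steps 101-104 once. Each argument after the two frames is a
    callable standing in for the corresponding module of fig. 1."""
    face_info, state_data = face_module(cabin_frame)             # step 101
    roi_image = scene_module(exterior_frame, state_data)         # step 102
    content = content_module(face_info, state_data, roi_image)   # step 103
    ar_module(content)                            # step 104: project onto window
    return content

# Stub modules standing in for the real detectors and renderers.
projected = []
face_module = lambda frame: ({"age_group": "child"}, {"gaze": 0.0})
scene_module = lambda frame, state: f"roi({frame},{state['gaze']})"
content_module = lambda info, state, roi: {"roi": roi, "viewer": info}
ar_module = projected.append

content = run_entertainment_cycle("cabin", "exterior",
                                  face_module, scene_module,
                                  content_module, ar_module)
```

In a real system this cycle would run continuously, with step 103 also consuming touch events from the point-touch sensing device between frames.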
FIG. 3 is a schematic layout diagram of components of an in-vehicle augmented reality window entertainment system according to an embodiment of the present application.
As illustrated in fig. 3, inside a vehicle passenger compartment 300, 301 is, for example, a front seat, 302 is, for example, a rear seat, and 303 is a window glass region.
The layout of the in-vehicle augmented reality window entertainment system includes, for example, an in-vehicle image pickup device 401, an out-vehicle image pickup device 402, an optical glass projector 403, a point touch sensor device 404, and a seat control device 405. Fig. 3 is a block diagram showing the region where each device is located.
As described above, the optical glass projector is located, for example, below the window and inside the door. The in-vehicle camera device is located, for example, in front of a single passenger, or centered in front of the passengers on both sides. The exterior camera devices can be positioned at different locations on the vehicle body and arranged and installed as required. The point-touch sensing device includes, for example, a window point-touch sensor, and may also be a point-touch sensor located elsewhere in the vehicle passenger compartment.
The seat control device is located, for example, inside the seat; its position in fig. 3 is indicated by a dashed line. The seat control device is used for receiving the control signal sent by the entertainment content generation and operation module and performing a corresponding action according to that control signal.
The technical solution of the present application can be applied to meet the entertainment needs of passengers while the vehicle is driving, especially on long journeys, and the entertainment it provides is rich and varied in content. Because the augmented reality display is combined with the real scene, eyesight is protected, and the physical discomfort (such as carsickness) caused by the vestibular stimulation of vehicle motion can be avoided; the solution is thus an effective means of realizing a digital intelligent cockpit.
While bringing pleasure to passengers, the solution of the present application also maintains a stable driving environment and thereby strongly supports driving safety.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing disclosure is by way of example only, and is not intended to limit the present application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present application and thus fall within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Aspects of the present application may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." The processor may be one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, or a combination thereof. Furthermore, aspects of the present application may be embodied as a computer product, including computer readable program code, on one or more computer readable media. For example, computer-readable media may include, but are not limited to, magnetic storage devices (e.g., hard disk, floppy disk, magnetic strips, etc.), optical disks (e.g., Compact Disk (CD), Digital Versatile Disk (DVD), etc.), smart cards, and flash memory devices (e.g., card, stick, key drive, etc.).
The computer readable medium may comprise a propagated data signal with the computer program code embodied therein, for example, on a baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, and the like, or any suitable combination. The computer readable medium can be any computer readable medium that can communicate, propagate, or transport the program for use by or in connection with an instruction execution system, apparatus, or device. Program code on a computer readable medium may be propagated over any suitable medium, including radio, electrical cable, fiber optic cable, radio frequency signals, or the like, or any combination of the preceding.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to imply that more features are required than are expressly recited in the claims. Indeed, an embodiment may have fewer than all of the features of a single embodiment disclosed above.
Although the present application has been described with reference to the present specific embodiments, it will be recognized by those skilled in the art that the foregoing embodiments are merely illustrative of the present application and that various changes and substitutions of equivalents may be made without departing from the spirit of the application, and therefore, it is intended that all changes and modifications to the above-described embodiments that come within the spirit of the application fall within the scope of the claims of the application.

Claims (10)

1. An in-vehicle augmented reality window entertainment system, comprising:
the face detection and analysis module is used for detecting and analyzing the real-time image data of the passengers to generate face information and state data;
the system comprises an outside real scene analysis module, a human face state data acquisition module, an outside real scene analysis module and a human face state data analysis module, wherein the outside real scene analysis module is used for acquiring and analyzing outside real scene image data and generating a current attention area image based on the human face state data;
the entertainment content generation and operation module is used for generating an entertainment content image and corresponding functional logic according to the face information and state data and the current attention area image, and for receiving the user's operation instructions for the functional logic;
and the augmented reality module is used for generating augmented reality display content according to the entertainment content image and projecting the augmented reality display content onto the car window.
2. The vehicle-mounted augmented reality window entertainment system of claim 1, wherein the face information and status data includes face base information, head orientation, gaze direction, and field of view range.
3. The vehicle-mounted augmented reality window entertainment system of claim 1, wherein the face detection is implemented by an in-vehicle camera device, and the outside real scene image data is collected by an exterior camera device.
4. The vehicle-mounted augmented reality window entertainment system of claim 1, wherein the projection is achieved by an optical glass projector.
5. The vehicle-mounted augmented reality window entertainment system of claim 1, wherein the user's operation instruction is obtained through a point-touch sensing device.
6. The vehicle-mounted augmented reality window entertainment system of claim 5, wherein the point touch sensing device comprises a window point touch sensor.
7. The vehicle-mounted augmented reality window entertainment system of claim 1, further comprising a seat control device for receiving a control signal sent by the entertainment content generation and operation module and performing a corresponding action according to the control signal.
8. The vehicle-mounted augmented reality window entertainment system of claim 7, wherein the control signal comprises a position and orientation adjustment instruction.
9. The vehicle-mounted augmented reality window entertainment system of claim 1, wherein the entertainment content image includes annotation and description information for the scene in the current region of interest image.
10. An operation method of a vehicle-mounted augmented reality window entertainment system comprises the following steps:
detecting and analyzing the real-time image data of the passengers to generate face information and state data;
acquiring and analyzing outside real scene image data, and generating a current attention area image based on the face state data;
generating an entertainment content image and corresponding functional logic according to the face information and state data and the current attention area image, and receiving the user's operation instruction for the functional logic;
and generating augmented reality display content according to the entertainment content image, and projecting the augmented reality display content onto a vehicle window.
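The four steps of the claimed operation method form a perception-to-projection pipeline: detect the passenger's face state, crop the exterior scene to the current attention area, generate annotated entertainment content, and render it to the window. As a rough illustration only, the sketch below models that flow in Python; every class, function, and parameter name (FaceState, attention_region, the assumed 120-degree exterior camera span, and so on) is a hypothetical stand-in invented for this example, not anything specified by the patent.

```python
from dataclasses import dataclass

@dataclass
class FaceState:
    """Face information and state data (cf. claim 2): basic identity,
    head orientation, gaze direction, and field-of-view range."""
    face_id: str
    head_yaw_deg: float      # head orientation relative to window normal
    gaze_dir: tuple          # gaze direction as a unit vector (x, y)
    fov_deg: float           # passenger's field-of-view range

def detect_face(frame: list) -> FaceState:
    """Stand-in for the face detection and analysis module: a real system
    would run a detector on the in-vehicle camera frame."""
    return FaceState(face_id="passenger-1", head_yaw_deg=15.0,
                     gaze_dir=(0.26, 0.97), fov_deg=60.0)

def attention_region(face: FaceState, scene_width_px: int) -> tuple:
    """Stand-in for the outside real scene analysis module: map the head
    yaw onto a horizontal crop of the exterior camera image (the current
    attention area), assuming the camera spans 120 degrees."""
    center = int(scene_width_px * (0.5 + face.head_yaw_deg / 120.0))
    half = int(scene_width_px * face.fov_deg / 120.0 / 2)
    return (max(0, center - half), min(scene_width_px, center + half))

def generate_overlay(face: FaceState, region: tuple) -> dict:
    """Stand-in for the entertainment content generation and operation
    module: annotate the attention region (cf. claim 9) and expose
    functional logic the passenger can operate."""
    return {"region": region,
            "annotation": f"Landmark ahead for {face.face_id}",
            "actions": ["more_info", "adjust_seat"]}

def render_to_window(overlay: dict) -> str:
    """Stand-in for the augmented reality module: in hardware this would
    drive the optical glass projector (cf. claim 4); here we just format
    the display content."""
    lo, hi = overlay["region"]
    return f"[window px {lo}-{hi}] {overlay['annotation']}"

# One pass of the claimed operation method (claim 10).
face = detect_face(frame=[])
region = attention_region(face, scene_width_px=1920)
overlay = generate_overlay(face, region)
print(render_to_window(overlay))
```

In a real implementation each stub would wrap its corresponding hardware module (in-vehicle camera, exterior camera, content engine, window projector), with the seat control device of claim 7 handling the "adjust_seat" action.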
CN202111386673.1A 2021-11-22 2021-11-22 Vehicle-mounted augmented reality vehicle window entertainment system and operation method thereof Pending CN114115656A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111386673.1A CN114115656A (en) 2021-11-22 2021-11-22 Vehicle-mounted augmented reality vehicle window entertainment system and operation method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111386673.1A CN114115656A (en) 2021-11-22 2021-11-22 Vehicle-mounted augmented reality vehicle window entertainment system and operation method thereof

Publications (1)

Publication Number Publication Date
CN114115656A true CN114115656A (en) 2022-03-01

Family

ID=80439492

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111386673.1A Pending CN114115656A (en) 2021-11-22 2021-11-22 Vehicle-mounted augmented reality vehicle window entertainment system and operation method thereof

Country Status (1)

Country Link
CN (1) CN114115656A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103366708A (en) * 2012-03-27 2013-10-23 冠捷投资有限公司 Transparent display with real scene tour-guide function
CN203793139U (en) * 2013-11-19 2014-08-27 惠州比亚迪电池有限公司 Control device based on automobile side window and automobile with control device
CN108995590A (en) * 2018-07-26 2018-12-14 广州小鹏汽车科技有限公司 A kind of people's vehicle interactive approach, system and device
CN109739352A (en) * 2018-12-27 2019-05-10 斑马网络技术有限公司 The methods of exhibiting and equipment of Land-scape picture
CN112498189A (en) * 2019-09-16 2021-03-16 广州汽车集团股份有限公司 Car seat coordinated control system
CN113492756A (en) * 2021-07-01 2021-10-12 中汽创智科技有限公司 Method, device, equipment and storage medium for displaying vehicle external information

Similar Documents

Publication Publication Date Title
KR102432614B1 (en) A shared experience for vehicle occupants and remote users
CN108028016B (en) Augmented reality display system
Abdi et al. In-vehicle augmented reality traffic information system: a new type of communication between driver and vehicle
CN108140311A (en) The display methods and parking aid of parking assisting information
CN110281932A (en) Travel controlling system, vehicle, drive-control system, travel control method and storage medium
TWI738132B (en) Human-computer interaction method based on motion analysis, in-vehicle device
Löcken et al. Increasing user experience and trust in automated vehicles via an ambient light display
US20200209850A1 (en) Methods and systems to facilitate monitoring center for ride share and safe testing method based for selfdriving cars to reduce the false call by deuddaction systems based on deep learning machine
Kim et al. Effects on productivity and safety of map and augmented reality navigation paradigms
CN114201038A (en) Integrated augmented reality system for sharing augmented reality content between vehicle occupants
Rong et al. Artificial intelligence methods in in-cabin use cases: A survey
Hofbauer et al. Measuring driver situation awareness using region-of-interest prediction and eye tracking
Jansen et al. Autovis: Enabling mixed-immersive analysis of automotive user interface interaction studies
US20220092860A1 (en) Extended reality for moving platforms
Feld et al. Dfki cabin simulator: A test platform for visual in-cabin monitoring functions
CN114115656A (en) Vehicle-mounted augmented reality vehicle window entertainment system and operation method thereof
KR101850857B1 (en) Display Apparatus and Vehicle Having The Same
Akaho et al. A study and evaluation on route guidance of a car navigation system based on augmented reality
CN108304244B (en) Method and device for displaying vehicle-mounted system interface
WO2019114019A1 (en) Scene generation method for self-driving vehicle and intelligent glasses
Crispim-Junior et al. AutoExp: A multidisciplinary, multi-sensor framework to evaluate human activities in self-driving cars
CN109070799A (en) Display methods and moving body display apparatus for displaying image of surroundings around moving body
JP7021899B2 (en) Image generator and image generation method
CN114327079B (en) Test driving effect presentation device, method and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220301