CN115297308B - Surrounding AR-HUD projection system and method based on unmanned aerial vehicle - Google Patents

Surrounding AR-HUD projection system and method based on unmanned aerial vehicle

Info

Publication number
CN115297308B
Authority
CN
China
Prior art keywords
vehicle
virtual
projection
unmanned aerial
aerial vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210908521.1A
Other languages
Chinese (zh)
Other versions
CN115297308A (en)
Inventor
程梁柱
舒丽
沈骏
王夏雨
李祥一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongfeng Motor Corp
Original Assignee
Dongfeng Motor Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongfeng Motor Corp filed Critical Dongfeng Motor Corp
Priority to CN202210908521.1A priority Critical patent/CN115297308B/en
Publication of CN115297308A publication Critical patent/CN115297308A/en
Application granted granted Critical
Publication of CN115297308B publication Critical patent/CN115297308B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/363Image reproducers using image projection screens
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Navigation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to the technical field of virtual reality and discloses a surrounding AR-HUD projection system and method based on an unmanned aerial vehicle. The system comprises a vehicle on which at least one physical projection surface is arranged; a projection unmanned aerial vehicle whose flying speed and direction are synchronized with the vehicle and which is provided with a projection lens for projecting a received projection picture onto the physical projection surface of the vehicle; a first memory for storing a preset virtual image model, the virtual image model comprising a virtual map and a virtual scene image corresponding to each virtual coordinate in the virtual map; and a first processor for giving the virtual coordinates of the vehicle on the virtual map according to an instruction and uploading the virtual scene image corresponding to the virtual coordinates to the projection lens as the projection picture. The invention meets the entertainment demands of passengers and drivers in the vehicle and realizes the audiovisual experience of virtual-reality movies or games, does not occupy the cabin space in the vehicle, and the picture is not blocked by the movement of the passengers or the motion of the vehicle.

Description

Surrounding AR-HUD projection system and method based on unmanned aerial vehicle
Technical Field
The invention relates to the technical field of virtual reality, in particular to a surrounding AR-HUD projection system and method based on an unmanned aerial vehicle.
Background
With the development and popularization of information technology, providing users with an immersive virtual experience based on real scenes has become a widely pursued goal. Virtual Reality refers to an interface between people and machines that artificially creates a particular environment or situation, allowing the user to interact as if actually surrounded by that situation and environment. The term virtual reality is sometimes mixed with terms such as artificial reality, cyberspace, virtual worlds, virtual environment, synthetic environment, artificial environment, augmented reality, mixed reality and the like. The purpose of virtual reality is to let people perceive and control scenes that are difficult to experience in daily life as if they were in those scenes, without having to experience them directly. The augmented reality head-up display (AR-HUD, Augmented Reality Head-Up Display) projects information onto the front windshield through a projection device in the vehicle, so that the extension lines of the reflected light form a distant virtual image in the driver's view, achieving an augmented-reality effect. This spares the driver the action of looking down at the instruments to obtain information and improves driving safety.
For vehicles with high-level automatic driving, the traditional head-up display is no longer as important while the vehicle drives itself, and in-vehicle occupants pay more attention to the entertainment facilities in the vehicle to enhance the ride experience. However, existing in-vehicle entertainment facilities are usually only on-board display screens with built-in video or games; such traditional facilities cannot provide a virtual-reality audiovisual experience and therefore do not meet the requirements of high-level automatic driving vehicles.
Disclosure of Invention
The invention aims to provide a surrounding AR-HUD projection system and method based on an unmanned aerial vehicle, which use a projection unmanned aerial vehicle accompanying the vehicle to project a projection picture onto a physical projection surface of the vehicle, so as to provide a virtual-reality audiovisual experience for the driver and passengers in the vehicle.
In a first aspect, the present invention provides an unmanned aerial vehicle-based surrounding AR-HUD projection system, comprising:
a vehicle having at least one physical projection surface;
the projection unmanned aerial vehicle is synchronous with the vehicle in flying speed and direction, and is provided with a projection lens, wherein the projection lens is used for projecting a received projection picture to a physical projection surface of the vehicle;
the first memory is used for storing a preset virtual image model, wherein the virtual image model comprises a virtual map and a virtual scene image corresponding to each virtual coordinate in the virtual map;
the first processor is used for giving virtual coordinates of the vehicle on the virtual map according to the instruction, and uploading a virtual scene image corresponding to the virtual coordinates to the projection lens as the projection picture.
Further, the plurality of physical projection surfaces comprise side window glass of the vehicle, and the plurality of projection unmanned aerial vehicles are distributed on two sides of the vehicle.
Further, the plurality of physical projection surfaces comprise a front windshield, side window glass, a rear windshield and sunroof glass of the vehicle, and the plurality of projection unmanned aerial vehicles are distributed around the vehicle.
Further, the projection unmanned aerial vehicle is provided with at least one first sensor, and the first sensor is used for acquiring relative position data of the projection unmanned aerial vehicle and the physical projection surface, and the projection unmanned aerial vehicle adjusts the self flight state based on the relative position data so as to keep the projection unmanned aerial vehicle and the physical projection surface relatively static.
Further, the vehicle is internally provided with a positioning device and a real navigation map, the real navigation map is matched with the virtual map, and the first processor gives the real coordinates of the vehicle on the real navigation map and, based on the real coordinates, gives the virtual coordinates of the vehicle on the virtual map synchronized in real time.
Further, the virtual scene image corresponding to each virtual coordinate includes at least four virtual pictures surrounding the virtual coordinate; the vehicle is provided with at least four physical projection surfaces distributed around the vehicle, and the virtual pictures are projected onto the physical projection surfaces in a one-to-one correspondence as the projection pictures.
Further, the center of the projection picture coincides with the center of the physical projection surface or lies within an allowable offset range.
Further, the vehicle is a vehicle having an L4 or L5 class autopilot function.
Further, the virtual image model comprises a movie image model or a game image model; the vehicle is internally provided with a sound box and a second memory, preset audio data matched with the movie image model or the game image model are stored in the second memory, and the first processor starts the projection unmanned aerial vehicle and the sound box according to the instruction, so as to realize synchronous output of the projection picture and the audio data.
In a second aspect, the invention provides an unmanned aerial vehicle-based surrounding AR-HUD projection method, which comprises the following steps:
setting at least one physical projection surface on the vehicle;
synchronizing the flying speed and direction of the projection unmanned aerial vehicle provided with the projection lens with the vehicle;
storing a preset virtual image model in a first memory, wherein the virtual image model comprises a virtual map and a virtual scene image corresponding to each virtual coordinate in the virtual map;
the first processor gives virtual coordinates of the vehicle on the virtual map according to the instruction, and uploads a virtual scene image corresponding to the virtual coordinates to the projection lens as the projection picture, so that the projection lens projects the received corresponding projection picture to a physical projection surface of the vehicle.
Further, the method comprises building a positioning device and a real navigation map into the vehicle and matching the real navigation map with the virtual map; the first processor gives the real coordinates of the vehicle on the real navigation map based on the positioning device and the real navigation map, and, based on the real coordinates, gives the virtual coordinates of the vehicle on the virtual map synchronized in real time while the vehicle is travelling.
The technical scheme provided by the invention has the following beneficial effects: a projection unmanned aerial vehicle flying in accompaniment around the vehicle carries a projection lens and projects the virtual scene image corresponding to the virtual coordinates of the vehicle on the virtual map onto the physical projection surface of the vehicle as the projection picture. This meets the entertainment demands of passengers and drivers in the vehicle and realizes the audiovisual experience of a virtual-reality movie or game without occupying the cabin space in the vehicle, and the picture is not blocked by the movement of the passengers or the motion of the vehicle. The unmanned aerial vehicle can also be recovered and reused, and does not affect the exterior styling of the vehicle.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of a surrounding AR-HUD projection system based on an unmanned aerial vehicle according to the present invention.
Fig. 2 is a schematic plan distribution view of a vehicle and a projection unmanned aerial vehicle based on a surrounding AR-HUD projection system of the present invention.
Fig. 3 is a schematic view of the projection unmanned aerial vehicle of Fig. 2 projecting a projection picture onto a side window glass of the vehicle.
In the figure: 10-vehicle; 11-side window glass; 20-projection unmanned aerial vehicle; 21-projection picture; 30-first processor; 40-first memory; 50-occupant.
Detailed Description
In the description of the present invention, it should be understood that the terms "center", "longitudinal", "lateral", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, are merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention.
In the description of the present invention, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly, and may mean, for example, fixedly connected, detachably connected, or integrally connected; mechanically or electrically connected; directly connected or indirectly connected through an intermediate medium; or communication between two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art in a specific case.
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
As shown in fig. 1-3, the unmanned aerial vehicle-based surrounding AR-HUD projection system provided by the present invention includes a vehicle, a projection unmanned aerial vehicle, a first memory, and a first processor.
The vehicle has at least one physical projection surface.
The flying speed and direction of the projection unmanned aerial vehicle are synchronized with the vehicle. The projection unmanned aerial vehicle can realize flight control through an existing flight control system, which controls flight data such as flying speed, acceleration, direction, azimuth angle and height, so that the projection unmanned aerial vehicle is synchronized with the vehicle and remains relatively static with respect to it.
The vehicle is also internally provided with at least one second sensor, a second processor and a second memory. The second sensor is used for acquiring driving data of the vehicle, including vehicle speed, acceleration and the like, and the second processor uploads the driving data to the second memory so that the first processor can retrieve the vehicle driving data from the second memory. The second processor may likewise call the data stored in the first memory, and the first processor and the second processor realize the operation and control of the vehicle and/or the projection unmanned aerial vehicle through this data interaction.
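For illustration only, the following Python sketch models the data hand-off described above: the second processor writes the driving data acquired by the second sensor into the second memory, and the first processor reads it back. All class and method names are hypothetical stand-ins invented for this sketch, not terms defined by the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DrivingData:
    speed_mps: float          # vehicle speed V
    acceleration_mps2: float  # vehicle acceleration

@dataclass
class SecondMemory:
    """Vehicle-side buffer holding the latest driving data."""
    latest: Optional[DrivingData] = None

class SecondProcessor:
    """Uploads the driving data acquired by the second sensor into the second memory."""
    def __init__(self, memory: SecondMemory) -> None:
        self.memory = memory

    def upload(self, data: DrivingData) -> None:
        self.memory.latest = data

class FirstProcessor:
    """Retrieves the vehicle driving data in order to drive the projection drones."""
    def __init__(self, memory: SecondMemory) -> None:
        self.memory = memory

    def read_driving_data(self) -> Optional[DrivingData]:
        return self.memory.latest

# Example hand-off: the second sensor reports 15 m/s, the first processor reads it back.
mem = SecondMemory()
SecondProcessor(mem).upload(DrivingData(speed_mps=15.0, acceleration_mps2=0.2))
print(FirstProcessor(mem).read_driving_data())
```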
In the present embodiment, the vehicle is a vehicle having an L4 or L5 level automatic driving function. L4 automatic driving refers to a level at which the user can completely disengage from driving under certain road conditions; L5 automatic driving refers to a level at which the user can completely disengage from driving under all road conditions. Because the vehicle has an L4 or L5 level automatic driving function, starting the projection system of this embodiment while the vehicle is travelling does not affect driving safety. In addition, the second sensor, the second processor and the second memory built into the vehicle may be existing components of a vehicle with an automatic driving function.
The projection unmanned aerial vehicle is provided with at least one first sensor, which is used for acquiring relative position data between the projection unmanned aerial vehicle and the physical projection surface; the projection unmanned aerial vehicle adjusts its own flight state based on the relative position data so as to remain relatively static with respect to the physical projection surface. The first sensor may be a laser ranging head or a millimeter-wave radar.
The projection unmanned aerial vehicle is provided with a projection lens, and the projection lens is used for projecting the received projection picture onto the physical projection surface of the vehicle. In practice, the flight-state adjustment based on the relative position data acquired by the first sensor also serves to keep the projection picture and the physical projection surface relatively static. The center of the projection picture coincides with the center of the physical projection surface or lies within an allowable offset range.
The first memory stores a preset virtual image model including a virtual map and a virtual scene image corresponding to each virtual coordinate in the virtual map.
The first processor is used for giving virtual coordinates of the vehicle on the virtual map according to the instruction, and uploading a virtual scene image corresponding to the virtual coordinates to the projection lens as the projection picture.
The first memory and the first processor may both be disposed on the projection unmanned aerial vehicle or may be disposed within the vehicle. The projection unmanned aerial vehicle is also connected to the vehicle in wireless communication through a wireless network module for transmitting data.
In one embodiment, the plurality of physical projection surfaces comprise the side window glasses of the vehicle, and the plurality of projection unmanned aerial vehicles are distributed on the two sides of the vehicle. For example, there are four physical projection surfaces, comprising the four side window glasses on the two sides of the vehicle, and there are two or four projection unmanned aerial vehicles distributed on the two sides of the vehicle: when there are two projection unmanned aerial vehicles, the two side window glasses on the same side of the vehicle share one projection unmanned aerial vehicle; when there are four, each side window glass of the vehicle corresponds to one projection unmanned aerial vehicle.
In one embodiment, the plurality of physical projection surfaces include the front windshield, the side window glasses, the rear windshield and the sunroof glass of the vehicle, and the plurality of projection unmanned aerial vehicles are distributed around the vehicle. For example, there are five projection unmanned aerial vehicles, corresponding respectively to the front windshield, the rear windshield, the sunroof glass and the side window glasses on the two sides; that is, the five projection unmanned aerial vehicles are distributed around and above the vehicle. When there are four physical projection surfaces, for example distributed over the front windshield, the rear windshield and the two side window glasses, there are four projection unmanned aerial vehicles distributed around the vehicle.
The projection unmanned aerial vehicle receives the speed V of the vehicle and keeps the same forward speed as the vehicle; at the same time, it measures the lateral distance w between itself and the vehicle and its height h relative to the ground through laser ranging or millimeter-wave radar ranging. Using a general unmanned-aerial-vehicle control method, a single unmanned aerial vehicle can adjust its flying speed, flying height and azimuth angle to keep w, h and V at constant values or within allowed ranges, and hover on one side of the corresponding physical projection surface, that is, on the outer side of the corresponding glass of the vehicle. After the projection unmanned aerial vehicle keeps hovering, it also compares, through its configured camera, the center S of the projection picture it projects with the geometric center S_w of the corresponding physical projection surface (the glass), relative to the plane in which the glass lies. A rectangular coordinate system is established with S_w as the coordinate origin, so that the center S of the projection picture has coordinates (x, y); the projection unmanned aerial vehicle adjusts x by moving up and down to briefly increase or decrease Δh, and adjusts y by briefly increasing or decreasing ΔV; image focusing can be completed by adjusting the focal length of the projection lens or by increasing or decreasing w. Through this dynamic adjustment the projection unmanned aerial vehicle can remain relatively static with respect to the glass of the vehicle, providing a basis for projecting a clear projection picture.
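A minimal sketch of the hover-and-alignment loop just described, assuming hypothetical sensor and actuator interfaces on a drone object (read_lateral_distance, read_height, camera_offset_from_glass_center and the adjust_* commands are invented for this example); the actual control law and signs would depend on the flight controller.

```python
from dataclasses import dataclass

@dataclass
class AlignmentTargets:
    w_ref: float        # target lateral distance w to the glass (m)
    h_ref: float        # target height h above the ground (m)
    tol: float = 0.05   # allowed deviation before a correction is issued (m)

def hover_and_align(drone, vehicle_speed_v: float, targets: AlignmentTargets) -> None:
    """One control step: match the vehicle speed V, hold w and h, and centre the picture."""
    # 1. Keep the same forward speed V as the vehicle.
    drone.set_forward_speed(vehicle_speed_v)

    # 2. Hold the lateral distance w and height h measured by laser / mm-wave ranging.
    w = drone.read_lateral_distance()
    h = drone.read_height()
    if abs(w - targets.w_ref) > targets.tol:
        drone.adjust_lateral(targets.w_ref - w)   # changing w also changes the focus distance
    if abs(h - targets.h_ref) > targets.tol:
        drone.adjust_height(targets.h_ref - h)

    # 3. Centre the picture: the camera reports the offset (x, y) of the picture centre S
    #    relative to the glass centre S_w in the plane of the glass.
    x, y = drone.camera_offset_from_glass_center()
    if abs(x) > targets.tol:
        drone.adjust_height(-x)           # a brief change of h (Δh) corrects x
    if abs(y) > targets.tol:
        drone.adjust_forward_speed(-y)    # a brief change of V (ΔV) corrects y
```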
The vehicle is internally provided with a positioning device and a real navigation map, and the real navigation map is matched with the virtual map. The first processor gives the real coordinates of the vehicle on the real navigation map and, based on the real coordinates, gives the virtual coordinates of the vehicle on the virtual map synchronized in real time.
The virtual scene image corresponding to each virtual coordinate comprises at least four virtual pictures surrounding the virtual coordinate; the vehicle is provided with at least four physical projection surfaces distributed around the vehicle, and the virtual pictures are projected onto the physical projection surfaces in a one-to-one correspondence as the projection pictures.
The first processor distributes the projection pictures uniformly over the front windshield and the four side window glasses, so that a stereoscopic, surrounding visual perception is created and the occupants can fully experience the entertainment effect of virtual-reality movies and games.
For example, the first processor obtains the real coordinates of the vehicle on the real navigation map according to the instruction (the instruction may be an instruction for starting the projection system, and may be a voice instruction or an action instruction) in combination with the positioning device. Based on the real coordinates, the virtual coordinates of the vehicle on the virtual map are calculated synchronously in real time. The first processor invokes the virtual scene image corresponding to the virtual coordinates and uploads it as the projection picture to the projection lens, and the projection lens projects the received projection picture onto the physical projection surfaces of the vehicle (such as the front windshield and the four side window glasses).
It should be noted that, while the vehicle is travelling, the real coordinates change in real time and the corresponding virtual coordinates change accordingly, so that the corresponding projection picture also changes in real time.
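The real-to-virtual synchronisation can be sketched as follows, assuming a simple scale-and-offset match between the real navigation map and the virtual map and an in-memory lookup table standing in for the virtual image model; the mapping parameters, file names and function names are assumptions made for this example.

```python
from typing import Dict, Tuple

Coordinate = Tuple[float, float]

# Hypothetical virtual image model: each (quantised) virtual coordinate maps to a
# virtual scene image stored in the first memory.
virtual_image_model: Dict[Coordinate, str] = {
    (0.0, 0.0): "scene_0000.png",
    (0.0, 1.0): "scene_0001.png",
}

def real_to_virtual(real: Coordinate, scale: float = 1.0,
                    offset: Coordinate = (0.0, 0.0)) -> Coordinate:
    """Map real navigation-map coordinates to the matched virtual-map coordinates."""
    return (real[0] * scale + offset[0], real[1] * scale + offset[1])

def frame_for_vehicle(real: Coordinate) -> str:
    """Look up the virtual scene image to upload to the projection lens."""
    vx, vy = real_to_virtual(real)
    key = (round(vx), round(vy))          # snap to the model's coordinate grid
    return virtual_image_model.get(key, "scene_default.png")

# As the real coordinates change while the vehicle travels, the projected frame changes too.
print(frame_for_vehicle((0.2, 0.9)))      # -> scene_0001.png
```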
It can also be understood as follows: a host vehicle model is placed in the virtual image model and has physical projection surfaces with the same proportions and positions as the vehicle.
The virtual coordinate x of the host vehicle model in the virtual image model corresponds to the position of the vehicle in real coordinates.
The first processor calculates the virtual scene image that the host vehicle model faces at each physical projection surface in the virtual image model. Taking four side window glasses a, b, c and d as an example, the virtual scene image includes four virtual pictures ax, bx, cx and dx, which are synchronized in one-to-one correspondence to four projection unmanned aerial vehicles A, B, C and D. The four projection unmanned aerial vehicles project the four virtual pictures as projection pictures onto the four side window glasses a, b, c and d in one-to-one correspondence.
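As a rough illustration of this one-to-one distribution, the sketch below maps each side window glass a-d to its projection unmanned aerial vehicle A-D and pairs each with its virtual picture; the identifiers and the render placeholder are illustrative only, not part of the patent.

```python
from typing import Dict

def render_virtual_picture(surface: str, virtual_coord: str) -> str:
    """Placeholder for computing the virtual picture facing one projection surface."""
    return f"{surface}{virtual_coord}"     # e.g. side glass 'a' at coordinate 'x' -> 'ax'

# Side window glasses a-d, each served by its own projection drone A-D.
glass_to_drone: Dict[str, str] = {"a": "A", "b": "B", "c": "C", "d": "D"}

def dispatch_pictures(virtual_coord: str) -> Dict[str, str]:
    """Send each surface's virtual picture to the drone hovering outside that glass."""
    return {drone: render_virtual_picture(glass, virtual_coord)
            for glass, drone in glass_to_drone.items()}

print(dispatch_pictures("x"))   # {'A': 'ax', 'B': 'bx', 'C': 'cx', 'D': 'dx'}
```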
The virtual image model comprises a movie image model or a game image model. The vehicle is internally provided with a sound box, the second memory stores preset audio data matched with the movie image model or the game image model, and the first processor starts the projection unmanned aerial vehicle and the sound box according to the instruction to realize synchronous output of the projection picture and the audio data.
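A simple sketch of triggering the projection and the matched audio from the same instruction, using invented start_projection and start_audio stand-ins and example file names; the patent does not prescribe a particular synchronisation mechanism.

```python
import threading
import time

def start_projection(frame: str) -> None:
    print(f"{time.monotonic():.3f}s  projecting {frame}")

def start_audio(track: str) -> None:
    print(f"{time.monotonic():.3f}s  playing {track}")

def on_instruction(frame: str, track: str) -> None:
    """Start the projection picture and the matched audio data at the same moment."""
    threads = [threading.Thread(target=start_projection, args=(frame,)),
               threading.Thread(target=start_audio, args=(track,))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

on_instruction("scene_0001.png", "movie_audio.wav")
```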
Based on the same inventive concept, the invention also provides an unmanned aerial vehicle-based surrounding AR-HUD projection method, which comprises the following steps:
at least one physical projection surface is provided on the vehicle.
The flying speed and direction of the projection unmanned aerial vehicle provided with the projection lens are synchronized with the vehicle.
A preset virtual image model is stored in the first memory, wherein the virtual image model comprises a virtual map and a virtual scene image corresponding to each virtual coordinate in the virtual map.
The first processor gives virtual coordinates of the vehicle on the virtual map according to the instruction, and uploads a virtual scene image corresponding to the virtual coordinates to the projection lens as the projection picture, so that the projection lens projects the received corresponding projection picture to a physical projection surface of the vehicle.
Further, a positioning device and a real navigation map are built into the vehicle, and the real navigation map is matched with the virtual map; the first processor gives the real coordinates of the vehicle on the real navigation map based on the positioning device and the real navigation map, and, based on the real coordinates, gives the virtual coordinates of the vehicle on the virtual map synchronized in real time while the vehicle is travelling.
Those skilled in the art will appreciate that the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, magnetic disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, systems according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In the description of the present invention, it should be noted that the azimuth or positional relationship indicated by the terms "upper", "lower", etc. are based on the azimuth or positional relationship shown in the drawings, and are merely for convenience of describing the present invention and simplifying the description, and are not indicative or implying that the apparatus or element in question must have a specific azimuth, be constructed and operated in a specific azimuth, and thus should not be construed as limiting the present invention. Unless specifically stated or limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
It should be noted that in the present invention, relational terms such as "first" and "second" and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The foregoing is only a specific embodiment of the invention to enable those skilled in the art to understand or practice the invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (8)

1. A surrounding AR-HUD projection system based on an unmanned aerial vehicle, comprising:
a vehicle having a plurality of physical projection surfaces; the plurality of physical projection surfaces includes a front windshield, side window glass, rear windshield and sunroof glass of the vehicle;
the plurality of projection unmanned aerial vehicles are distributed around the vehicle, and the flying speed and the flying direction are synchronous with the vehicle so as to keep the projection unmanned aerial vehicles and the physical projection surface relatively static; the projection unmanned aerial vehicle is provided with a projection lens, and the projection lens is used for projecting the received projection picture to a physical projection surface of the vehicle;
the first memory is used for storing a preset virtual image model, wherein the virtual image model comprises a virtual map and a virtual scene image corresponding to each virtual coordinate in the virtual map;
the first processor is used for giving virtual coordinates of the vehicle on the virtual map according to the instruction, and uploading a virtual scene image corresponding to the virtual coordinates to the projection lens as the projection picture.
2. The unmanned aerial vehicle-based surrounding AR-HUD projection system of claim 1, wherein the projection unmanned aerial vehicle has at least one first sensor for acquiring relative position data of the projection unmanned aerial vehicle and the physical projection surface, the projection unmanned aerial vehicle adjusting its own flight state based on the relative position data.
3. The unmanned aerial vehicle-based surrounding AR-HUD projection system of claim 1, wherein the vehicle incorporates a positioning device and a real navigation map that matches the virtual map, and the first processor gives the real coordinates of the vehicle on the real navigation map and, based on the real coordinates, gives the virtual coordinates of the vehicle on the virtual map synchronized in real time.
4. The unmanned aerial vehicle-based surrounding AR-HUD projection system of claim 1, wherein the virtual scene image corresponding to each of the virtual coordinates includes at least four virtual pictures surrounding the virtual coordinates; the vehicle is provided with at least four physical projection surfaces distributed around the vehicle, and the virtual pictures are projected onto the physical projection surfaces in a one-to-one correspondence as the projection pictures.
5. The unmanned aerial vehicle-based surrounding AR-HUD projection system of claim 1, wherein the center of the projection picture coincides with the center of the physical projection surface or lies within an allowable offset range.
6. The unmanned aerial vehicle-based surrounding AR-HUD projection system of claim 1, wherein the vehicle is a vehicle having L4 or L5 class autopilot functionality.
7. The surrounding AR-HUD projection method based on the unmanned aerial vehicle is characterized by comprising the following steps of:
a plurality of physical projection surfaces are arranged on a vehicle; the plurality of physical projection surfaces includes a front windshield, side window glass, rear windshield and sunroof glass of the vehicle;
synchronizing the flying speed and direction of the projection unmanned aerial vehicle provided with the projection lens with the vehicle so as to keep the projection unmanned aerial vehicle and a physical projection surface relatively static;
storing a preset virtual image model in a first memory, wherein the virtual image model comprises a virtual map and a virtual scene image corresponding to each virtual coordinate in the virtual map;
the first processor gives virtual coordinates of the vehicle on the virtual map according to the instruction, and uploads a virtual scene image corresponding to the virtual coordinates to the projection lens as a projection picture, so that the projection lens projects the received corresponding projection picture to a physical projection surface of the vehicle.
8. The unmanned aerial vehicle-based surrounding AR-HUD projection method of claim 7, further comprising building a positioning device and a real navigation map into the vehicle and matching the real navigation map with the virtual map, wherein the first processor gives the real coordinates of the vehicle on the real navigation map based on the positioning device and the real navigation map, and, based on the real coordinates, gives the virtual coordinates of the vehicle on the virtual map synchronized in real time while the vehicle is travelling.
CN202210908521.1A 2022-07-29 2022-07-29 Surrounding AR-HUD projection system and method based on unmanned aerial vehicle Active CN115297308B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210908521.1A CN115297308B (en) 2022-07-29 2022-07-29 Surrounding AR-HUD projection system and method based on unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210908521.1A CN115297308B (en) 2022-07-29 2022-07-29 Surrounding AR-HUD projection system and method based on unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN115297308A (en) 2022-11-04
CN115297308B (en) 2023-05-26

Family

ID=83825278

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210908521.1A Active CN115297308B (en) 2022-07-29 2022-07-29 Surrounding AR-HUD projection system and method based on unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN115297308B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116974417B (en) * 2023-07-25 2024-03-29 江苏泽景汽车电子股份有限公司 Display control method and device, electronic equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10802665B2 (en) * 2016-10-05 2020-10-13 Motorola Solutions, Inc. System and method for projecting graphical objects
US10777008B2 (en) * 2017-08-31 2020-09-15 Disney Enterprises, Inc. Drones generating various air flow effects around a virtual reality or augmented reality user

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2537548A1 (en) * 2005-03-04 2006-09-04 Jinfang Zhang Multifunction mirror display
CN101308384A (en) * 2008-03-05 2008-11-19 中科院嘉兴中心微系统所分中心 Lake eutrophication monitoring system platform based on wireless sensing network
EP2709069A1 (en) * 2012-09-15 2014-03-19 DSP-Weuffen GmbH Method and apparatus for an imaging driver assistance system with adaptive all-round view display
CN104216511A (en) * 2013-05-30 2014-12-17 湾流航空航天公司 Aircraft instrument cursor control using multi-touch deep sensors
CA2895081A1 (en) * 2014-06-30 2015-08-24 Airbus Helicopters Method and system for setting a rotary wing aircraft to hover flight in directional stability or heading mode based on its forward speed
CA2959632A1 (en) * 2016-03-02 2017-09-02 Goodrich Lighting Systems, Inc. Aircraft mounted display module
CN105791810A (en) * 2016-04-27 2016-07-20 深圳市高巨创新科技开发有限公司 Virtual stereo display method and device
CN108051918A (en) * 2017-12-25 2018-05-18 中国航空工业集团公司洛阳电光设备研究所 A kind of telecontrol equipment of head-up display compound glass
WO2019231477A1 (en) * 2018-05-31 2019-12-05 Gillett Carla R Robot and drone array
CN109618134A (en) * 2018-12-10 2019-04-12 北京智汇云舟科技有限公司 A kind of unmanned plane dynamic video three-dimensional geographic information real time fusion system and method
CN111061421A (en) * 2019-12-19 2020-04-24 北京澜景科技有限公司 Picture projection method and device and computer storage medium
CN113460038A (en) * 2021-06-28 2021-10-01 东风汽车集团股份有限公司 Ramp automatic parking safe vehicle speed control method
CN114693787A (en) * 2022-03-18 2022-07-01 东风汽车集团股份有限公司 Parking garage map building and positioning method and system and vehicle
CN114694117A (en) * 2022-03-30 2022-07-01 东风汽车集团股份有限公司 Parking space identification method based on image

Also Published As

Publication number Publication date
CN115297308A (en) 2022-11-04

Similar Documents

Publication Publication Date Title
CN108657089B (en) Amusement device for automatically driving a motor vehicle
JP7331696B2 (en) Information processing device, information processing method, program, and mobile object
CN104781873B (en) Image display device, method for displaying image, mobile device, image display system
EP3151938B1 (en) Display for immersive window effect
US8888304B2 (en) Optical control techniques
CN105929539B (en) Automobile 3D image collections and bore hole 3D head-up display systems
GB2544884A (en) Utilization of video see-through to make an occupant of a vehicle aware of an imminent event that will cause a sudden decrease in ride smoothness
US20150245017A1 (en) Virtual see-through instrument cluster with live video
JP2017211366A (en) Mobile body system and information display device
WO2020110580A1 (en) Head-up display, vehicle display system, and vehicle display method
CN115297308B (en) Surrounding AR-HUD projection system and method based on unmanned aerial vehicle
CN103995429A (en) Multi-projection system
WO2017197971A1 (en) Automobile or mobile device 3d image acquisition and naked-eye 3d head-up display system and 3d image processing method
WO2022141369A1 (en) Systems and methods for supporting automatic video capture and video editing
CN103998983A (en) Multi-projection system and method comprising direction-changeable audience seats
JP7371629B2 (en) Information processing device, information processing method, program, and vehicle
JP2016064760A (en) Virtual image display device
CN105150935A (en) Head-up displayer, head-up display method and vehicle-mounted display device
CN113064279B (en) Virtual image position adjusting method, device and storage medium of AR-HUD system
WO2024021852A1 (en) Stereoscopic display apparatus, stereoscopic display system, and vehicle
CN108983963A (en) A kind of vehicle virtual reality system method for establishing model and system
CN113038116A (en) Method for constructing aerial refueling simulation training visual system
CN116745725A (en) System and method for determining object position using unmanned aerial vehicle
CN113156643A (en) Vehicle display system based on stereoscopic vision display
CN110191307A (en) Pass through the miniature land Navigating System of the manipulation of wireless remotecontrol vehicle and real-time imaging transmission

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant