CN116959342A - Vehicle-end projection method and device, and vehicle

Info

Publication number
CN116959342A
Authority
CN
China
Prior art keywords
vehicle
projected
projection
pattern
simulation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310925410.6A
Other languages
Chinese (zh)
Inventor
张希
袁怡
孙然
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Mercedes Benz Group AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mercedes Benz Group AG
Priority to CN202310925410.6A
Publication of CN116959342A
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F19/00 Advertising or display means not otherwise provided for
    • G09F19/12 Advertising or display means not otherwise provided for using special optical effects
    • G09F19/18 Advertising or display means not otherwise provided for using special optical effects involving the use of optical projection means, e.g. projection of images on clouds
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/0017 Devices integrating an element dedicated to another function
    • B60Q1/0023 Devices integrating an element dedicated to another function the element being a sensor, e.g. distance sensor, camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2400/00 Special features or arrangements of exterior signal lamps for vehicles
    • B60Q2400/50 Projected symbol or information, e.g. onto the road or car body

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a vehicle-end projection method and device, and a vehicle, belonging to the technical field of vehicles. The vehicle-end projection method may include the following steps: determining a simulated scene in response to a projection trigger, and generating a pattern to be projected corresponding to the simulated scene; determining a projection area according to position information of the vehicle; and controlling the outgoing beam of the vehicle lamp according to the projection area and the pattern to be projected, so as to project a simulated field containing the pattern to be projected onto the projection area, allowing a user to play or exercise on the simulated field projected by the vehicle lamp. The vehicle-end projection method can provide a richer entertainment experience for users through the vehicle.

Description

Vehicle-end projection method and device, and vehicle
Technical Field
The present invention relates to the field of vehicle technology, and in particular to a vehicle-end projection method and device, and a vehicle.
Background
With the growing popularity of self-driving outdoor trips, demand for interactive entertainment in outdoor settings has gradually increased. Although vehicle-mounted entertainment systems such as in-car audio and in-car display screens can meet some of users' outdoor entertainment needs, these entertainment modes are relatively limited, so the entertainment experience users can enjoy through their vehicles remains constrained.
Disclosure of Invention
In view of the above, the present invention provides a vehicle-end projection method and device, and a vehicle, so as to provide a richer entertainment experience for users through the vehicle.
In order to solve the above technical problems, the present invention provides the following technical solutions:
In a first aspect, the present invention provides a vehicle-end projection method, including:
determining a simulated scene in response to a projection trigger, and generating a pattern to be projected corresponding to the simulated scene;
determining a projection area according to position information of the vehicle; and
controlling the outgoing beam of the vehicle lamp according to the projection area and the pattern to be projected, so as to project a simulated field containing the pattern to be projected onto the projection area, allowing a user to play or exercise on the simulated field projected by the vehicle lamp.
Optionally, determining a simulated scene in response to the projection trigger includes:
providing a plurality of selectable simulated scenes to a user through a display device; and
in response to the user selecting any target selectable simulated scene, determining that target selectable simulated scene as the simulated scene;
or,
receiving a field image input by the user; and
generating the simulated scene from the field image.
Optionally, generating the pattern to be projected corresponding to the simulated scene includes:
determining the size of the simulated scene;
if the user selects a single-vehicle projection mode, generating a complete pattern matching the size of the simulated scene; and
if the user selects a multi-vehicle projection mode, determining, for each vehicle involved in the multi-vehicle projection mode, the partial simulated scene corresponding to that vehicle according to its position information, and generating the pattern to be projected for that partial simulated scene, so that the patterns to be projected of the partial simulated scenes corresponding to the plurality of vehicles combine into the complete simulated scene.
Optionally, determining the projection area includes, for the multi-vehicle projection mode:
determining the range of the area enclosed by the plurality of vehicles according to their position information, and determining the center point of that area;
dividing the area into blocks, one matched to each vehicle, according to the center point and the area range; and
judging whether each area block is within the beam coverage of the vehicle lamp; if so, determining that area block as the projection area; otherwise, reminding the user to adjust the vehicle's projection range.
Optionally, projecting the simulated field containing the pattern to be projected onto the projection area includes:
for the multi-vehicle projection mode, splicing the patterns to be projected generated by the plurality of vehicles into a complete simulated field.
Optionally, the vehicle-end projection method further includes, for the multi-vehicle projection mode, any one or more of the vehicles performing the following operations:
detecting the overlap between the pattern projected by an adjacent vehicle on its projection area and the pattern projected by the own vehicle on its own projection area; and
if the overlap indicates a deviation between the pattern projected by the own vehicle and the pattern projected by the adjacent vehicle, adjusting the own vehicle's pattern to be projected, or instructing the adjacent vehicle to adjust its projected pattern.
In a second aspect, an embodiment of the present invention provides a vehicle-end projection device, including a pattern generation unit, an area determination unit and a projection unit, wherein:
the pattern generation unit is configured to determine a simulated scene in response to a projection trigger, and to generate a pattern to be projected corresponding to the simulated scene;
the area determination unit is configured to determine a projection area according to position information of the vehicle; and
the projection unit is configured to control the outgoing beam of the vehicle lamp according to the projection area and the pattern to be projected, so as to project a simulated field containing the pattern to be projected onto the projection area, allowing a user to play or exercise on the simulated field projected by the vehicle lamp.
In a third aspect, an embodiment of the present invention provides a vehicle including the vehicle-end projection device provided in the second aspect.
In a fourth aspect, an embodiment of the present invention provides an electronic device, including:
one or more processors; and
storage means for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the vehicle-end projection method provided in the first aspect.
In a fifth aspect, an embodiment of the present invention provides a computer-readable storage medium having stored thereon a computer program for implementing vehicle-end projection, wherein the computer program, when executed by an on-board processor, implements the vehicle-end projection method provided in the first aspect.
The technical solution of the invention has the following advantages or beneficial effects: a simulated scene is determined and a pattern to be projected is generated from it; a projection area is then determined according to the position information of the vehicle; and the vehicle lamp is then controlled to project a simulated field containing the pattern to be projected onto the projection area, so that a user can play or exercise on the field projected by the lamp. In other words, the vehicle lamp can be controlled to project, for the user, the field and scene required for a game or sport, thereby providing a richer entertainment experience through the vehicle.
Drawings
Fig. 1 is a schematic flow chart of a vehicle-end projection method according to an embodiment of the present invention;
Fig. 2 is a schematic view of a sports field projected by a single vehicle according to an embodiment of the present invention;
Fig. 3A is a schematic view of a sports field projected jointly by multiple vehicles according to an embodiment of the present invention;
Fig. 3B is a schematic view of the pattern to be projected by vehicle Ve1 for the sports field of Fig. 3A according to an embodiment of the present invention;
Fig. 3C is a schematic view of the pattern to be projected by vehicle Ve2 for the sports field of Fig. 3A according to an embodiment of the present invention;
Fig. 3D is a schematic view of the pattern to be projected by vehicle Ve3 for the sports field of Fig. 3A according to an embodiment of the present invention;
Fig. 3E is a schematic view of the pattern to be projected by vehicle Ve4 for the sports field of Fig. 3A according to an embodiment of the present invention;
Fig. 4 is a schematic view of another sports field projected jointly by multiple vehicles according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of the main flow of determining a projection area according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of the main flow of another vehicle-end projection method according to an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a vehicle-end projection device according to an embodiment of the present invention;
Fig. 8 is a schematic structural diagram of a vehicle according to an embodiment of the present invention;
Fig. 9 is an exemplary system architecture diagram to which embodiments of the present invention may be applied;
Fig. 10 is a schematic diagram of a vehicle-mounted computer system suitable for implementing embodiments of the invention.
Detailed Description
Exemplary embodiments of the present invention are described below with reference to the accompanying drawings. Various details of the embodiments are included to facilitate understanding and should be considered merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications may be made to the embodiments described herein without departing from the scope and spirit of the invention. Likewise, descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
It should be noted that the embodiments of the present invention, and the technical features within them, may be combined with each other provided they do not conflict.
In addition, the terms "first," "second," "third," and the like in the embodiments of the present invention are used to distinguish between similar objects and do not necessarily describe a specific number or order. The terms so used are interchangeable under appropriate circumstances and merely distinguish objects of the same nature when describing the embodiments of the invention.
The vehicle according to the embodiment of the invention may be an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as power sources, an electric vehicle having an electric motor as a power source, or the like.
Fig. 1 shows a main flow diagram of a vehicle-end projection method according to an embodiment of the present invention. As shown in fig. 1, the vehicle-end projection method provided by the embodiment of the invention may include the following steps:
step S101: determining a simulation scene in response to the projection trigger, and generating a pattern to be projected corresponding to the simulation scene according to the simulation scene;
the simulation scene refers to information indicating characteristics, types and the like of games or activities required by a user, which are obtained by a controller at a vehicle end, such as an ECU (electronic control Unit) based on characteristic identification information or indication signals and the like, for example, an identification defined for basketball sports is B, an identification defined for badminton sports is C, the ECU determines that the simulation scene is basketball sports when acquiring the identification B provided by projection triggering, and the ECU determines that the simulation scene is badminton sports when acquiring the identification C provided by projection triggering. In addition, the simulated scene may also be other sports or games such as sprint, football, chess, checkers, house-jump games, and the like.
The pattern to be projected may be the complete pattern of a sports or game field, or a partial pattern of such a field.
Step S102: determining a projection area according to the position information of the vehicle;
the projection area is generally the area occupied by the pattern to be projected onto a floor or wall surface or the like.
Step S103: controlling the outgoing beam of the vehicle lamp according to the projection area and the pattern to be projected, so as to project a simulated field containing the pattern to be projected onto the projection area, allowing a user to play or exercise on the simulated field projected by the vehicle lamp.
Controlling the outgoing beam of the vehicle lamp can be implemented with existing lamp control techniques, which are not described here.
In the technical solution of the embodiment shown in fig. 1, a simulated scene is determined, a pattern to be projected is generated from it, a projection area is determined according to the vehicle's position information, and the vehicle lamp is then controlled to project a simulated field containing the pattern to be projected onto that area, so that a user can play or exercise on the projected field. That is, the solution provided by the embodiment of the invention can control the vehicle lamp to project, for the user, the field and scene required for a game or sport, providing a richer entertainment experience through the vehicle.
Two specific embodiments may be used to determine the simulated scene in response to the projection trigger in step S101.
A first embodiment of determining the simulated scene may include: providing a plurality of selectable simulated scenes to the user through a display device; and, in response to the user selecting any target selectable scene, determining that scene as the simulated scene. For example, the ECU presents the selectable scenes on the vehicle-mounted display; when the display screen registers the user's selection of any displayed scene, it supplies the corresponding feature identifier to the ECU, which thereby determines the scene the user selected.
A second embodiment of determining the simulated scene may include: receiving a field image input by the user, and generating the simulated scene from that image. For example, when the user supplies an image of a sports or game field through a terminal device or the vehicle-mounted display device, the ECU recognizes the field's feature information from the image and assembles the recognized features into a simulated scene.
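The disclosure does not fix how the feature information is recognized from the field image; the following Python sketch shows one plausible approach, classical edge and Hough-line detection with OpenCV, for extracting court-line segments from the user's photo. The function name and all thresholds are assumptions.

```python
# Hedged sketch: extracting court-line segments from a user-supplied field
# image with classical edge + Hough line detection. This is one plausible
# recognition approach, not the method fixed by the disclosure.
import cv2
import numpy as np

def field_lines_from_image(path: str) -> list[tuple[int, int, int, int]]:
    """Return detected line segments (x1, y1, x2, y2) from a field photo."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(path)
    edges = cv2.Canny(img, 50, 150)                  # binary edge map
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=40, maxLineGap=10)
    return [] if lines is None else [tuple(seg[0]) for seg in lines]
```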
With these two ways of determining the simulated scene, a user can flexibly choose either one to obtain the desired simulated field. Moreover, the second embodiment allows the set of simulated scenes to be extended, better meeting users' outdoor entertainment or sports needs.
Further, the technical solution provided by the embodiment of the invention can offer users different projection modes, such as a single-vehicle projection mode and a multi-vehicle projection mode, to meet different needs. Accordingly, generating the pattern to be projected corresponding to the simulated scene is implemented differently for different projection scenarios or modes.
When the user selects the single-vehicle projection mode, generating the pattern to be projected corresponding to the simulated scene may include: determining the size of the simulated scene, and generating a complete pattern matching that size. For example, for a sprint scene in the single-vehicle projection mode (i.e. one vehicle projects the track), vehicle Ve1 projects a track with three lanes, as shown in fig. 2. The size of the simulated scene generally means the dimensions of the simulated field to be projected, such as its length and width.
When the user selects the multi-vehicle projection mode (i.e. the patterns projected by several vehicles combine into one complete field), generating the pattern to be projected corresponding to the simulated scene may include: for each vehicle involved in the multi-vehicle projection mode, determining the partial simulated scene corresponding to that vehicle according to its position information, and generating the pattern to be projected for that partial scene, so that the partial patterns of all the vehicles combine into the complete simulated scene. The position information of the vehicles generally means their relative positions. For example, the multi-vehicle projection mode can project the badminton court shown in fig. 3A or the football field shown in fig. 4. The badminton court of fig. 3A is projected jointly by vehicles Ve1, Ve2, Ve3 and Ve4: Ve1 generates the partial pattern P1 shown in fig. 3B, Ve2 the partial pattern P2 shown in fig. 3C, Ve3 the partial pattern P3 shown in fig. 3D, and Ve4 the partial pattern P4 shown in fig. 3E; projected on the ground, P1 to P4 splice together into the badminton court of fig. 3A. Similarly, the football field of fig. 4 is projected jointly by vehicles Ve1 to Ve6, which generate the patterns of areas P10, P8, P7, P5, P9 and P6 of fig. 4 respectively, so that the complete field is obtained directly on the ground. Splicing one complete activity or sports field from the patterns projected by multiple vehicles can satisfy users' need for larger fields in the countryside and provide a better game or sports experience.
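To make the partial-pattern idea concrete, the following Python sketch splits one complete field into equal strips, one per vehicle. The equal axis-aligned split is an illustrative assumption; the disclosure permits any partition whose pieces combine into the complete field, as figs. 3B to 3E show.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned rectangle on the ground plane, in metres."""
    x: float       # lower-left corner
    y: float
    width: float
    height: float

def split_field_among_vehicles(field: Rect, n_vehicles: int) -> list[Rect]:
    """Divide the complete simulated field into equal vertical strips,
    one partial pattern per vehicle (illustrative equal partition)."""
    strip_w = field.width / n_vehicles
    return [Rect(field.x + i * strip_w, field.y, strip_w, field.height)
            for i in range(n_vehicles)]

# Example: a 13.4 m x 6.1 m badminton court shared by four vehicles, as in
# figs. 3A-3E (dimensions of the standard court, not taken from the patent).
court = Rect(0.0, 0.0, 13.4, 6.1)
partial_patterns = split_field_among_vehicles(court, 4)
```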
It should be noted that figs. 3B to 3E and fig. 4 only show examples of the patterns each vehicle might project; the patterns each vehicle projects, their sizes, etc. can be adjusted as required. For example, the lines of patterns projected by adjacent vehicles may partially overlap, and the lengths of projected lines may be increased or decreased. Any scheme in which the patterns projected by multiple vehicles can be combined into one complete sports field falls within the scope of protection of this application.
Further, as shown in fig. 5, for the multi-vehicle projection mode, one embodiment of determining the projection area may include the following steps:
step S501: determining the range of the area surrounded by the vehicles according to the position information of the vehicles, and determining the central point of the range of the area;
as described above, the positional information of the plurality of vehicles refers to relative positional information between the plurality of vehicles, for example, a coordinate system is drawn with one vehicle as a coordinate origin, and coordinates of other vehicles in the coordinate system are plotted. For example, in the case of fig. 3A where four vehicles are required to project a badminton court, the positions of Ve2, ve3, and Ve4 relative to Ve1 are adaptively adjusted according to the size of the badminton court and the position of Ve1 where the vehicles are located as the origin of coordinates. The positions of Ve1, ve2, ve3 and Ve4 can be determined by the position of the vehicle lamp or the vehicle head.
Step S502: dividing the area into blocks, one matched to each vehicle, according to the center point and the area range;
for example, for the scenario shown in fig. 3A, the area encompassed by four vehicles may be equally divided into four area blocks.
Step S503: judging whether the area block is within the beam coverage of the vehicle lamp; if so, executing step S504; otherwise, executing step S505;
This judgment verifies that the patterns projected by the individual vehicles can actually be spliced together: if any vehicle's lamp beam cannot cover its corresponding area block, the projected sports field will have gaps, degrading the user experience.
Step S504: determining the area block as a projection area, and ending the current flow;
step S505: the user is reminded to adjust the projection range of the vehicle.
Adjusting the projection range may mean, when one vehicle's projection range is smaller than its matched area block, increasing the projection range of the adjacent vehicles so as to ensure that the complete sports-field pattern is still spliced.
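The following Python sketch condenses steps S501 to S505: the center of the region enclosed by the vehicles is taken as the mean of their relative positions, each vehicle is matched to the block between itself and the center, and a block is accepted as that vehicle's projection area only if it lies within the lamp's reach. Modeling each lamp's coverage as a single radius, and representing a block by two corner points, are simplifying assumptions.

```python
import math

def determine_projection_areas(vehicle_xy: list[tuple[float, float]],
                               beam_reach: list[float]):
    """Steps S501-S505 (sketch): center point, per-vehicle blocks, coverage check.
    vehicle_xy: relative vehicle positions; beam_reach: assumed maximum beam
    radius of each vehicle's lamp, in the same units."""
    n = len(vehicle_xy)
    cx = sum(x for x, _ in vehicle_xy) / n        # S501: center of the region
    cy = sum(y for _, y in vehicle_xy) / n
    areas = []
    for (x, y), reach in zip(vehicle_xy, beam_reach):
        # S502: the block matched to this vehicle spans from the vehicle to
        # the center point (represented here by its two corners).
        block = ((x, y), (cx, cy))
        # S503: the far edge of the block must lie within the beam coverage.
        if math.hypot(cx - x, cy - y) <= reach:
            areas.append(block)                   # S504: accept as projection area
        else:                                     # S505: remind the user
            raise RuntimeError(f"vehicle at ({x}, {y}) cannot cover its block; "
                               "adjust the vehicle's projection range")
    return areas
```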
Further, for the multi-vehicle projection mode, projecting the simulated field containing the pattern to be projected onto the projection area may include splicing the patterns to be projected generated by the plurality of vehicles into one complete simulated field, i.e. the patterns of figs. 3B to 3E are spliced on the ground into the badminton court of fig. 3A.
Further, in the multi-vehicle projection mode, any one or more of the vehicles perform the following operations:
detecting the overlap between the pattern projected by an adjacent vehicle on its projection area and the pattern projected by the own vehicle on its own projection area. The detection can be performed with the vehicle-mounted camera; alternatively, the vehicles can exchange the sizes of their projected patterns via inter-vehicle communication and judge the overlap from the pattern sizes and the vehicles' position information. For figs. 3B to 3E, for example, the overlaps between pattern P1 of fig. 3B and pattern P2 of fig. 3C, between P2 of fig. 3C and P3 of fig. 3D, and between P3 of fig. 3D and P4 of fig. 3E are detected respectively.
If the overlap indicates a deviation between the own vehicle's pattern and the adjacent vehicle's pattern, the own vehicle adjusts its pattern to be projected, or instructs the adjacent vehicle to adjust its projected pattern. The detected deviation quantifies how far apart the adjacent projected patterns are, so the vehicle's ECU can adjust the lamp beam accordingly and thereby correct parameters such as the direction and size of the pattern. This process ensures that the multiple vehicles project one complete activity or sports field, improving the user experience.
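The seam check and correction can be pictured with the short Python sketch below: a vehicle compares a reference point on its own projected boundary with the matching point observed on the neighbor's pattern (for example via the on-board camera) and derives the offset to apply to its beam. The single-point comparison and the 5 cm tolerance are illustrative assumptions.

```python
def seam_deviation(own_point: tuple[float, float],
                   neighbour_point: tuple[float, float],
                   tolerance: float = 0.05):
    """Compare matching points on the shared seam of two projected patterns.
    Returns None if they coincide within tolerance (metres); otherwise the
    (dx, dy) correction the own vehicle should apply to its projection."""
    dx = neighbour_point[0] - own_point[0]
    dy = neighbour_point[1] - own_point[1]
    if abs(dx) <= tolerance and abs(dy) <= tolerance:
        return None          # patterns line up; no adjustment needed
    return (dx, dy)          # offset for the own vehicle's lamp beam

# Example: the neighbour's line sits 12.5 cm to the right of our own.
correction = seam_deviation((1.0, 0.0), (1.125, 0.0))  # -> (0.125, 0.0)
```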
Taking the scene shown in fig. 4 as an example, the implementation of vehicle-end projection is described in detail below. As shown in fig. 6, it may include the following steps:
Step S601: controlling each of the vehicles Ve1 to Ve6 to enter the multi-vehicle projection mode, wherein any one of Ve1 to Ve6 provides a plurality of selectable simulated scenes to the user through a display device;
Step S602: determining the size of the simulated scene in response to the user selecting the football scene;
Step S603: prompting the user to adjust the positions of the other vehicles according to the size of the simulated scene and the position of any one vehicle;
Step S604: determining the range of the area enclosed by the vehicles according to their position information, and determining the center point of that area;
Step S605: dividing the area into blocks, one matched to each vehicle, according to the center point and the area range;
Each of the vehicles Ve1 to Ve6 then performs steps S606 to S610 described below:
Step S606: determining the partial simulated scene corresponding to the vehicle according to the vehicle's position information, and generating the pattern to be projected corresponding to that partial simulated scene;
For example, for the football field of fig. 4, through this step vehicle Ve1 determines the partial simulated scene and pattern to be projected corresponding to P10, Ve2 those corresponding to P8, Ve3 to P7, Ve4 to P5, Ve5 to P9, and Ve6 to P6.
It should be noted that there is no strict order between steps S605 and S606.
Step S607: judging whether the area block is within the beam coverage of the vehicle lamp; if so, executing step S608; otherwise, executing step S609;
Step S608: determining the area block as the projection area, and executing step S610;
Step S609: reminding the user to adjust the vehicle's projection range;
Step S610: controlling the outgoing beam of the vehicle lamp according to the projection area and the pattern to be projected, so as to project a simulated field containing the pattern to be projected onto the projection area, allowing the user to play or exercise on the simulated field projected by the vehicle lamp.
That is, according to the determined projection areas and the relevant parameters of the patterns to be projected, such as the size and width of each line of the pattern, the vehicles Ve1 to Ve6 shown in fig. 4 project the football field of fig. 4 onto the ground.
Fig. 7 is a schematic structural diagram of a vehicle-end projection device according to an embodiment of the present invention. As shown in fig. 7, the vehicle-end projection device 700 may include a pattern generation unit 701, an area determination unit 702 and a projection unit 703, wherein:
the pattern generation unit 701 is configured to determine a simulated scene in response to a projection trigger, and to generate a pattern to be projected corresponding to the simulated scene;
the area determination unit 702 is configured to determine a projection area according to position information of the vehicle; and
the projection unit 703 is configured to control the outgoing beam of the vehicle lamp according to the projection area and the pattern to be projected, so as to project a simulated field containing the pattern to be projected onto the projection area, allowing the user to play or exercise on the simulated field projected by the vehicle lamp.
In an embodiment of the present invention, the pattern generation unit 701 is configured to provide a plurality of selectable simulated scenes to a user through a display device, and, in response to the user selecting any target selectable scene, to determine that scene as the simulated scene.
In an embodiment of the present invention, the pattern generation unit 701 is further configured to receive a field image input by the user and to generate the simulated scene from the field image.
In an embodiment of the present invention, the pattern generation unit 701 is further configured to determine the size of the simulated scene; if the user selects the single-vehicle projection mode, to generate a complete pattern matching the size of the simulated scene; and if the user selects the multi-vehicle projection mode, to determine, for each vehicle involved, the partial simulated scene corresponding to that vehicle according to its position information and to generate the pattern to be projected for that partial scene, so that the partial patterns of the plurality of vehicles combine into the complete simulated scene.
In an embodiment of the present invention, the area determination unit 702 is further configured, for the multi-vehicle projection mode, to determine the range of the area enclosed by the plurality of vehicles according to their position information together with the center point of that area; to divide the area into blocks, one matched to each vehicle, according to the center point and the area range; and to judge whether each area block is within the beam coverage of the vehicle lamp, determining the block as the projection area if so, and reminding the user to adjust the vehicle's projection range otherwise.
In an embodiment of the present invention, the projection unit 703 is further configured, for the multi-vehicle projection mode, to splice the patterns to be projected generated by the plurality of vehicles into a complete simulated field.
In an embodiment of the present invention, the projection unit 703 is further configured, for the multi-vehicle projection mode, to detect the overlap between the pattern projected by an adjacent vehicle on its projection area and the pattern projected by the own vehicle on its own projection area, and, if the overlap indicates a deviation between the two patterns, to adjust the own vehicle's pattern to be projected or instruct the adjacent vehicle to adjust its projected pattern.
Fig. 8 is a schematic diagram of the main structure of a vehicle according to an embodiment of the present invention. As shown in fig. 8, the vehicle 800 may include the vehicle-end projection device 700 shown in fig. 7.
Fig. 9 illustrates an exemplary vehicle system architecture 900 to which the vehicle-end projection method or device of an embodiment of the present invention may be applied.
As shown in fig. 9, the vehicle system architecture 900 may include various systems, such as a drive control system 901, a power system 902, a sensor system 903, a control system 904, one or more peripheral devices 905, a power supply 906, a computer system 907, and a user interface 908. Alternatively, the vehicle system architecture 900 may include more or fewer systems, and each system may include multiple elements. In addition, each of the systems and elements of the vehicle system architecture 900 may be interconnected by wires or wirelessly.
The drive control system 901 may be used to regulate components related to vehicle travel.
The powertrain 902 may include components that provide powered movement of the vehicle. For example, the powertrain 902 may include an engine, an energy source, a transmission, wheels, tires, and the like. The engine may be an internal combustion engine, an electric motor, an air-compression engine, or a combination of engine types, such as a hybrid of a gasoline engine and an electric motor, or a hybrid of an internal combustion engine and an air-compression engine. The engine converts the energy source into mechanical energy to drive the transmission. Examples of energy sources include gasoline, diesel, other petroleum-based fuels, propane, other compressed-gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity. The energy source may also provide energy to other systems of the vehicle. The transmission may include a gearbox, differential, drive shaft, clutch, and the like.
The sensor system 903 may include sensors that sense the environment around the vehicle, pressure sensors that sense whether a passenger is present on a seat, and the like: for example, a positioning system (which may be the global positioning system (GPS), the BeiDou system, or another positioning system), a radar, a laser rangefinder, an inertial measurement unit (IMU), and a camera. The positioning system may be used to determine the geographic location of the vehicle. The IMU senses changes in the vehicle's position and orientation based on inertial acceleration; in one embodiment, it may be a combination of an accelerometer and a gyroscope. The radar may use radio signals to sense objects in the vehicle's surroundings, and in some embodiments may also sense an object's speed and/or heading.
In order to detect environmental information, objects, and the like located in front of, behind, or beside the vehicle, a radar, a camera, and the like may be disposed at an appropriate position outside the vehicle. For example, in order to acquire an image in front of the vehicle, a camera may be disposed in the vehicle interior so as to be close to the front windshield. Alternatively, the camera may be disposed around the front bumper or radiator grille. For example, in order to acquire an image of the rear of the vehicle, a camera may be disposed in the vehicle interior in proximity to the rear window. Alternatively, the camera may be disposed around the rear bumper, trunk or tailgate. In order to acquire an image of the side of the vehicle, the camera may be disposed in the vehicle interior so as to be close to at least one of the side windows. Alternatively, the camera may be disposed on a side mirror, a fender, or the periphery of a door, or the like.
The laser rangefinder may utilize a laser to sense objects in the environment in which the vehicle is located.
The camera may be used to capture multiple images of the surrounding environment of the vehicle. The camera may be a still or video camera.
The control system 904 may include software systems required to implement driving, such as controlling the beam of the vehicle lights, etc.
The control system 904 interacts with external sensors, other autonomous driving systems, other computer systems, or users through the peripheral devices 905. The peripheral devices 905 may include a wireless communication system, a car computer, a microphone, and/or a speaker.
In some embodiments, the peripheral devices 905 provide a means for a user to interact with the user interface of the control system 904. For example, the car computer may present information to a user of the vehicle, and the user interface may also receive user input via the car computer, which can be operated through a touch screen. The peripheral devices may also provide a means for communicating with other devices located within the vehicle: for example, a microphone may receive audio (e.g., voice commands or other audio input) from a user of the control system 904, and a speaker may output audio to the user.
The wireless communication system may communicate wirelessly with one or more devices, either directly or via a communication network. For example, it may use a cellular network or WiFi to communicate with a wireless local area network (WLAN), may communicate directly with devices using an infrared link, Bluetooth, or ZigBee, and may use other wireless protocols, such as various vehicle communication systems.
The power source 906 may provide power to various components of the vehicle. The power source 906 may be a rechargeable lithium ion or lead acid battery.
Some or all of the functions implementing vehicle-end projection are controlled by the computer system 907. The computer system 907 may include at least one processor that executes instructions stored in a non-transitory computer-readable medium, such as a memory. The computer system 907 provides the drive control system described above with the executable code that implements vehicle-end projection.
The processor may be any conventional processor, such as a commercially available central processing unit (CPU). Alternatively, the processor may be a dedicated device such as an application-specific integrated circuit (ASIC) or another hardware-based processor. Those of ordinary skill in the art will appreciate that the processor, computer, or memory may in fact comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing; for example, the memory may be a hard disk drive or another storage medium located in a different housing from the computer. References to a processor or computer are therefore understood to include references to a collection of processors, computers, or memories that may or may not operate in parallel. Rather than a single processor performing every step described herein, some components, such as the steering component and the deceleration component, may each have their own processor that performs only calculations related to that component's function.
The user interface 908 provides information to, or receives information from, a user of the vehicle. Optionally, the user interface 908 may include one or more input/output devices within the set of peripheral devices 905, such as a wireless communication system, a car computer, a microphone, and a speaker.
It should be understood that the above components are merely examples, and in practical applications, components in the above modules or systems may be added or deleted according to actual needs, and fig. 9 should not be construed as limiting the embodiments of the present application.
Referring now to FIG. 10, there is illustrated a schematic diagram of a computer system 1000 suitable for use in implementing embodiments of the present application. The computer system illustrated in fig. 10 is merely an example, and should not be construed as limiting the functionality and scope of use of embodiments of the present application.
As shown in fig. 10, the computer system 1000 includes a Central Processing Unit (CPU) 1001, which can execute various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 1002 or a program loaded from a storage section 1008 into a Random Access Memory (RAM) 1003. In the RAM 1003, various programs and data required for the operation of the system 1000 are also stored. The CPU 1001, ROM 1002, and RAM 1003 are connected to each other by a bus 1004. An input/output (I/O) interface 1005 is also connected to bus 1004.
The following components are connected to the I/O interface 1005: an input portion 1006; an output portion 1007 including a cathode-ray tube (CRT) or liquid-crystal display (LCD), a speaker, and the like; a storage portion 1008 including a hard disk and the like; and a communication section 1009 including a network interface card such as a LAN card or a modem. The communication section 1009 performs communication processing via a network such as the Internet. A drive 1010 is also connected to the I/O interface 1005 as needed. A removable medium 1011, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 1010 as needed, so that a computer program read from it is installed into the storage portion 1008 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 1009, and/or installed from the removable medium 1011. The above-described functions defined in the system of the present invention are performed when the computer program is executed by a Central Processing Unit (CPU) 1001.
The computer readable medium shown in the present invention may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present invention may be implemented in software or in hardware. The described units may also be provided in a processor, for example described as: a processor including a pattern generation unit, an area determination unit, and a projection unit. The names of these units do not constitute a limitation on the units themselves; for example, the pattern generation unit may also be described as "a unit that generates a pattern to be projected corresponding to a simulated scene".
As another aspect, the present invention also provides a computer-readable medium that may be contained in the apparatus described in the above embodiments; or may be present alone without being fitted into the device. The computer readable medium carries one or more programs which, when executed by a device, cause the device to include: determining a simulation scene in response to the projection trigger, and generating a pattern to be projected corresponding to the simulation scene according to the simulation scene; determining a projection area according to the position information of the vehicle; and controlling the emergent light beam of the car lamp of the car according to the projection area and the pattern to be projected so as to project a simulation field containing the pattern to be projected on the projection area, so that a user plays or moves on the simulation field projected by the car lamp.
According to the technical solution provided by the embodiment of the invention, a simulated scene is determined, a pattern to be projected is generated from it, a projection area is determined according to the vehicle's position information, and the vehicle lamp is then controlled to project a simulated field containing the pattern to be projected onto the projection area, so that a user can play or exercise on the projected field. That is, under the control of the solution provided by the embodiment of the invention, the vehicle lamp can project the field and scene required for the user's game or sport, providing a richer entertainment experience through the vehicle.
The above description is intended only to aid understanding of the structure and core concept of the present invention and does not limit its scope. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations, and substitutions may be made depending on design requirements and other factors. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included within its scope of protection.

Claims (10)

1. A vehicle-end projection method, comprising:
determining a simulated scene in response to a projection trigger, and generating a pattern to be projected corresponding to the simulated scene;
determining a projection area according to position information of the vehicle; and
controlling the outgoing beam of the vehicle lamp according to the projection area and the pattern to be projected, so as to project a simulated field containing the pattern to be projected onto the projection area, allowing a user to play or exercise on the simulated field projected by the vehicle lamp.
2. The vehicle-end projection method of claim 1, wherein determining a simulated scene in response to a projection trigger comprises:
providing a plurality of selectable simulated scenes to a user through a display device; and
in response to the user selecting any target selectable simulated scene, determining that target selectable simulated scene as the simulated scene;
or,
receiving a field image input by the user; and
generating the simulated scene from the field image.
3. The vehicle-end projection method of claim 1, wherein generating a pattern to be projected corresponding to the simulated scene comprises:
determining the size of the simulated scene;
if the user selects a single-vehicle projection mode, generating a complete pattern matching the size of the simulated scene; and
if the user selects a multi-vehicle projection mode, determining, for each vehicle involved in the multi-vehicle projection mode, the partial simulated scene corresponding to that vehicle according to its position information, and generating the pattern to be projected for that partial simulated scene, so that the patterns to be projected of the partial simulated scenes corresponding to the plurality of vehicles combine into the complete simulated scene.
4. The vehicle-end projection method of claim 3, wherein determining the projection area comprises, for the multi-vehicle projection mode:
determining the range of the area enclosed by the plurality of vehicles according to their position information, and determining the center point of that area;
dividing the area into blocks, one matched to each vehicle, according to the center point and the area range; and
judging whether each area block is within the beam coverage of the vehicle lamp; if so, determining that area block as the projection area; otherwise, reminding the user to adjust the vehicle's projection range.
5. The vehicle-end projection method of claim 3, wherein projecting a simulated field containing the pattern to be projected onto the projection area comprises:
for the multi-vehicle projection mode, splicing the patterns to be projected generated by the plurality of vehicles into a complete simulated field.
6. The vehicle-end projection method of claim 1, further comprising, for the multi-vehicle projection mode, any one or more of the vehicles performing the following operations:
detecting the overlap between the pattern projected by an adjacent vehicle on its projection area and the pattern projected by the own vehicle on its own projection area; and
if the overlap indicates a deviation between the pattern projected by the own vehicle and the pattern projected by the adjacent vehicle, adjusting the own vehicle's pattern to be projected, or instructing the adjacent vehicle to adjust its projected pattern.
7. A vehicle-end projection device, comprising a pattern generation unit, an area determination unit and a projection unit, wherein:
the pattern generation unit is configured to determine a simulated scene in response to a projection trigger, and to generate a pattern to be projected corresponding to the simulated scene;
the area determination unit is configured to determine a projection area according to position information of the vehicle; and
the projection unit is configured to control the outgoing beam of the vehicle lamp according to the projection area and the pattern to be projected, so as to project a simulated field containing the pattern to be projected onto the projection area, allowing a user to play or exercise on the simulated field projected by the vehicle lamp.
8. A vehicle, characterized by comprising: the vehicle-end projection device of claim 7.
9. An electronic device, comprising:
one or more processors; and
storage means for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-6.
10. A computer-readable storage medium having stored thereon a computer program for implementing vehicle-end projection, wherein the computer program, when executed by an on-board processor, implements the method of any one of claims 1-6.
CN202310925410.6A 2023-07-25 2023-07-25 Vehicle end projection method and device and vehicle Pending CN116959342A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310925410.6A CN116959342A (en) 2023-07-25 2023-07-25 Vehicle end projection method and device and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310925410.6A CN116959342A (en) 2023-07-25 2023-07-25 Vehicle end projection method and device and vehicle

Publications (1)

Publication Number Publication Date
CN116959342A true CN116959342A (en) 2023-10-27

Family

ID=88450898

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310925410.6A Pending CN116959342A (en) 2023-07-25 2023-07-25 Vehicle end projection method and device and vehicle

Country Status (1)

Country Link
CN (1) CN116959342A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117818459A (en) * 2024-01-02 2024-04-05 深圳市欧冶半导体有限公司 Game interaction method and device of intelligent car lamp, computer equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication