CN115412709A - Projection method, projection device, vehicle and storage medium - Google Patents

Projection method, projection device, vehicle and storage medium

Publication number: CN115412709A
Granted publication: CN115412709B
Application number: CN202210886902.4A
Authority: CN (China)
Prior art keywords: person, projection, determining, vehicle, target
Legal status: Granted; Active
Inventors: 黄勇斌, 周清清, 刘关林
Original and Current Assignee: Guangzhou Automobile Group Co Ltd
Other languages: Chinese (zh)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179: Video signal processing therefor
    • H04N 9/3141: Constructional details thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiments of the application provide a projection method, a projection device, a vehicle and a storage medium, relating to the technical field of projection. When the motion trajectory of a first person is detected to satisfy a preset motion trajectory, the biological information of the first person is acquired, together with the biological information and motion trajectories of second persons; a third person is determined from among the second persons according to their motion trajectories; the first person and the third person are taken as the target persons, and a projection scheme is determined according to the biological information and motion trajectories of the target persons; projection is then performed according to the projection scheme. In this way, different projection schemes can be determined for different persons, improving the flexibility and diversity of the projection schemes.

Description

Projection method, projection device, vehicle and storage medium
Technical Field
The embodiment of the application relates to the technical field of projection, in particular to a projection method, a projection device, a vehicle and a storage medium.
Background
Projection systems, also known as ground illumination systems, greeting systems, or chassis light systems, create a welcoming atmosphere by projecting specific patterns onto the ground, thereby enhancing the greeting effect. For example, a vehicle projection lamp may project a particular pattern onto the ground outside the vehicle. However, currently adopted projection schemes project a single static pattern onto the ground, which results in a poor user experience.
Disclosure of Invention
The embodiment of the application provides a projection method, a projection device, a vehicle and a storage medium, so as to improve the problems.
In a first aspect, an embodiment of the present application provides a projection method. The method comprises the following steps: if the motion trajectory of a first person is detected to satisfy a preset motion trajectory, acquiring biological information of the first person, and acquiring biological information and motion trajectories of second persons, wherein a second person is a person, other than the first person, within a preset range of the first person, and the first person is a key holder; determining a third person from among the second persons according to their motion trajectories, wherein the third person is a companion of the first person; taking the first person and the third person as the target persons, and determining a projection scheme according to the biological information and motion trajectories of the target persons; and performing projection according to the projection scheme.
In a second aspect, embodiments of the present application provide a projection apparatus. The apparatus includes:
an acquisition module, configured to acquire biological information of a first person and to acquire biological information and motion trajectories of second persons if the motion trajectory of the first person is detected to satisfy a preset motion trajectory, wherein a second person is a person, other than the first person, within a preset range of the first person, and the first person is a key holder; a determining module, configured to determine a third person from among the second persons according to their motion trajectories, wherein the third person is a companion of the first person; a decision module, configured to take the first person and the third person as the target persons and to determine a projection scheme according to the biological information and motion trajectories of the target persons; and a projection module, configured to perform projection according to the projection scheme.
In a third aspect, embodiments of the present application provide a vehicle. The vehicle comprises a signal receiver, a camera module, a projection module and a controller. The controller comprises a memory, one or more processors, and one or more application programs, wherein the one or more application programs are stored in the memory and are configured to cause the one or more processors to execute the method provided by the embodiment of the application when called by the one or more processors.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium. The computer readable storage medium has stored therein program code configured to, when invoked by a processor, cause the processor to perform the methods provided by embodiments of the present application.
The embodiments of the application provide a projection method, a projection device, a vehicle and a storage medium, in which a projection scheme is determined according to the biological information and motion trajectories of a key holder and the key holder's companions. Different projection schemes can thus be determined for different persons, improving the flexibility and diversity of the projection schemes.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an application scenario of a projection method provided in an exemplary embodiment of the present application;
fig. 2 is a schematic flowchart of a projection method according to an embodiment of the present application;
FIG. 3 is a schematic illustration of preset ranges provided by an exemplary embodiment of the present application;
FIG. 4 is a schematic flowchart of a projection method according to another embodiment of the present application;
FIG. 5 is a schematic illustration of a trajectory of movement of a person around a vehicle provided by an exemplary embodiment of the present application;
FIG. 6 is a schematic view of a projection region provided by an exemplary embodiment of the present application;
FIG. 7 is a flowchart illustrating a step 390 in a projection method according to another embodiment of the present application;
FIG. 8 is a schematic illustration of a vehicle projection provided by an exemplary embodiment of the present application;
FIG. 9 is a schematic flow chart diagram of a projection method provided in an exemplary embodiment of the present application;
fig. 10 is a block diagram of a projection apparatus according to an embodiment of the present disclosure;
FIG. 11 is a block diagram of a vehicle according to an embodiment of the present application;
fig. 12 is a block diagram of a computer-readable storage medium according to an embodiment of the present disclosure.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
Referring to fig. 1, fig. 1 is a schematic view of an application scenario of a projection method according to an exemplary embodiment of the present application. The projection system 100 includes a vehicle 110 and a key 120. When the distance between the vehicle 110 and the key 120 is less than or equal to the preset distance, the vehicle 110 and the key 120 may communicate to achieve data interaction. The preset distance refers to a distance at which the vehicle 110 can establish a communication connection with the key 120.
In some embodiments, as indicated by the dashed arrow in fig. 1, vehicle 110 may include a signal receiver 111, a camera module 112, a projection module 113, and a controller 114. The controller 114 is respectively connected to the signal receiver 111, the camera module 112 and the projection module 113 in a communication manner.
The signal receiver 111 is used for receiving the signal transmitted by the key 120. The signal receiver 111 may be a single signal receiver, or may include a plurality of signal receivers, and is not limited herein. In some embodiments, the signal receiver 111 is an Ultra Wide Band (UWB) signal receiver. In other embodiments, signal receiver 111 is a bluetooth signal receiver. In still other embodiments, the signal receiver 111 is a remote control signal receiver.
The camera module 112 is used for identifying the biological information and the motion track of the user. The camera module 112 may be a single camera, and may also include a plurality of cameras, which is not limited herein.
The projection module 113 is used for projecting according to the projection scheme decided by the controller. The projection module 113 may include an illumination control unit, a projection unit, module elements, and the like. The illumination control unit is connected to the controller 114, the projection unit, and the module elements, respectively, and is used for driving the projection unit and the module elements to project onto the ground according to the instructions of the controller 114. The content, distance, and direction of the projection onto the ground are determined by the controller 114. For example, the projected content may be a particular pattern, text, or video.
The controller 114 is used for deciding the projection scheme according to the information uploaded by the signal receiver 111 and the camera module 112. In some embodiments, the controller 114 is an Electronic Control Unit (ECU). In other embodiments, the controller 114 is a controller integrating multiple ECUs. In still other embodiments, the controller 114 is a controller integrating multiple Micro Control Units (MCUs).
Key 120 refers to a smart key that can be used to start vehicle 110. In some embodiments, the key 120 is a UWB key. In other embodiments, the key 120 is a Bluetooth key. In still other embodiments, the key 120 is a remote-control key fob. In still other embodiments, the key 120 may also be a mobile terminal that can be used to start the vehicle 110, such as a smartphone, a smartwatch, or a tablet. The preset distance may be determined according to the key 120. As an example, the preset distance may be the farthest distance over which the signal transmitted by the key 120 can travel. As another example, the preset distance may be the distance at which the strength of the signal transmitted by the key 120 falls to a preset signal strength. The preset signal strength is a strength at which the signal content can be stably received, and may be determined in practical applications according to the type of the key 120.
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating a projection method according to an embodiment of the present disclosure. The projection method is applied to the controller 114 in the vehicle 110 shown in fig. 1, or the apparatus 400 shown in fig. 10, which will be mentioned below, or the controller 540 in the vehicle 500 shown in fig. 11, which will be mentioned below. The projection method may include the following steps 210 to 240.
Step 210, if it is detected that the movement track of the first person meets the preset movement track, acquiring the biological information of the first person, and acquiring the biological information and the movement track of the second person, wherein the second person is a person other than the first person within the preset range of the first person, and the first person is a key holder.
The key of the first person is matched with the vehicle, i.e. the key is registered or activated in the vehicle's system and can start the matched vehicle; the vehicle determines whether a key is matched with it by identifying the key's Identity (ID). The same vehicle may correspond to one key or to a plurality of keys, which is not limited herein. During the movement of the first person relative to the vehicle, the relative distance between the first person and the vehicle is less than or equal to the preset distance described above.
The preset motion trajectory refers to a trigger condition for starting the projection operation. It may be preset according to practical experience and stored in the memory. In some embodiments, the preset motion trajectory comprises: the first person approaching the vehicle from far to near; the first person staying in the process of approaching the vehicle from far to near; and the first person moving away from the vehicle after staying in the process of approaching it from far to near, wherein the stay duration of the first person is less than or equal to a preset duration. The preset duration may be set according to actual requirements and is not specifically limited herein; for example, it may be 3 seconds.
As an example, "the first person approaches the vehicle from far to near" may include the following situations: 1) the first person approaches the vehicle from far to near from either side of the vehicle; 2) the first person approaches the vehicle from far to near from the front of the vehicle; and 3) the first person approaches the vehicle from far to near from behind the vehicle.
As an example, "the first person stays in the process of approaching the vehicle from far to near, with a stay duration less than or equal to the preset duration" may include the following situations: 1) the first person stays while approaching the vehicle from either side; 2) the first person stays while approaching the vehicle from the front; and 3) the first person stays while approaching the vehicle from the rear. In each case the stay duration is less than or equal to the preset duration.
As an example, "the first person moves away from the vehicle after staying in the process of approaching it from far to near" may include the following situations: 1) the first person approaches the vehicle from either side, stays, and then moves away; 2) the first person approaches the vehicle from the front, stays, and then moves away; and 3) the first person approaches the vehicle from the rear, stays, and then moves away. In each case the stay duration is less than or equal to the preset duration.
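The preset-trajectory conditions above (approach; approach with a bounded stay; approach, stay, then depart) can be sketched as a check over a sampled sequence of person-to-vehicle distances. This is only an illustrative sketch: the sampling scheme, the thresholds `STAY_EPS` and `MAX_STAY_SAMPLES`, and all names are assumptions, not part of the claimed method.

```python
STAY_EPS = 0.2          # distance change (m) below this counts as "staying"
MAX_STAY_SAMPLES = 3    # e.g. a 3-second stay limit at a 1 Hz sample rate

def matches_preset(distances):
    """Check a sequence of person-to-vehicle distances (metres) against the
    preset patterns: approach, optionally pause briefly, optionally depart."""
    approached = False
    departed = False
    stay = 0
    for prev, cur in zip(distances, distances[1:]):
        delta = cur - prev
        if abs(delta) < STAY_EPS:
            stay += 1
            if stay > MAX_STAY_SAMPLES:
                return False          # stayed longer than the preset duration
        elif delta < 0:
            if departed:
                return False          # re-approached after leaving: not a preset pattern
            approached = True
        else:
            if not approached:
                return False          # moved away without ever approaching
            departed = True
    return approached
```

A pure departure or an over-long pause is rejected, while each of the three preset cases is accepted.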
The biological information of the first person refers to information that can be used to distinguish different persons, so that the projected content can be differentiated according to those differences. In some embodiments, the biological information of the first person may be gender and age.
The shape of the preset range may be a regular shape or an irregular shape. The preset range may or may not be centered on the first person. The preset range may be specifically set according to the requirement of actual accuracy.
Referring to fig. 3, fig. 3 is a schematic diagram of a preset range according to an exemplary embodiment of the present application. As an example, as shown in fig. 3, the preset range may be a circular range centred on the first person with a preset length as its radius. The preset length can be set according to actual requirements; for example, it may be 3 meters, or it may be the preset distance. As another example, as shown in fig. 3, the preset range may be a trapezoid whose short side is close to the vehicle, considering that a companion (i.e., a third person) of the first person tends to stay close to the first person and to move in substantially the same direction.
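For the circular preset range in the example above, the membership test reduces to a single distance comparison. This is a minimal sketch; the 3-metre radius comes from the example, and the coordinate convention and names are assumptions.

```python
import math

PRESET_RADIUS_M = 3.0   # illustrative preset length from the example above

def in_preset_range(first_xy, other_xy):
    """True if `other_xy` lies within the circular preset range centred on
    the first person at `first_xy` (both are (x, y) points in metres)."""
    return math.dist(first_xy, other_xy) <= PRESET_RADIUS_M
```

A second person is then simply any detected person, other than the first person, for which this test returns True.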
In some embodiments, when the motion trajectory of the first person meets the preset motion trajectory, the projection operation is triggered to be started, and at this time, the biological information of the first person may be acquired, and the biological information and the motion trajectory of the second person may be acquired. Specifically, the biological information of the first person and the biological information and the motion trail of the second person can be acquired through the camera module.
In some embodiments, if the motion trajectories of a plurality of first persons satisfy the preset motion trajectory, the historical usage data of each key may be obtained, a target first person may be determined according to that data, and the biological information of the target first person and the biological information and motion trajectories of the second persons may be acquired, wherein a second person is a person, other than the target first person, within the preset range of the target first person.
In some embodiments, the historical usage data may be the time the key last started the vehicle. The holder corresponding to the key having the time closest to the current time may be determined as the target first person.
In other embodiments, the historical usage data may be the number of times the key has been used to start the vehicle within a preset time period. The preset time period can be set according to actual needs; for example, it can be a week, a month, or a year. The holder of the most frequently used key may be determined as the target first person.
In still other embodiments, all keys that are matched to the vehicle have a pre-set priority. The holder corresponding to the key with the highest priority may be determined as the target first person.
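The three embodiments above for resolving multiple matching key holders can be sketched as a selection over key records. The record fields and strategy names below are hypothetical; the patent does not specify data structures.

```python
# Hypothetical sketch: choose the "target first person" when several key
# holders match the preset trajectory. Field names are assumptions.

def pick_target_key(keys, strategy="last_started"):
    """Select one key record according to one of the three embodiments:
    most recent start, most starts in a preset period, or highest priority."""
    if strategy == "last_started":
        return max(keys, key=lambda k: k["last_start_time"])
    if strategy == "most_used":
        return max(keys, key=lambda k: k["start_count"])
    if strategy == "priority":
        return max(keys, key=lambda k: k["priority"])
    raise ValueError(f"unknown strategy: {strategy}")
```

The holder of the returned key would then be treated as the target first person in step 210.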
And step 220, determining a third person from the second persons according to the movement track of the second persons, wherein the third person and the first person are the same-row persons.
In some embodiments, it may be determined whether the motion trajectory of each second person satisfies the preset motion trajectory. Second persons whose motion trajectories satisfy the preset motion trajectory are determined to be third persons; second persons whose motion trajectories do not satisfy it are determined not to be companions of the first person.
In other embodiments, the motion trajectory of a second person may be converted into a coordinate system, and the trajectory point closest to the vehicle and the end point of the trajectory may be obtained. It can then be detected whether the closest point coincides with the end point. If it does, the person corresponding to the trajectory is determined to be a third person; if it does not, the person corresponding to the trajectory is determined not to be a companion of the first person.
It should be noted that the closest trajectory point "coinciding" with the end point means either that the two points coincide exactly, or that the distance between them is smaller than a preset separation distance. The preset separation distance may be set according to actual requirements; for example, it may be 0.5 m.
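The closest-point/end-point coincidence test of this embodiment might be sketched as follows. The 0.5 m threshold is taken from the example above; the coordinate convention and names are illustrative assumptions.

```python
import math

PRESET_SEPARATION_M = 0.5   # example threshold from the text

def is_companion(trajectory, vehicle_xy):
    """A person is treated as a companion if the trajectory point closest to
    the vehicle coincides, within the preset separation distance, with the
    trajectory's end point, i.e. the person stopped near the vehicle."""
    closest = min(trajectory, key=lambda p: math.dist(p, vehicle_xy))
    return math.dist(closest, trajectory[-1]) < PRESET_SEPARATION_M
```

Intuitively, a passer-by gets closest to the vehicle somewhere in the middle of the trajectory and then keeps walking, so the closest point and the end point do not coincide.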
In still other embodiments, since a companion of the key holder (i.e., a third person) generally moves in the same direction as the key holder (i.e., the first person), the movement direction may be determined from each person's trajectory and the third person identified among the second persons accordingly. Specifically, the movement direction of the first person is determined from the first person's motion trajectory, and the movement direction of each second person from that person's motion trajectory. Second persons moving in the same direction as the first person are determined to be third persons; second persons moving in a different direction are determined not to be companions of the first person.
And step 230, determining the target persons as a first person and a third person, and determining a projection scheme according to the biological information and the motion trail of the target persons.
The projection scheme comprises a projection direction and projection content. Each projection direction corresponds to one projection area; if a target person is present in a projection area, projection is performed in the direction corresponding to that area. The projection content has a mapping relationship with the target persons, which may be one-to-one, one-to-many, many-to-one, and so on, so that different target persons correspond to different projection contents and the content is thereby differentiated.
The projected content can be specific patterns, characters and videos. For example, the projected content may be a specific LOGO pattern, may be text such as "welcome", or may be a piece of animated video. The one or more projected contents may be preconfigured and stored in the vehicle or in a server or database communicatively connected to the vehicle. In some embodiments, the vehicle may communicate with a cloud, and a user or Original Equipment Manufacturer (OEM) may customize or update the projection content through the cloud, so that the projection content has diversity and real-time performance.
In some embodiments, the projection direction may be determined according to the motion trajectory of the target person. The projected content may be determined based on the biometric information of the target person. And determining a projection scheme according to the projection direction and the projection content.
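A minimal sketch of this decision step, assuming a hypothetical lookup table from biological information to projection content and an approach side already derived from the motion trajectory. None of the names or table entries come from the patent.

```python
# Hypothetical content table keyed by (gender, age group); assumptions only.
CONTENT_BY_PROFILE = {
    ("female", "child"): "cartoon animation",
    ("male", "adult"): "brand LOGO pattern",
}
DEFAULT_CONTENT = "welcome text"

def decide_scheme(person):
    """Step 230 sketch: direction follows the target person's approach side
    (from the motion trajectory); content is looked up from biological info."""
    direction = person["approach_side"]          # e.g. "left", "front", "rear"
    profile = (person["gender"], person["age_group"])
    content = CONTENT_BY_PROFILE.get(profile, DEFAULT_CONTENT)
    return {"direction": direction, "content": content}
```

Unmapped profiles fall back to a default content, which keeps the mapping total while still differentiating known profiles.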
Step 240, performing projection according to the projection scheme.
In some embodiments, after determining the projection scheme, the controller may send the projection scheme to the projection module so that the projection module projects according to it. For example, the controller may send the projection scheme to the illumination control unit in the projection module, and the illumination control unit drives the module elements and the projection unit to perform the projection, thereby implementing dynamically controlled projection. In particular, the illumination control unit may generate a projection control sequence from the projection scheme to drive the module elements and the projection unit.
In some embodiments, the controller may also adjust the distance of the projection and the range size of the projection based on the relative distance of the target person with respect to the vehicle. For example, if the relative distance of the target person with respect to the vehicle increases, the projected distance increases, and the projected range increases.
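The distance-dependent adjustment described above could be sketched as a simple monotone rule. The linear form and every constant here are assumptions: the patent only states that the projected distance and range grow as the target person's relative distance grows.

```python
def adjust_projection(relative_distance_m,
                      base_distance_m=1.0, base_range_m=0.5, gain=0.3):
    """Illustrative linear rule: project farther and over a larger area as
    the target person moves farther from the vehicle. All constants are
    assumptions, not taken from the patent."""
    distance = base_distance_m + gain * relative_distance_m
    range_size = base_range_m + gain * relative_distance_m
    return distance, range_size
```

Any monotone increasing function of the relative distance would satisfy the behaviour described in the text.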
According to the projection method provided by the embodiment of the application, the projection scheme is determined from the biological information and motion trajectories of the key holder and the key holder's companions, so that different projection schemes can be determined for different persons, improving the flexibility and diversity of the projection schemes.
Referring to fig. 4, fig. 4 is a schematic flowchart of a projection method according to another embodiment of the present application. The projection method is applied to the controller 114 in the vehicle 110 shown in fig. 1, or the apparatus 400 shown in fig. 10, which will be mentioned below, or the controller 540 in the vehicle 500 shown in fig. 11, which will be mentioned below. The projection method may include the following steps 310 to 3110.
Step 310, determining the distance and orientation of the key relative to the vehicle based on the signal transmitted by the key.
Wherein, the signal transmitted by the key can be UWB signal, bluetooth signal, remote control signal, etc. The signal transmitted by the key corresponds to the type of key.
In some embodiments, the controller, upon receiving a signal transmitted by the key, may locate and identify the distance and orientation of the key relative to the vehicle based on that signal. In particular, the controller may receive at least two signals emitted by the key and determine the distance and orientation of the key relative to the vehicle at least twice, so that the movement trajectory of the key holder can subsequently be determined from these at least two distance-and-orientation fixes.
And step 320, determining the motion track of the first person according to the distance and the position of the key relative to the vehicle.
In some embodiments, the distance and orientation of the key relative to the vehicle may be converted into a coordinate system, with one coordinate point corresponding to the distance and orientation determined from one signal. The movement trajectory of the first person may then be determined from the distances and orientations determined from at least two signals (i.e. at least two coordinate points). Specifically, the at least two coordinate points may be connected to obtain the movement trajectory of the first person.
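The conversion from each signal fix (distance and orientation) to a coordinate point, and the connection of at least two points into a trajectory, can be sketched as follows. The bearing convention (clockwise from the vehicle's heading, vehicle at the origin) and all names are assumptions.

```python
import math

def fix_to_point(distance_m, bearing_deg):
    """Convert one key fix (distance and orientation relative to the vehicle)
    into an (x, y) point, with the vehicle at the origin and bearings measured
    clockwise from the vehicle's heading (an assumed convention)."""
    rad = math.radians(bearing_deg)
    return (distance_m * math.sin(rad), distance_m * math.cos(rad))

def build_trajectory(fixes):
    """Each received signal yields one point; connecting at least two points
    in time order gives the first person's movement trajectory."""
    return [fix_to_point(d, b) for d, b in fixes]
```

With at least two fixes, the ordered point list is the trajectory that step 330 then compares against the preset motion trajectories.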
Step 330, detecting whether the motion trail of the first person meets a preset motion trail.
For a detailed description of the preset motion trajectory, please refer to relevant parts of step 210, which is not described herein again.
In some embodiments, it may be detected whether the motion trajectory of the first person matches any of the preset motion trajectories. If it does, the motion trajectory of the first person is determined to satisfy the preset motion trajectory; if it does not, the motion trajectory of the first person is determined not to satisfy the preset motion trajectory.
If it is detected that the motion trajectory of the first person meets the preset motion trajectory, it may be considered that the trigger condition for starting the projection operation is met, and then steps 340 to 3110 may be performed.
If it is detected that the motion trajectory of the first person does not satisfy the preset motion trajectory, the trigger condition for starting the projection operation may be considered not to be met, and the method may return to step 310.
And 340, acquiring the biological information of the first person, and acquiring the biological information and the motion trail of the second person.
For the details of step 340, please refer to step 210; they are not repeated here.
And step 350, determining the movement direction of the first person according to the movement track of the first person.
Wherein the direction of movement of the first person refers to the direction of movement of the first person relative to the vehicle. The moving direction of the first person may be a moving direction included in the preset moving track, for example, the moving direction of the first person may include approaching the vehicle or approaching the vehicle and then departing from the vehicle.
In some embodiments, the relative distance of the first person with respect to the vehicle may be calculated in real time or at preset intervals, and the movement direction of the first person may be determined according to the variation of the relative distance. The preset time period may be set according to actual requirements, for example, the preset time period may be 0.5 second.
As an example, the motion trajectory of the first person and the position of the vehicle may be converted into a coordinate system, and the start point of the trajectory, the end point of the trajectory, and the coordinate point representing the vehicle may be obtained. The distance from the start point to the vehicle point and the distance from the end point to the vehicle point are then calculated. If the start-point distance is greater than the end-point distance, the movement direction of the first person is determined to be approaching the vehicle. If the start-point distance is smaller than the end-point distance, the movement direction of the first person is determined to be approaching the vehicle and then moving away from it.
As an example, please refer to fig. 5, which is a schematic diagram of the motion trajectories of persons around a vehicle according to an exemplary embodiment of the present application. Person A is the first person, and persons B, C, and D are second persons. Trajectory 1 is the motion trajectory of person A, trajectory 2 is the motion trajectory of person B, trajectory 3 is the motion trajectory of person C, and trajectory 4 is the motion trajectory of person D. By performing step 350, the movement direction of person A (the first person) can be determined to be approaching the vehicle.
And step 360, determining the movement direction of the second person according to the movement track of the second person.
Wherein the movement direction of the second person refers to the direction in which the second person moves relative to the vehicle. The movement direction of a second person may be the same as that of the first person, but it may also be another direction, for example moving away from the vehicle or remaining stationary relative to the vehicle.
In some embodiments, the relative distance of the second person with respect to the vehicle may be calculated in real time or at intervals of a preset time period, and the moving direction of the second person may be determined according to an amount of change in the relative distance. The preset time period may be set according to actual requirements, for example, the preset time period may be 0.5 second.
As an example, the motion trajectory of the second person and the position of the vehicle may be converted into a common coordinate system, yielding the start point of the trajectory, the end point of the trajectory, and a coordinate point representing the vehicle. The distance between the start point and the vehicle point, and the distance between the end point and the vehicle point, are then calculated. If the start-point distance is greater than the end-point distance, the movement direction of the person corresponding to the trajectory is determined to be approaching the vehicle. If the start-point distance is smaller than the end-point distance, the movement direction is determined to be moving away from the vehicle. If the two distances are equal, the movement direction is determined to be stationary relative to the vehicle.
In some embodiments, after determining that the distance between the start point of the trajectory and the coordinate point representing the vehicle is smaller than the distance between the end point and that coordinate point (i.e., after determining that the movement direction of the person corresponding to the trajectory is away from the vehicle), the trajectory point closest to the vehicle may be further obtained, and its distance to the vehicle calculated. It is then judged whether the distance between this closest trajectory point and the vehicle is smaller than the distance between the start point and the vehicle. If the closest trajectory point is nearer to the vehicle than the start point, the movement direction of the person corresponding to the trajectory is determined to be approaching the vehicle and then moving away from it. If the closest trajectory point is not nearer to the vehicle than the start point, the movement direction is determined to be moving away from the vehicle.
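The second-person classification, including the closest-point refinement just described, can be sketched as follows. As before, this is an illustrative Python sketch under the assumption that a trajectory is a list of (x, y) points; the labels are not from the patent.

```python
import math

def second_person_direction(trajectory, vehicle):
    """Classify a nearby person's movement relative to the vehicle, including
    the closest-point refinement for approach-then-depart trajectories."""
    dist = lambda p: math.dist(p, vehicle)
    start_d, end_d = dist(trajectory[0]), dist(trajectory[-1])
    if start_d > end_d:
        return "approaching"
    if start_d == end_d:
        return "stationary"
    # Ended farther than it started: did the person first come closer?
    if min(dist(p) for p in trajectory) < start_d:
        return "approached_then_departed"
    return "departing"
```

A trajectory such as (5, 0) → (2, 0) → (9, 0) ends farther out than it began but dips closer in between, so it is classified as approaching and then departing.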
As an example, referring again to fig. 5, by performing step 360 it can be determined that, among the persons within the preset range of the key holder, the movement direction of person B is approaching the vehicle, the movement direction of person C is moving away from the vehicle, and the movement direction of person D is approaching the vehicle.
And step 370, determining that the third person is the person in the second person who moves in the same direction as the first person.
Considering that the moving direction of the fellow person of the key holder (i.e., the third person) is generally the same as the moving direction of the key holder (i.e., the first person), it can be determined that the third person is one of the second persons who moves in the same direction as the moving direction of the first person.
As an example, referring again to fig. 5, by performing step 370 it can be determined that persons B and D are fellow persons of person A (the first person), i.e., persons B and D are the third persons.
And step 380, determining the target persons as the first person and the third person, and determining the projection direction according to the motion trajectories of the target persons.
The projection direction refers to a direction in which the vehicle can project, and each projection direction corresponds to one projection area. Each projection direction also corresponds to one vehicle door, and the specific projection directions can be determined according to the type of the vehicle.
As an example, please refer to fig. 6, fig. 6 is a schematic diagram of a projection area according to an exemplary embodiment of the present application. The vehicle includes four doors (not shown in the figure) of a left front door, a left rear door, a right front door, and a right rear door. The projection area comprises four projection areas including a left front projection area S1, a left rear projection area S2, a right front projection area S3 and a right rear projection area S4. And S0 is a non-projection area, and the projection directions comprise four projection directions including a left front projection direction, a left rear projection direction, a right front projection direction and a right rear projection direction.
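Since each projection direction corresponds to one quadrant around the vehicle, a target person's projection direction can be derived from their position. The sketch below assumes a vehicle-centred coordinate frame with x pointing forward and y pointing left; the axis convention and names are illustrative assumptions, not the patent's specification.

```python
def projection_direction(person_xy):
    """Map a person's position in a vehicle-centred frame (x forward, y left)
    to one of the four door-side projection directions of fig. 6."""
    x, y = person_xy
    fore_aft = "front" if x >= 0 else "rear"
    side = "left" if y >= 0 else "right"
    return f"{side}_{fore_aft}"  # e.g. 'left_front' corresponds to area S1
```

A person standing ahead of the vehicle and to its left, say at (2, 1), would thus be assigned the left front projection direction.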
Step 390, determining the projection content of the projection direction according to the biological information of the target person.
As previously mentioned, the biological information may include the age and gender of the target person, and the camera module can identify the biological information of the target persons. As an example, referring to fig. 5 again, the camera module identifies the target persons A, B, and D in fig. 5: the gender of target person A (i.e., the first person) is male and the age is about 30; the gender of target person B (i.e., a third person) is male and the age is about 35; the gender of target person D (i.e., a third person) is female and the age is about 50.
For a detailed description of the projection content, please refer to the relevant part of the foregoing step 230, which is not described herein again.
In some embodiments, please refer to fig. 7, fig. 7 is a flowchart illustrating a step 390 in a projection method according to another embodiment of the present application. This step 390 may include the following steps 391-395.
Step 391, determining a projection scene corresponding to the target person according to the biological information of the target person.
The biological information of the target person has a mapping relation with the projection scene, and the mapping relation can be one-to-one mapping or many-to-one mapping, so that different target persons can correspond to different projection scenes, and projection content differentiation can be realized for different target persons in the following.
In some embodiments, the projection scene corresponding to the biological information of the target person may be searched in the mapping table of the biological information and the projection scene according to the biological information of the target person.
As an example, a mapping table of biological information and projection scenes may be as shown in table 1. If the age of the target person is less than a first preset age A1, the projection scene is determined to be a child scene. If the target person is a female aged between the first preset age A1 and a second preset age A2 (inclusive), or a female of uncertain age, the projection scene is determined to be a general female scene. If the target person is a male aged between A1 and A2 (inclusive), or a male of uncertain age, the projection scene is determined to be a general male scene. If the target person is a female older than A2, the projection scene is determined to be a middle-aged and elderly female scene; if a male older than A2, a middle-aged and elderly male scene. If the gender of the target person cannot be determined, the projection scene is determined to be a general scene. The first preset age A1 and the second preset age A2 may be set according to actual requirements; for example, A1 may be 15 years old and A2 may be 50 years old. It should be noted that table 1 is only an example, and more projection scenes may be defined in practical applications.
TABLE 1

Biological information (age: A)       Projection scene
A < A1                                Child scene
A1 ≤ A ≤ A2, female                   General female scene
Female of uncertain age               General female scene
A1 ≤ A ≤ A2, male                     General male scene
Male of uncertain age                 General male scene
A > A2, female                        Middle-aged and elderly female scene
A > A2, male                          Middle-aged and elderly male scene
Gender undetermined                   General scene
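Table 1 can be expressed as a small decision function. The values A1 = 15 and A2 = 50 follow the example values given in the text, and the scene labels are illustrative assumptions.

```python
A1, A2 = 15, 50  # example preset ages from the text

def projection_scene(age, gender):
    """Implement the Table 1 mapping.
    age: int, or None when uncertain; gender: 'male', 'female', or None."""
    if age is not None and age < A1:
        return "child"
    elderly = age is not None and age > A2
    if gender == "female":
        return "middle_elderly_female" if elderly else "general_female"
    if gender == "male":
        return "middle_elderly_male" if elderly else "general_male"
    return "general"  # gender undetermined
```

Note that an uncertain age falls through to the general scene for the detected gender, matching the "uncertain age" rows of the table.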
Step 392, the number of target persons in the projection direction is determined.
Since one projection direction only corresponds to one projection area, the number of target persons in each projection direction can be calculated, and the projection scene in each projection direction can be determined according to the target persons in the projection direction.
Step 393, if exactly one target person exists in the projection direction, determining that the projection scene in the projection direction is the projection scene corresponding to that target person.
If only one target person exists in the projection direction, the projection scene corresponding to that target person can be directly determined as the projection scene in the projection direction.
Step 394, if there are at least two target persons in the same projection direction, determining that the projection scene in the projection direction is the projection scene with the highest preset priority in the projection scenes corresponding to the at least two target persons.
The preset priority of each projection scene can be set according to actual requirements. As an example, referring to table 1 above, the following preset priority order of projection scenes may be set: child scene > middle-aged and elderly female scene > middle-aged and elderly male scene > general female scene > general male scene > general scene. When the vehicle is connected to the cloud, a third party or an OEM can update the preset priorities of the projection scenes through the cloud. For example, a third party or OEM may change the preset priority of an existing projection scene through the cloud, or add a new projection scene through the cloud and set its preset priority.
In some embodiments, if there are at least two target persons in the same projection direction, a preset priority of a projection scene corresponding to each target person in the at least two target persons may be obtained, and the projection scene with the highest preset priority is used as the projection scene in the projection direction.
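The priority-based selection for a shared projection direction can be sketched as follows; the priority list reflects the example order given above, and the labels are illustrative assumptions.

```python
# Preset priority order from the example above, highest priority first.
SCENE_PRIORITY = ["child", "middle_elderly_female", "middle_elderly_male",
                  "general_female", "general_male", "general"]

def scene_for_direction(person_scenes):
    """When several target persons share one projection direction, keep the
    scene with the highest preset priority (lowest index in the list)."""
    return min(person_scenes, key=SCENE_PRIORITY.index)
```

For instance, if a direction contains a general male scene and a middle-aged and elderly female scene, the latter wins because it sits higher in the priority list.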
Step 395, determining the projection content in the projection direction based on the preset mapping relationship between the projection scene and the projection content.
The preset mapping relationship between projection scenes and projection contents may be one-to-one, one-to-many, many-to-one, many-to-many, or the like, and may be preset and stored in the vehicle. When the vehicle is connected to the cloud, a third party or an OEM can update the preset mapping relationship between projection scenes and projection contents through the cloud. For example, a third party or OEM may change the preset mapping relationship of an existing projection scene through the cloud.
In some embodiments, each projection scene corresponds to one projection content, and the projection content corresponding to the projection scene may be directly determined as the projection content in the projection direction.
In other embodiments, each projection scene corresponds to at least two projection contents. The upload times of the at least two projection contents can be obtained, and the most recently uploaded content determined as the projection content in the projection direction, so as to keep the content current. Alternatively, the projection counts of the at least two contents may be obtained, and the most frequently projected content determined as the projection content in the projection direction, so as to favor the content that best matches viewers.
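Both selection strategies for a scene with multiple candidate contents reduce to a keyed maximum. The record fields and strategy names below are illustrative assumptions.

```python
def pick_content(candidates, strategy="latest"):
    """Select one projection content for a scene.
    candidates: list of dicts with 'name', 'uploaded_at' (any comparable
    timestamp), and 'times_projected' (int)."""
    if strategy == "latest":
        return max(candidates, key=lambda c: c["uploaded_at"])      # freshest upload
    return max(candidates, key=lambda c: c["times_projected"])      # most often shown
```

With two candidates, one newer and one more frequently projected, the two strategies pick different winners.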
Step 3100, determining a projection scheme according to the projection content and the projection direction.
After determining the projection direction and the projection content corresponding to the projection direction, it may be determined that the projection scheme projects the projection content corresponding to the projection direction to the projection area corresponding to the projection direction.
As an example, please refer to fig. 8, which is a schematic diagram of vehicle projection according to an exemplary embodiment of the present application. The projection directions are left front, left rear, and right rear. Target person A is in the left front projection area, and the corresponding projection content is the vehicle LOGO. Target person B is in the left rear projection area, and the corresponding projection content is a gear pattern. Target person C is in the right rear projection area, and the corresponding projection content is a Peppa Pig animation (represented by a cloud placeholder in the figure). The determined projection scheme is then as follows: project the vehicle LOGO to the left front projection area, the gear pattern to the left rear projection area, and the Peppa Pig animation to the right rear projection area.
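Assembling the projection scheme amounts to pairing each active projection direction with its content and target area. The sketch below is an illustrative assumption about how such a scheme could be represented; the direction-to-area naming is invented for the example.

```python
def build_scheme(content_by_direction):
    """Build a projection scheme: one entry per active projection direction,
    pairing the direction's projection area with its chosen content."""
    return [{"direction": d, "area": f"{d}_area", "content": c}
            for d, c in content_by_direction.items()]

# Hypothetical scheme mirroring the fig. 8 example.
scheme = build_scheme({"left_front": "vehicle LOGO",
                       "left_rear": "gear pattern",
                       "right_rear": "cartoon animation"})
```

Each entry of the resulting list tells the illumination control unit what to project and where.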
Step 3110, projection is performed according to the projection scheme.
For step 3110, please refer to the foregoing step 240, which is not described herein again.
And 3120, monitoring whether the target person triggers the projection finishing operation.
In some embodiments, if it is detected that the first person moves away from the vehicle and none of the target persons has boarded, it may be determined that the target person has triggered the end-of-projection operation.
In other embodiments, if it is detected that the first person does not move away from the vehicle but stays near the vehicle longer than a preset dwell time, and none of the target persons has boarded, it may be determined that the target person has triggered the end-of-projection operation. The preset dwell time can be set according to actual requirements; for example, it may be 30 seconds.
In still other embodiments, it may be determined that the target person triggers the end of the projection operation if it is detected that the vehicle door corresponding to the target person is opened and then closed.
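The three end-of-projection triggers just listed can be combined into one predicate. This is a hedged sketch: the parameter names are assumptions, and the behaviour when a person has boarded without a completed door cycle (projection continues) is an assumption not stated in the text.

```python
PRESET_DWELL_S = 30  # example dwell time from the text

def projection_ended(door_cycled, any_boarded, first_person_left, dwell_s):
    """Combine the three end-of-projection triggers described above."""
    if door_cycled:                  # a target person's door opened then closed
        return True
    if any_boarded:
        return False                 # assumption: keep projecting mid-boarding
    if first_person_left:            # key holder walked away, nobody boarded
        return True
    return dwell_s > PRESET_DWELL_S  # lingered near the vehicle too long
```

The loop of steps 3120–3130 would evaluate this predicate repeatedly and stop projecting once it returns true.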
If it is monitored that the target person triggers the end of the projection operation, step 3130 is performed.
If it is monitored that the target person does not trigger the end of the projection operation, the step 3120 may be continuously performed, that is, whether the target person triggers the end of the projection operation is continuously monitored.
Step 3130, stop the projection operation.
According to the projection method provided by this embodiment of the application, the projection scheme is determined from the biological information and motion trajectories of the key holder and their fellow persons, so different projection schemes can be determined for different persons, improving the flexibility and diversity of projection. Because the key is used as the trigger source, false triggering by pedestrians around the vehicle is avoided. Moreover, the embodiment does not need to pre-store a specific recognition target; instead, it generalizes the recognition target, extending recognition from a specific individual to any group with certain characteristics, which broadens the application range of projection.
Referring to fig. 9, fig. 9 is a schematic flowchart of a projection method according to an exemplary embodiment of the present application. The method may be applied to the projection system 100 shown in fig. 1 described above.
The signal receiver receives the signal sent by the key and determines, from that signal, whether a key holder (i.e., the first person) is approaching the vehicle. If it is detected that the key holder is approaching the vehicle, the key ID and the position parameters of the key are sent to the controller, and the position of the key continues to be detected. In some embodiments, the signal receiver may forward the signal sent by the key to the controller, and the controller may determine whether the key holder is near the vehicle based on that signal.
The controller determines the motion trajectory of the key holder from the key ID and the position parameters, and decides whether to trigger the start of the projection operation according to whether that trajectory meets the preset motion trajectory. If the start of the projection operation is triggered, a wake-up instruction is sent to the camera module, the illumination control unit, the module element, and the projection unit. If the start of the projection operation is not triggered, the controller continues to detect whether the key holder approaches, or sends a continuous-detection instruction to the signal receiver so that the signal receiver detects whether the key holder approaches.
After the camera module receives the wake-up instruction, it collects information about the people around the vehicle (including the first person and the second person), namely their biological information and motion trajectories, and sends the collected biological information and motion trajectories to the controller.
And after the lighting control unit, the module element and the projection unit receive the awakening instruction, activating the element and waiting for the next indication about projection.
After the controller sends the wake-up instruction, the projection scheme can be decided according to the position information of the key and the information (biological information and motion trail) of the personnel around the vehicle, and the projection scheme is sent to the lighting control unit.
And the illumination control unit generates a projection control sequence according to the projection scheme, and drives the module element and the projection element to realize the projection scheme. The module component and the projection component feed back the execution situation to the lighting control unit. And the illumination control unit determines whether the projection scheme is executed or not according to the execution conditions fed back by the module element and the projection element. And if the execution is finished, sending an executed instruction to the controller. If not, continuing to generate a projection control sequence and continuing to drive the module element and the projection element to realize the projection scheme.
And when the controller receives the executed instruction or detects that the projection stopping operation is triggered, the controller sends an ending instruction to the signal receiver, the illumination control unit, the module element and the projection unit. After the signal receiver, the illumination control unit, the module element and the projection unit perform post-processing, the whole projection process is finished.
For parts of the exemplary embodiment that are not described in detail, please refer to relevant parts of the foregoing embodiments, which are not described again here.
Referring to fig. 10, fig. 10 is a block diagram of a projection apparatus according to an embodiment of the disclosure. The projection apparatus 400 may be applied to the controller 114 in the vehicle 110 shown in fig. 1 described above, or the controller 540 in the vehicle 500 shown in fig. 11 to be mentioned below. The projection apparatus 400 includes an acquisition module 410, a determination module 420, a decision module 430, and a projection module 440. The obtaining module 410, the determining module 420, the decision module 430 and the projecting module 440 are respectively in communication connection to achieve data interaction therebetween. Wherein:
the obtaining module 410 is configured to obtain biological information of a first person, and to obtain biological information and a motion trajectory of a second person, if it is detected that the motion trajectory of the first person meets a preset motion trajectory, where the second person is a person other than the first person within a preset range of the first person, and the first person is a key holder. The preset motion trajectory includes: the first person approaching the vehicle from far to near; the first person staying while approaching the vehicle from far to near; or the first person moving away from the vehicle after staying while approaching it, where the stay duration of the first person is less than or equal to a preset stay duration. The determining module 420 is configured to determine a third person from the second persons according to the motion trajectory of the second person, where the third person is a fellow person of the first person. The decision module 430 is configured to determine that the target persons are the first person and the third person, and to determine a projection scheme according to the biological information and motion trajectories of the target persons. The projection module 440 is configured to perform projection according to the projection scheme.
In some implementations, the decision module 430 includes a direction determination sub-module, a content determination sub-module, a schema determination sub-module. Wherein: and the direction determining submodule is used for determining the projection direction according to the motion trail of the target person. And the content determining submodule is used for determining the projection content of the projection direction according to the biological information of the target person, and the scheme determining submodule is used for determining the projection scheme according to the projection content and the projection direction.
In some embodiments, the content determination submodule includes a scene determination unit and a content determination unit. Wherein: and the scene determining unit is used for determining the projection scene corresponding to the target person according to the biological information of the target person. And the scene determining unit is further configured to determine that the projection scene in the projection direction is a projection scene corresponding to the target person in the projection direction. And the content determining unit is used for determining the projection content in the projection direction based on a preset mapping relation between the projection scene and the projection content.
In some embodiments, the scene determining unit is further configured to determine, if at least two target persons exist in the same projection direction, that the projection scene in the projection direction is a projection scene with a highest preset priority in projection scenes corresponding to the at least two target persons.
In some embodiments, the projection device 400 further comprises a location determination module, a trajectory detection module. Wherein: and the position determining module is used for determining the distance and the direction of the key relative to the vehicle according to the signal transmitted by the key. And the track determining module is used for determining the motion track of the first person according to the distance and the position of the key relative to the vehicle. And the track detection module is used for detecting whether the motion track of the first person meets the preset motion track.
In some embodiments, the determination module 420 includes a direction determination sub-module, a person determination sub-module. Wherein: and the direction determining submodule is used for determining the movement direction of the first person according to the movement track of the first person. And the direction determining submodule is also used for determining the movement direction of the second person according to the movement track of the second person. A person determination sub-module configured to determine that the third person is a person of the second persons who moves in the same direction as the first person.
It can be clearly understood by those skilled in the art that the projection apparatus 400 provided in the embodiment of the present application can implement the projection method provided in the embodiment of the present application. The specific working processes of the above devices and modules may refer to the processes corresponding to the projection method in the embodiments of the present application, and are not described herein again.
In the embodiments provided in this application, the coupling, direct coupling or communication connection between the modules shown or discussed may be an indirect coupling or communication coupling through some interfaces, devices or modules, and may be in an electrical, mechanical or other form, which is not limited in this application.
In addition, each functional module in the embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware, or may be implemented in the form of a functional module of software, and the embodiment of the present application is not limited herein.
Referring to fig. 11, fig. 11 is a block diagram of a vehicle according to an embodiment of the present disclosure. The vehicle 500 includes a signal receiver 510, a camera module 520, a projection module 530, and a controller 540. The signal receiver 510, the camera module 520, the projection module 530 and the controller 540 are in communication connection to realize data interaction. The vehicle 500 is the same as the vehicle 110 described above. The signal receiver 510 is the same as the signal receiver 111 described above. The camera module 520 is the same as the camera module 112 described above. The projection module 530 is the same as the projection module 113 described above. The controller 540 is the same as the controller 114 described above.
The controller 540 may include a memory 541, one or more processors 542, and one or more applications. One or more application programs may be stored in the memory 541 and configured to cause the one or more processors 542 to perform the above-described projection method provided by the embodiments of the present application when called by the one or more processors 542.
Processor 542 may include one or more processing cores. The processor 542 is coupled to various portions throughout the controller 540 using various interfaces and lines to execute or perform instructions, programs, code sets, or instruction sets stored in the memory 541, and to invoke the execution or execution of data stored in the memory 541, to perform various functions of the controller 540, and to process data. Alternatively, the processor 542 may be implemented in hardware using at least one of Digital Signal Processing (DSP), field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 542 may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), and a modem. Wherein, the CPU mainly processes an operating system, a user interface, an application program and the like; the GPU is used for rendering and drawing display content; the modem is used to handle wireless communications. It is to be understood that the modem may not be integrated into the processor 542, but may be implemented by a communication chip.
The Memory 541 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 541 may be used to store instructions, programs, code, sets of codes, or sets of instructions. The memory 541 may include a program storage area and a data storage area. Wherein the stored program area may store instructions for implementing an operating system, instructions for implementing at least one function, instructions for implementing the various method embodiments described above, and the like. The storage data area may store data created by the controller 540 in use, and the like.
Referring to fig. 12, fig. 12 is a block diagram of a computer readable storage medium according to an embodiment of the present disclosure. The computer-readable storage medium 600 has program code 610 stored therein, and the program code 610 is configured to, when called by a processor, cause the processor to execute the above projection method provided by the embodiment of the present application.
The computer-readable storage medium 600 may be an electronic memory such as a flash memory, an Electrically-Erasable Programmable Read-Only Memory (EEPROM), an Erasable Programmable Read-Only Memory (EPROM), a hard disk, or a ROM. Optionally, the computer-readable storage medium 600 comprises a non-transitory computer-readable storage medium. The computer-readable storage medium 600 has storage space for program code 610 for performing any of the method steps described above. The program code 610 can be read from or written to one or more computer program products. The program code 610 may be compressed in a suitable form.
In summary, the embodiments of the present application provide a projection method, a projection device, a vehicle, and a storage medium. In the method, when the motion trajectory of a first person is detected to meet a preset motion trajectory, the biological information of the first person is obtained, and the biological information and motion trajectories of second persons are obtained; a third person is determined from the second persons according to the motion trajectories of the second persons; the first person and the third person are determined as target persons, and a projection scheme is determined according to the biological information and motion trajectories of the target persons; and projection is performed according to the projection scheme. In this way, different projection schemes can be determined for different persons, improving the flexibility and diversity of projection.
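The workflow summarized above (key holder detected, nearby companions selected by matching movement direction, and the resulting group treated as target persons) can be sketched as follows. This is an illustrative sketch only: the `Person` model, the cosine-similarity threshold, and all names are assumptions introduced here, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class Person:
    biological_info: str        # e.g. a category later used to pick projection content
    direction: tuple            # movement direction as a 2-D vector (dx, dy)

def same_direction(a, b, tol=0.9):
    """Treat two movement directions as 'the same' when their cosine similarity exceeds tol."""
    ax, ay = a
    bx, by = b
    dot = ax * bx + ay * by
    na = (ax * ax + ay * ay) ** 0.5
    nb = (bx * bx + by * by) ** 0.5
    return dot / (na * nb) > tol

def select_target_persons(first, seconds):
    """Select the key holder plus any nearby person moving in the same direction (the companions)."""
    companions = [p for p in seconds if same_direction(first.direction, p.direction)]
    return [first] + companions
```

A person walking toward the vehicle alongside the key holder would be selected, while a passer-by heading the opposite way would not.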
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art will appreciate that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and that such modifications and replacements do not cause the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present application.

Claims (10)

1. A method of projection, comprising:
if it is detected that the motion trajectory of a first person meets a preset motion trajectory, acquiring biological information of the first person, and acquiring biological information and a motion trajectory of a second person, wherein the second person is a person, other than the first person, within a preset range of the first person, and the first person is a key holder;
determining a third person from the second persons according to the motion trajectory of the second persons, wherein the third person is a companion of the first person;
determining the first person and the third person as target persons, and determining a projection scheme according to the biological information and the motion trajectories of the target persons;
and performing projection according to the projection scheme.
2. The method of claim 1, wherein the step of determining a projection scheme according to the biological information and the motion trajectories of the target persons comprises:
determining a projection direction according to the motion trajectory of the target person;
determining projection content of the projection direction according to the biological information of the target person;
and determining the projection scheme according to the projection content and the projection direction.
3. The method of claim 2, wherein the step of determining the projection content of the projection direction according to the biological information of the target person comprises:
determining a projection scene corresponding to the target person according to the biological information of the target person;
determining the projection scene of the projection direction as the projection scene corresponding to the target person in the projection direction;
and determining the projection content of the projection direction based on a preset mapping relationship between projection scenes and projection content.
4. The method of claim 3, wherein the step of determining the projection scene of the projection direction as the projection scene corresponding to the target person in the projection direction comprises:
if at least two target persons exist in the same projection direction, determining the projection scene of the projection direction as the scene with the highest preset priority among the projection scenes corresponding to the at least two target persons.
5. The method of claim 1, wherein the preset motion trajectory comprises: the first person approaching the vehicle from far to near; the first person stopping while approaching the vehicle from far to near; or the first person moving away from the vehicle after stopping while approaching the vehicle from far to near, wherein the duration for which the first person stops is less than or equal to a preset duration.
6. The method of claim 1, wherein before acquiring the biological information of the first person, the method further comprises:
determining the distance and orientation of the key relative to the vehicle according to the signal emitted by the key;
determining the motion trajectory of the first person according to the distance and orientation of the key relative to the vehicle;
and detecting whether the motion trajectory of the first person meets the preset motion trajectory.
7. The method according to any one of claims 1 to 6, wherein the step of determining a third person from the second persons according to the motion trajectory of the second persons comprises:
determining the movement direction of the first person according to the motion trajectory of the first person;
determining the movement direction of the second person according to the motion trajectory of the second person;
and determining, as the third person, a person among the second persons whose movement direction is the same as that of the first person.
8. A projection device, comprising:
an acquisition module, configured to, if it is detected that the motion trajectory of a first person meets a preset motion trajectory, acquire biological information of the first person and acquire biological information and a motion trajectory of a second person, wherein the second person is a person, other than the first person, within a preset range of the first person, and the first person is a key holder;
a determining module, configured to determine a third person from the second persons according to the motion trajectory of the second persons, wherein the third person is a companion of the first person;
a decision module, configured to determine the first person and the third person as target persons and determine a projection scheme according to the biological information and the motion trajectories of the target persons;
and a projection module, configured to perform projection according to the projection scheme.
9. A vehicle, characterized by comprising:
a signal receiver;
a camera module;
a projection module;
a controller comprising a memory, one or more processors, and one or more applications, wherein the one or more applications are stored in the memory and configured to, when invoked by the one or more processors, cause the one or more processors to perform the method of any one of claims 1-7.
10. A computer-readable storage medium, having stored therein program code configured to, when invoked by a processor, cause the processor to perform the method of any of claims 1 to 7.
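As a rough illustration of the scene-priority rule in claims 3 and 4, the following sketch maps the target persons sharing one projection direction to a single projection scene, then to projection content via a preset mapping. The scene names, priority values, and content values are invented for illustration; the patent does not specify them.

```python
# Hypothetical preset priorities and scene-to-content mapping (assumed values).
SCENE_PRIORITY = {"child_greeting": 3, "elderly_assist": 2, "default_welcome": 1}
SCENE_CONTENT = {
    "child_greeting": "cartoon",
    "elderly_assist": "large_icons",
    "default_welcome": "logo",
}

def scene_for_direction(scenes_in_direction):
    """Claim-4 style rule: when at least two target persons share one projection
    direction, keep the scene with the highest preset priority."""
    return max(scenes_in_direction, key=lambda s: SCENE_PRIORITY[s])

def content_for_direction(scenes_in_direction):
    """Claim-3 style step: look up the projection content for the chosen scene."""
    return SCENE_CONTENT[scene_for_direction(scenes_in_direction)]
```

For example, if a child and an adult approach the same side of the vehicle, the child's scene wins and the cartoon content is projected on that side.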
CN202210886902.4A 2022-07-26 2022-07-26 Projection method, projection device, vehicle and storage medium Active CN115412709B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210886902.4A CN115412709B (en) 2022-07-26 2022-07-26 Projection method, projection device, vehicle and storage medium

Publications (2)

Publication Number Publication Date
CN115412709A true CN115412709A (en) 2022-11-29
CN115412709B CN115412709B (en) 2023-11-10

Family

ID=84158044

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210886902.4A Active CN115412709B (en) 2022-07-26 2022-07-26 Projection method, projection device, vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN115412709B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160229333A1 (en) * 2015-02-10 2016-08-11 Toyota Jidosha Kabushiki Kaisha Projection system and computer-readable storage medium
US20180253609A1 (en) * 2015-07-28 2018-09-06 Apple Inc. System and method for light and image projection
CN109204114A (en) * 2017-06-29 2019-01-15 长城汽车股份有限公司 The projective techniques and device of vehicle greeting lamp
CN110588551A (en) * 2019-09-26 2019-12-20 上海赫千电子科技有限公司 Automatic adjustment method and automatic adjustment device for vehicle-mounted personalization of user based on feature recognition
CN110758234A (en) * 2019-09-30 2020-02-07 深圳市火乐科技发展有限公司 Vehicle lamp projection method and related product
CN111488835A (en) * 2020-04-13 2020-08-04 北京爱笔科技有限公司 Method and device for identifying fellow persons
US20210001885A1 (en) * 2018-03-23 2021-01-07 Sensetime Group Limited Method for predicting direction of movement of target object, vehicle control method, and device
CN112298022A (en) * 2020-10-19 2021-02-02 上海仙塔智能科技有限公司 Welcome system, method, vehicle and computer storage medium
US20210088351A1 (en) * 2018-05-14 2021-03-25 Volkswagen Aktiengesellschaft Method for calculating an augmented reality (ar) display for displaying a navigation route on an ar display unit, device for carrying out the method, transportation vehicle and computer program
CN113978349A (en) * 2021-09-24 2022-01-28 合众新能源汽车有限公司 Intelligent vehicle welcoming method and device
CN114187565A (en) * 2021-12-23 2022-03-15 浙江宇视科技有限公司 Method for determining fellow persons, electronic equipment and storage medium
CN114500736A (en) * 2020-10-23 2022-05-13 广州汽车集团股份有限公司 Intelligent terminal motion trajectory decision method and system and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
MASASHIGE SUWA et al.: "LED Projection Module enables a vehicle to communicate with pedestrians and other vehicles", 2017 IEEE International Conference on Consumer Electronics *
储亚婷: "Principle Analysis and Design Research Based on a Novel Welcome Lamp", Automobile Maintenance (《汽车维修》) *
周盛华: "Optical Design of High-Power LED Vehicle Lamps and Its Application", China Masters' Theses Full-text Database (《中国优秀硕士学位论文全文库》) *

Also Published As

Publication number Publication date
CN115412709B (en) 2023-11-10

Similar Documents

Publication Publication Date Title
US12033373B2 (en) Dynamic image recognition model updates
JP4306397B2 (en) Recognition processing system
CN106952303B (en) Vehicle distance detection method, device and system
US20210012126A1 (en) Detecting illegal use of phone to prevent the driver from getting a fine
KR102132330B1 (en) Remote guidance apparatus and method capable of handling hyper-motion step based on augmented reality and machine learning
CN107480665B (en) Character detection method and device and computer readable storage medium
CN110892451A (en) Electronic device and method for detecting driving event of vehicle
CN110998601A (en) Method and device for identifying objects
CN107431761B (en) Image processing apparatus, image processing method, and image processing system
CN109523574B (en) Walking track prediction method and electronic equipment
EP2996067A1 (en) Method and device for generating motion signature on the basis of motion signature information
JP7282186B2 (en) situational awareness surveillance
CN111886612A (en) Mobile micropositioning
CN113442950B (en) Automatic driving control method, device and equipment based on multiple vehicles
CN114841377B (en) Federal learning model training method and recognition method applied to image target recognition
CN110619325B (en) Text recognition method and device
EP3121790B1 (en) Image sensing apparatus, object detecting method thereof and non-transitory computer readable recording medium
CN108965861B (en) Method and device for positioning camera, storage medium and intelligent interaction equipment
CN115412709B (en) Projection method, projection device, vehicle and storage medium
CN105652673B (en) Control method and device and electronic equipment
KR20150086840A (en) Apparatus and control method for mobile device using multiple cameras
CN115825979A (en) Environment sensing method and device, electronic equipment, storage medium and vehicle
US20230360402A1 (en) Video-based public safety incident prediction system and method therefor
CN115953710A (en) Behavior recognition method and device, electronic equipment and storage medium
CN114495072A (en) Occupant state detection method and apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant