CN118087999A - Control method and device for vehicle tail gate, electronic equipment and storage medium - Google Patents


Info

Publication number
CN118087999A
Authority
CN
China
Prior art keywords: action, projection, user, target, vehicle
Prior art date
Legal status
Pending
Application number
CN202410321379.XA
Other languages
Chinese (zh)
Inventors
韩强, 李颖, 周川, 罗皓文
Current Assignee
Avatr Technology Chongqing Co Ltd
Original Assignee
Avatr Technology Chongqing Co Ltd
Application filed by Avatr Technology Chongqing Co Ltd filed Critical Avatr Technology Chongqing Co Ltd

Landscapes

  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

The application discloses a control method and device for a vehicle tail gate, electronic equipment and a storage medium. The control method comprises the following steps: detecting a first action of a user in a preset sensing area of a target vehicle; projecting a projection pattern to a designated area according to the first action; detecting a second action of the user on the projection pattern; and generating a tail gate control instruction based on the second action and controlling the opening or closing of the vehicle tail gate of the target vehicle according to the control instruction. The method provided by the embodiment of the application projects the projection pattern to the designated area in response to the first action, thereby providing an intuitive prompt. The tail gate control instruction can then be generated promptly by detecting the second action of the user on the projection pattern. In this way, an interactive relation is established between the user and the vehicle through the projection pattern and the second-action design; the user can accurately control the tail gate by stepping on the pattern, which increases the user's interactivity and engagement.

Description

Control method and device for vehicle tail gate, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of vehicle control, in particular to a method and a device for controlling a vehicle tail gate, electronic equipment and a storage medium.
Background
As people's expectations of the automobile experience rise, automobile manufacturers pay increasing attention to user-centered design. To provide a better user experience and meet personalized demands, many automobiles are now equipped with kick-type induction tail gates, so that users can conveniently open the tail gate even when their hands are full.
However, current kick-type induction tail gates have several problems. The existing kick-type induction tail gate requires the user to trigger a detection point near the tail gate. This approach relies on accurately locating that detection point; when the positioning is inaccurate, a user who wants to use the induction tail gate may sometimes fail to trigger it, and is thus unable to control the opening or closing of the tail gate through an effective action or operation.
Disclosure of Invention
In view of the above problems, embodiments of the present invention provide a method, an apparatus, an electronic device, and a storage medium for controlling a tail gate of a vehicle, which are used to solve the prior-art problem that a kick-type induction tail gate depends on the position of a detection point, and inaccurate positioning of that point prevents the user from controlling the tail gate to open or close through effective actions or operations.
According to an aspect of an embodiment of the present invention, there is provided a method for controlling a tail gate of a vehicle, the method including:
Detecting a first action of a user in a preset sensing area of a target vehicle;
Projecting a projection pattern to a designated area according to the first action;
Detecting a second action of a user on the projection pattern;
And generating a tail gate control instruction based on the second action, and controlling the opening or closing of a vehicle tail gate of the target vehicle according to the control instruction.
According to still another aspect of the embodiment of the present invention, there is provided a control device for a tailgate of a vehicle, the device including:
The first detection module is used for detecting a first action of a user in a preset sensing area of the target vehicle;
The triggering module is used for projecting a projection pattern to a specified area according to the first action;
a second detection module for detecting a second action of a user on the projection pattern;
And the control module is used for generating a tail gate control instruction based on the second action and controlling the opening or closing of a vehicle tail gate of the target vehicle according to the control instruction.
According to still another aspect of an embodiment of the present invention, there is provided a computer device, including a memory and a processor communicatively connected with each other; the memory stores computer instructions, and the processor executes the computer instructions to perform the method of the first aspect or any implementation manner corresponding to the first aspect.
According to a further aspect of the embodiments of the present invention, there is provided a computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method of the first aspect or any of its corresponding embodiments.
The foregoing description is only an overview of the technical solutions of the embodiments of the present invention. So that the technical means of the embodiments can be more clearly understood and implemented according to the content of the specification, specific embodiments of the present invention are given below.
According to the method provided by the embodiment of the application, by detecting the first action generated by the user in the preset sensing area of the target vehicle, the user's operation intention can be accurately sensed, and misoperation or false triggering is avoided. Projecting the projection pattern to the designated area provides an intuitive prompt and a distinctive operating experience for the user. Then, by detecting the second action of the user on the projection pattern, the tail gate control instruction can be generated promptly. In this way, an interactive relation is established between the user and the vehicle through the projection pattern and the second-action design; the user can accurately control the tail gate by stepping on the pattern, which increases the user's interactivity and engagement.
Drawings
The drawings are only for purposes of illustrating embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
fig. 1 shows a schematic flow chart of a control method of a tail gate of a vehicle provided by the invention;
Fig. 2 is a schematic flow chart of a control method of a tail gate of a vehicle according to the present invention;
FIG. 3 is a schematic diagram of a control system for a tailgate of a vehicle according to the invention;
fig. 4 shows a schematic structural diagram of a control device for a tail gate of a vehicle according to the present invention;
fig. 5 shows a schematic structural diagram of an embodiment of the electronic device provided by the invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein.
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
According to the embodiments of the present invention, there is provided a method, apparatus, electronic device, and storage medium for controlling a tail gate of a vehicle, it being noted that the steps shown in the flowchart of the drawings may be performed in a computer system such as a set of computer executable instructions, and that although a logical order is shown in the flowchart, in some cases, the steps shown or described may be performed in an order different from that herein.
Fig. 1 shows a flow chart of an embodiment of a control method of a vehicle tailgate of the invention. As shown in fig. 1, the method comprises the steps of:
step S11, detecting a first action of a user in a preset sensing area of a target vehicle.
In the embodiment of the application, after the target vehicle is powered on (for example, the user approaches the vehicle with a key for proximity sensing, swipes a card, or presses a mechanical button), each sensor and the projection lamp are powered on and enter a standby state. The sensor monitors a preset sensing area to detect whether the user performs a first action there. The first action may be a kicking action or a leg-lifting action, and the sensor may be an infrared sensor.
Step S12, projecting a projection pattern to a specified area according to the first action.
In an embodiment of the application, the kick sensor transmits a signal of the detected first action to the controller. After receiving the signal of the first action, the controller determines to execute the projection action, that is, the projector projects a projection pattern to a specified area. The designated area may be a ground area corresponding to the projector, where the projector is connected to a device (e.g., a computer or a storage device) storing the projection pattern and is configured to project the desired pattern. The projection pattern can be text, image, video or other content, and the user can set it according to actual requirements.
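The signal flow above (first-action signal in, projection command out) can be sketched as a minimal event handler. The class and method names (`ProjectionLamp`, `TailgateController`, `on_first_action`), the gesture labels, and the pattern name are illustrative assumptions, not terms from the patent.

```python
# Hypothetical sketch of the controller reacting to a first action by
# commanding the projector; all names here are invented for illustration.

class ProjectionLamp:
    def __init__(self):
        self.projected = None

    def project(self, pattern, area="default"):
        # In a real vehicle this would drive the projector hardware.
        self.projected = (pattern, area)


class TailgateController:
    def __init__(self, lamp):
        self.lamp = lamp

    def on_first_action(self, action):
        # Only kick or leg-lift gestures trigger the projection step.
        if action in ("kick", "leg_lift"):
            self.lamp.project("footprint_pattern")
            return True
        return False


lamp = ProjectionLamp()
controller = TailgateController(lamp)
controller.on_first_action("kick")
```

Keeping the lamp behind a small interface like this would let the same controller logic drive different projector hardware.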
Step S13, detecting a second action of the user on the projection pattern.
After sensor commissioning is completed, a threshold value needs to be set to recognize the user's stepping action. The threshold may be set according to actual requirements; for example, when the intensity of the infrared light received by the receiver is detected to exceed a set value, it is determined that the user has performed a second action. The second action may be a stepping action or a foot-sliding action.
Monitoring a second action: once the sensor is activated and the threshold is set, it will continuously monitor the second action of the user on the projected pattern. When the user steps on, the infrared sensor receives the reflected infrared light and judges whether the user performs a second action according to a preset threshold value.
Triggering corresponding actions: when the infrared sensor detects the second action of the user, the corresponding action may be triggered by a corresponding program or device. For example, a corresponding change in the projected pattern may be triggered by a computer program coupled to the projection system, or other actions may be triggered by means coupled to other devices, such as emitting a sound or a light change, etc.
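The threshold-and-monitoring logic described above can be sketched as a simple check over infrared intensity samples. The normalized intensity scale, the threshold value, and the debounce count are assumptions made for the sketch, not values from the patent.

```python
# Illustrative second-action detection: a step is recognized when the
# receiver's reflected IR intensity stays above a commissioned threshold
# for several consecutive samples (a basic debounce against noise).

STEP_THRESHOLD = 0.6    # assumed normalized IR intensity threshold
MIN_SAMPLES_ABOVE = 3   # assumed debounce: require a sustained signal

def detect_second_action(samples, threshold=STEP_THRESHOLD,
                         min_above=MIN_SAMPLES_ABOVE):
    """Return True if enough consecutive samples exceed the threshold."""
    run = 0
    for s in samples:
        run = run + 1 if s > threshold else 0
        if run >= min_above:
            return True
    return False
```

The debounce requirement distinguishes a deliberate step from a brief reflection spike, which is one way to implement the "judges whether the user performs a second action according to a preset threshold" step.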
Step S14, a tailgate control instruction is generated based on the second action, and the vehicle tailgate of the target vehicle is controlled to open or close according to the control instruction.
In the embodiment of the application, a corresponding tail gate control instruction is generated according to the output of the sensor and the set threshold value. This may be achieved by a control system or embedded device to which the sensor is connected. For example, depending on the duration and intensity of the second action, a control instruction is generated to determine whether the tail gate is opened or closed. Tail gate control: the generated tail gate control instruction is transmitted to the control system of the target vehicle, which controls the opening or closing of the tail gate according to the content of the instruction. This may be achieved through a connection to the vehicle control system, such as via an on-board network or wireless communication.
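One plausible mapping from the second action's duration and intensity to a tailgate command is sketched below. The policy (a valid step toggles the gate; weak or overly long actions are ignored) and all numeric limits are assumptions, since the patent leaves the exact rule open.

```python
# Hypothetical instruction generation from the measured second action.
# Thresholds are illustrative; a real system would calibrate them.

def generate_tailgate_command(duration_s, intensity, gate_open,
                              min_intensity=0.5, max_duration_s=2.0):
    """Return 'open', 'close', or None when the action is rejected."""
    if intensity < min_intensity or duration_s > max_duration_s:
        return None  # too weak or too slow: treat as accidental contact
    # A valid step toggles the current gate state.
    return "close" if gate_open else "open"
```

The resulting string would then be sent to the vehicle control system over the on-board network, as described above.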
According to the method provided by the embodiment of the application, by detecting the first action generated by the user in the preset sensing area of the target vehicle, the user's operation intention can be accurately sensed, and misoperation or false triggering is avoided. Projecting the projection pattern to the designated area provides an intuitive prompt and a distinctive operating experience for the user. Then, by detecting the second action of the user on the projection pattern, the tail gate control instruction can be generated promptly. In this way, an interactive relation is established between the user and the vehicle through the projection pattern and the second-action design; the user can accurately control the tail gate by stepping on the pattern, which increases the user's interactivity and engagement.
Fig. 2 shows a flow chart of an embodiment of a control method of a vehicle tailgate of the invention. As shown in fig. 2, the method comprises the steps of:
Step S21, detecting a first action of a user in a preset sensing area of the target vehicle.
In the embodiment of the application, as shown in fig. 3, a projection lamp 1, an audio device 2, an infrared sensor 3 and a light sensor 4 are installed at the vehicle tail gate of the target vehicle. The projection lamp projects different patterns to provide a prompt in the sensing area. The infrared sensor senses the user's first action at the tail gate. The light sensor senses the ambient illuminance, so that the controller can subsequently adjust the brightness of the projection lamp according to the illumination intensity. The audio device outputs audio of different rhythms according to the speed at which the user steps on the pattern.
Specifically, the first action generated by the user in the preset sensing area is detected by an infrared sensor, which consists of a transmitter and a receiver. The transmitter emits an infrared signal, and the receiver receives the reflected infrared signal. A preset sensing area is created at the bottom or side of the vehicle tail gate; it may be a specific area range set to meet sensing requirements, and the infrared sensor is triggered and receives the reflected infrared signal only when the user's foot enters this area. That is, when the user's foot enters the preset sensing area, a portion of the infrared signal is reflected by the foot back to the sensor's receiver, resulting in a change in the infrared signal received by the receiver. The infrared sensor transmits the received signals to the controller for processing; the controller analyzes the change in the signal and determines whether the user has performed a first action. When the controller confirms that the user has performed the first action, the projection may be triggered.
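The signal-change analysis performed by the controller can be illustrated with a simple detector that tracks a slowly updated baseline of reflected infrared intensity and flags a sudden rise as a candidate first action. The smoothing factor and trigger delta are assumed values for the sketch.

```python
# Illustrative first-action detector. A foot entering the sensing area
# reflects more IR back to the receiver, so a sharp rise above the
# running baseline is treated as a candidate first action.

class FirstActionDetector:
    def __init__(self, alpha=0.1, delta=0.3):
        self.alpha = alpha      # baseline smoothing factor (assumed)
        self.delta = delta      # rise needed to count as an action (assumed)
        self.baseline = None

    def update(self, reading):
        """Feed one intensity sample; return True when an action is flagged."""
        if self.baseline is None:
            self.baseline = reading
            return False
        triggered = (reading - self.baseline) > self.delta
        # Update the baseline slowly so brief events do not absorb it.
        self.baseline += self.alpha * (reading - self.baseline)
        return triggered
```

Tracking a baseline rather than using a fixed absolute level makes the detector tolerant of slow ambient drift (temperature, dirt on the sensor window).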
Step S22, projecting a projection pattern to a specified area according to the first action.
In an embodiment of the present application, before projecting the projection pattern to the specified area according to the first action, the method further includes: acquiring the current geographic position of a target vehicle; acquiring landmark information or scenic spot information associated with a geographic position; a projection pattern is generated based on landmark information or sight spot information.
Specifically, a pre-established landmark or scenic-spot database is first determined, which contains landmark or scenic-spot information for each geographic location; such information may include names, descriptions, pictures, etc. During the projection process, the controller first retrieves the current geographic location of the target vehicle from the positioning system. Next, the controller matches the vehicle's current geographic location against the landmark or scenic-spot database to determine the landmark or scenic spot at the current position. Based on the matched information, the controller may generate a corresponding projection pattern, which may be an actual picture of the landmark, a display of its name, a related graphic, or another creative design.
As one example, the target vehicle mounts a GPS device, and acquires its geographic location in real-time. The GPS device acquires longitude and latitude coordinates of the target vehicle through satellite signals. The acquired longitude and latitude coordinates are converted into specific geographic position information, such as cities, streets, buildings and the like, through geographic position analysis service. And inquiring by using the geographic position of the target vehicle based on the landmark information or the scenic spot information database to acquire landmark information or scenic spot information related to the current position of the target vehicle. Querying surrounding landmark information may include: business centers, restaurants, parks, museums, shopping centers, etc.; querying surrounding sight information may include: points of interest, scenic spots, amusement parks, etc. And determining corresponding projection patterns according to the landmark information or the scenic spot information. For example, images, marks, names, and other elements of landmarks or scenery spots are combined to generate a theme-compliant projection pattern.
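A minimal sketch of that landmark lookup follows. The database entries, the naive planar distance metric, and the pattern-naming scheme are illustrative assumptions; a production system would query a geocoding service and use great-circle distances.

```python
# Hypothetical nearest-landmark lookup feeding the projection pattern
# choice. Coordinates and names below are invented sample data.

import math

LANDMARKS = [
    {"name": "City Museum", "lat": 29.563, "lon": 106.551},
    {"name": "Riverside Park", "lat": 29.570, "lon": 106.540},
]

def nearest_landmark(lat, lon, db=LANDMARKS):
    """Pick the database entry closest to the vehicle (planar approximation)."""
    return min(db, key=lambda m: math.hypot(m["lat"] - lat, m["lon"] - lon))

def projection_pattern_for(lat, lon):
    """Return a pattern identifier themed after the nearest landmark."""
    name = nearest_landmark(lat, lon)["name"]
    return "pattern_" + name.lower().replace(" ", "_")
```

The returned identifier would then select a themed image from the vehicle's pattern storage before projection.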
In an embodiment of the present application, before projecting the projection pattern to the specified area according to the first action, the method further includes: acquiring interest preference of a user; projection images associated with interest preferences are obtained from a pattern library.
Specifically, interest preference of the user can be obtained according to the interaction mode between the vehicle and the user. For example: the user expresses information such as favorites, interests, preferences and the like by touching the interactive interface or voice commands and the like. Interest preferences may include user preferences for particular subject matter, color, shape, style, or pattern types. The data may be recorded in digital or textual form for subsequent processing and analysis. The collected user interest preference data is then used to construct an interest preference model. Machine learning algorithms or data mining techniques may be used to analyze the user data to identify interesting features and preference patterns of the user. This can help to learn the preferences of the user and predict the projected images that they may like. A library of patterns is also created containing projected images associated with user interest preferences. Images matching the user preferences are selected and collected from a pattern library according to the user's interest preference model. The pattern library may contain various image resources such as pictures, illustrations, graphics, and the like.
When determining to perform the projection action, the user's interest features may be matched with the projection images in the pattern library according to the user's interest preference model. And selecting a projection image which is most suitable for the interest preference of the user according to the matching result.
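The preference-matching step can be illustrated with a simple tag-overlap score standing in for the learned preference model described above. The pattern library and tag sets are invented for the sketch.

```python
# Illustrative pattern selection: score each library entry by how many
# of its tags overlap the user's interest preferences, pick the best.

PATTERN_LIBRARY = [
    {"id": "waves", "tags": {"blue", "calm", "nature"}},
    {"id": "racing", "tags": {"red", "sport", "fast"}},
    {"id": "stars", "tags": {"night", "calm", "blue"}},
]

def best_pattern(preferences, library=PATTERN_LIBRARY):
    """Pick the pattern whose tags overlap the user's preferences most."""
    return max(library, key=lambda p: len(p["tags"] & preferences))["id"]
```

A real implementation would replace the overlap count with scores from the trained interest-preference model, but the selection structure stays the same.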
In an embodiment of the present application, projecting a projection pattern to a specified area according to a first action includes the following steps A1 to A3:
and A1, detecting the illumination intensity of the environment where the target vehicle is located.
In the embodiment of the application, the illumination intensity can be detected by the light sensor. Specifically, the light sensor may sense light in the environment and convert it into an electrical signal. The electrical signal converted by the light sensor is transmitted to the controller for processing. The controller may calculate the value of the illumination intensity from the received electrical signal.
And step A2, determining a target brightness value matched with the illumination intensity.
In the embodiment of the application, the target brightness value corresponding to the measured illumination intensity is determined according to a preset mapping between illumination intensity and brightness value. For example, when the illumination intensity is high, the projection lamp control module raises the illuminance of the projection lamp, and when the illumination intensity is low, it lowers the illuminance; the purpose is to keep the projection pattern clear while also saving energy. The controller then adjusts the brightness of the projection lamp of the target vehicle based on the target brightness value, for example by adjusting the lamp current or the PWM signal.
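The illumination-to-brightness mapping can be sketched as a piecewise-linear function from measured lux to a PWM duty cycle: brighter surroundings demand a brighter projection for contrast, dimmer surroundings allow energy saving. The breakpoints and duty-cycle range below are assumptions, not values from the patent.

```python
# Hypothetical mapping from ambient illuminance (lux) to projector
# brightness, expressed as a PWM duty cycle in [min_duty, max_duty].

def target_brightness(lux, min_duty=0.2, max_duty=1.0,
                      low_lux=10.0, high_lux=1000.0):
    """Clamp below low_lux / above high_lux, interpolate linearly between."""
    if lux <= low_lux:
        return min_duty
    if lux >= high_lux:
        return max_duty
    frac = (lux - low_lux) / (high_lux - low_lux)
    return min_duty + frac * (max_duty - min_duty)
```

The resulting duty cycle would drive the lamp's PWM control, implementing step A2's "target brightness value matched with the illumination intensity".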
And A3, controlling a projection lamp of the target vehicle to project a projection pattern to a specified area according to the target brightness value.
In the embodiment of the application, the controller transmits the projection pattern to the projection lamp after adjusting the brightness of the projection lamp to the target brightness value, and the projection lamp projects the projection pattern to the appointed area according to the target brightness value. The designated area may be a projection range of a projection lamp default setting.
Specifically, controlling the projection lamp of the target vehicle to project the projection pattern to the specified area according to the target brightness value includes: determining a preset projection area based on the designated area; detecting whether a target object affecting the user's stepping exists in the preset projection area; if a target object exists in the preset projection area, scanning other ground areas without the target object, and determining from them a target projection area that can accommodate the projection pattern; and controlling the projection lamp to project the projection pattern to the target projection area according to the target brightness value. If no target object exists in the preset projection area, the projection pattern is projected in the preset projection area.
Wherein, the preset projection area is determined based on the specified area, and the preset projection area may be an area directly under the projection lamp. A suitable sensor is used to detect the presence or absence of objects, such as water accumulation, snow accumulation, etc., within the predetermined projected area. For example: and acquiring a ground image through a camera. And using the trained recognition model to recognize the ground image, and determining whether a target object such as accumulated water, snow and the like exists in the preset projection area.
If a target object exists in the preset projection area, whether it affects the user's stepping can be judged according to its size, height, or other characteristics. If the object does affect stepping, other ground areas not affected by the object need to be scanned; the status of these areas may be detected and recorded by a motion sensor or scanning device. According to the scanning result, a ground area that can accommodate the projection pattern and is not affected by the target object is selected as the target projection area. The brightness of the projection lamp is then adjusted according to the brightness requirement of the target projection area; controlling the brightness parameters of the projection device or projection lamp ensures that the projected pattern is clearly visible there. Finally, the selected projection pattern is projected onto the target projection area by controlling the projection device.
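The area-selection logic above can be sketched as a first-clear-area search that prefers the preset area and falls through to scanned candidates. The area names and the obstruction map are illustrative stand-ins for real sensor or camera output.

```python
# Hypothetical projection-area selection: use the preset area unless an
# obstruction (water, snow, etc.) is detected there, otherwise take the
# first clear candidate; None signals the wall-projection fallback.

def choose_projection_area(preset, candidates, obstructed):
    """Return the first unobstructed area, preferring the preset one.

    obstructed: dict mapping area name -> True if stepping is impeded.
    Returns None when no ground area is usable.
    """
    for area in [preset] + list(candidates):
        if not obstructed.get(area, False):
            return area
    return None
```

Returning `None` cleanly hands control to the wall-projection branch described in the following paragraphs.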
The method provided by the embodiment of the application detects whether a target object affecting the user's stepping exists in the preset projection area, ensuring that the user is not unnecessarily obstructed or injured when stepping on the projection pattern, which improves safety. If an object is present in the preset projection area, a target projection area that can accommodate the projection pattern is determined by scanning other ground areas, avoiding the problem of an incomplete pattern caused by interference from the object. Meanwhile, the projection lamp is controlled to project the pattern at the target brightness value according to the brightness requirement of the target projection area, ensuring that the projected pattern is clearly visible and improving the user experience and visual effect.
In the embodiment of the application, the method further comprises the following steps: if the other areas do not have the target projection area meeting the projection pattern, acquiring environment data of the target vehicle; determining a wall closest to a tail gate of the vehicle based on the environmental data; and controlling the projection lamp to project a projection pattern to the wall body according to the target brightness value.
Specifically, if the other area does not have the target projection area satisfying the projection pattern, the environmental data in which the target vehicle is located may be acquired. By acquiring the environmental data of the target vehicle, the information of the surrounding environment, such as objects, walls, etc., can be acquired. Based on the environmental data, a wall closest to the tail gate of the target vehicle may be determined. And controlling the projection lamp to project a projection pattern to the wall body according to the target brightness value. By adjusting the brightness of the projection lamp, it is ensured that the projected pattern is clearly visible on the wall.
According to the method provided by the embodiment of the application, under the condition that other areas cannot meet the projection pattern, the nearest wall body is selected as the projection area according to the environmental data of the target vehicle, so that the projected pattern is ensured to be complete, clear and visible. This may provide a better user experience and ensure an efficient presentation of the projected pattern.
Step S23, detecting a second action of the user on the projection pattern.
In the embodiment of the application, a camera is provided on the vehicle tail gate side of the target vehicle so as to fully capture the area of the projection pattern and the user's second action. The second action may be a stepping action; images are captured by the camera and then processed using computer vision techniques, including background modeling, target detection, and tracking algorithms, to extract and identify the projection pattern area and the user's foot motion from the images.
Through the results of the image processing, a model may be trained to recognize the second action of the user using machine learning or deep learning methods. This may be matched and classified with a predefined second action by extracting key features such as the position, trajectory, shape, etc. of the foot. When the second action of the user is identified and classified, a corresponding control may be triggered.
Step S24, a tail gate control command is generated based on the second action, and the opening or closing of the vehicle tail gate of the target vehicle is controlled according to the control command.
In the embodiment of the application, a corresponding tail gate control instruction is generated according to the second action. For example, when the user finishes stepping, an instruction to open the tail gate is generated, or an instruction to close the tail gate is generated. And sending the generated control instruction to a tail gate of the vehicle. After receiving the control command, the vehicle tail gate executes corresponding actions, namely opening or closing, according to the command.
According to the method provided by the embodiment of the application, by detecting the first action generated by the user in the preset sensing area of the target vehicle, the user's operation intention can be accurately sensed, and misoperation or false triggering is avoided. Projecting the projection pattern to the designated area provides an intuitive prompt and a distinctive operating experience for the user. Then, by detecting the second action of the user on the projection pattern, the tail gate control instruction can be generated promptly. In this way, an interactive relation is established between the user and the vehicle through the projection pattern and the second-action design; the user can accurately control the tail gate by stepping on the pattern, which increases the user's interactivity and engagement.
In an embodiment of the present application, after detecting the second action on the projection pattern, the method further comprises: detecting the execution speed corresponding to the second action; acquiring audio data corresponding to the execution speed; the audio device of the target vehicle is controlled to play the audio data.
Specifically, the user's second action is monitored by a sensor or camera, and the movement speed of the foot is measured; the execution speed may be calculated using motion-capture techniques or by computing the displacement of an object between successive image frames. Corresponding audio data is set according to the user's execution speed: a set of audio data may be predefined and classified or matched according to different execution speeds. For example, the execution speed is judged by calculating the duration of the second action; a shorter duration indicates a faster second action, and a longer duration a slower one. Different music is then selected for playback according to the execution speed: a set of fast-paced and slow-paced music may be prepared in advance, with fast-paced music played when the execution speed is high and slow-paced music played when it is low.
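The speed-to-audio matching described above can be sketched as a simple duration cutoff. The cutoff value and track names are assumptions invented for the sketch.

```python
# Illustrative tempo selection: quick steps (short action duration) get
# fast-tempo audio, slow steps get slow-tempo audio.

FAST_TRACK = "fast_tempo.wav"   # assumed file name
SLOW_TRACK = "slow_tempo.wav"   # assumed file name
TEMPO_CUTOFF_S = 0.8            # assumed duration cutoff in seconds

def select_audio(action_duration_s, cutoff=TEMPO_CUTOFF_S):
    """Choose the audio clip matching the user's stepping speed."""
    return FAST_TRACK if action_duration_s < cutoff else SLOW_TRACK
```

The selected clip would then be streamed to the vehicle's audio device over Bluetooth, Wi-Fi, or a wired connection, as the next paragraph describes.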
The audio data corresponding to the execution speed is transmitted to the audio device of the target vehicle over a connection to that device. The audio data may be transmitted using a wireless technique such as Bluetooth or Wi-Fi, or over a wired connection. The audio device of the target vehicle receives, decodes, and plays the audio data, so that the user hears audio matching his or her execution speed.
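The duration-to-tempo mapping described above can be sketched as follows. The track file names, the threshold value, and the `select_track` helper are all illustrative assumptions, not part of the patent:

```python
FAST_TRACKS = ["fast_1.mp3", "fast_2.mp3"]   # hypothetical fast-tempo tracks
SLOW_TRACKS = ["slow_1.mp3", "slow_2.mp3"]   # hypothetical slow-tempo tracks

def select_track(action_duration_s, threshold_s=0.5):
    """Pick a track whose tempo matches the stepping speed.

    A shorter duration of the second action means a faster action,
    so a fast-tempo track is chosen; otherwise a slow-tempo track.
    """
    tracks = FAST_TRACKS if action_duration_s < threshold_s else SLOW_TRACKS
    return tracks[0]
```

In practice the threshold would be calibrated against real stepping data rather than fixed at an arbitrary value.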
According to the method provided by the embodiment of the application, by detecting the execution speed corresponding to the second action, acquiring the audio data corresponding to that speed, and controlling the audio device of the target vehicle to play it, an interactive system that plays audio according to the user's execution speed can be realized. Such a system provides an audio experience matched to the user's actions, enhancing interactivity and entertainment. The specific implementation may be adjusted to the application scenario and requirements.
This embodiment also provides a control device for a vehicle tail gate, which is used to implement the foregoing embodiments and preferred implementations; what has already been described is not repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the devices described in the following embodiments are preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
The present embodiment provides a control device for a tail gate of a vehicle, as shown in fig. 4, including:
a first detection module 41, configured to detect a first action of a user in a preset sensing area of the target vehicle;
a trigger module 42, configured to project a projection pattern to a specified area according to the first action;
a second detection module 43, configured to detect a second action of the user on the projection pattern;
a control module 44, configured to generate a tail gate control instruction based on the second action, and to control opening or closing of a vehicle tail gate of the target vehicle according to the control instruction.
In an embodiment of the present application, the triggering module 42 includes:
a detection unit, configured to detect the illumination intensity of the environment in which the target vehicle is located;
an adjusting unit, configured to determine a target brightness value matched with the illumination intensity;
a control unit, configured to control the projection lamp of the target vehicle to project a projection pattern to the designated area according to the target brightness value.
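One way to realize "a target brightness value matched with the illumination intensity" is a clamped linear mapping from ambient illuminance to projector brightness: the brighter the surroundings, the brighter the projection must be to stay visible. The function below is an illustrative sketch under assumed units (lux in, percent out), not the patent's formula:

```python
def target_brightness(ambient_lux, min_pct=20, max_pct=100, lux_range=(0, 10_000)):
    """Map ambient illuminance (lux) to a projector brightness percentage.

    The mapping is linear over an assumed lux range and clamped at the
    ends, so darkness yields the minimum brightness and full daylight
    yields the maximum.
    """
    lo, hi = lux_range
    frac = min(max((ambient_lux - lo) / (hi - lo), 0.0), 1.0)
    return round(min_pct + frac * (max_pct - min_pct))
```

A production system would likely use a calibrated lookup table from the light sensor instead of a single linear segment.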
In the embodiment of the application, the device further comprises: the generation unit is used for acquiring the current geographic position of the target vehicle; acquiring landmark information or scenic spot information associated with a geographic position; a projection pattern is generated based on landmark information or sight spot information.
In the embodiment of the application, the generation unit is further configured to acquire the user's interest preferences and to obtain, from a pattern library, projection images associated with those preferences.
In the embodiment of the application, the control unit is configured to determine a preset projection area based on the designated area; detect whether a target object that would interfere with the user's stepping exists in the preset projection area; if such a target object exists, scan other areas of the ground where no target object exists and determine, from those areas, a target projection area that can accommodate the projection pattern; and control the projection lamp to project the projection pattern to the target projection area according to the target brightness value.
In the embodiment of the application, the control unit is further configured to: if no target projection area accommodating the projection pattern exists in the other areas, acquire environmental data of the target vehicle; determine the wall closest to the vehicle tail gate based on the environmental data; and control the projection lamp to project the projection pattern onto that wall according to the target brightness value.
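The fallback order just described (preset area, then another clear ground area large enough for the pattern, then the wall nearest the tail gate) can be sketched as follows. The area representation and function name are assumptions for illustration:

```python
def choose_projection_target(preset_clear, candidate_areas, pattern_size):
    """Select the projection target following the fallback order:
    preset area -> another clear ground area that fits the pattern
    -> the wall nearest the tail gate.

    `candidate_areas` is a list of dicts like
    {"name": str, "clear": bool, "size": float} (a simplifying assumption).
    """
    if preset_clear:
        return ("ground", "preset")
    for area in candidate_areas:
        if area["clear"] and area["size"] >= pattern_size:
            return ("ground", area["name"])
    return ("wall", "nearest")
```

Real obstacle detection would work on geometric regions from a depth sensor or camera rather than boolean flags, but the decision order is the same.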
In the embodiment of the application, the device further comprises: the processing module is used for detecting the execution speed corresponding to the second action; acquiring audio data corresponding to the execution speed; the audio device of the target vehicle is controlled to play the audio data.
Fig. 5 shows a schematic structural diagram of an embodiment of the electronic device of the present invention. As shown in fig. 5, the electronic device may include: a processor 402, a communication interface (Communications Interface) 404, a memory 406, and a communication bus 408.
The processor 402, the communication interface 404, and the memory 406 communicate with one another via the communication bus 408. The communication interface 404 is used for communicating with network elements of other devices, such as clients or other servers. The processor 402 is configured to execute a program 410, and may specifically perform the relevant steps of the method embodiments described above.
In particular, the program 410 may include program code, and the program code includes computer-executable instructions.
The processor 402 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention. The one or more processors comprised in the device may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
The memory 406 is used for storing the program 410. The memory 406 may comprise high-speed RAM, and may also include non-volatile memory, such as at least one disk memory.
The program 410 may specifically be invoked by the processor 402 to cause the electronic device to:
Detecting a first action of a user in a preset sensing area of a target vehicle; projecting a projection pattern to a designated area according to a first action; detecting a second action of the user on the projection pattern; and generating a tail gate control instruction based on the second action, and controlling the opening or closing of a vehicle tail gate of the target vehicle according to the control instruction.
In an alternative manner, projecting the projection pattern to the specified area according to the first action includes: detecting the illumination intensity of the environment in which the target vehicle is located; determining a target brightness value matched with the illumination intensity; and controlling the projection lamp of the target vehicle to project the projection pattern to the specified area according to the target brightness value.
In an alternative way, before projecting the projection pattern to the specified area according to the first action, the method further comprises: acquiring the current geographic position of a target vehicle; acquiring landmark information or scenic spot information associated with a geographic position; a projection pattern is generated based on landmark information or sight spot information.
In an alternative way, before projecting the projection pattern to the specified area according to the first action, the method further comprises: acquiring interest preference of a user; projection images associated with interest preferences are obtained from a pattern library.
In an alternative manner, controlling a projection lamp of a target vehicle to project a projection pattern to a specified area in accordance with a target brightness value includes:
Determining a preset projection area based on the designated area; detecting whether a target object that would interfere with the user's stepping exists in the preset projection area; if such a target object exists, scanning other areas of the ground where no target object exists and determining, from those areas, a target projection area that can accommodate the projection pattern; and controlling the projection lamp to project the projection pattern to the target projection area according to the target brightness value.
In an alternative manner, the method further comprises:
If no target projection area accommodating the projection pattern exists in the other areas, acquiring environmental data of the target vehicle; determining the wall closest to the vehicle tail gate based on the environmental data; and controlling the projection lamp to project the projection pattern onto that wall according to the target brightness value.
In an alternative way, after detecting the second action on the projected pattern, the method further comprises: detecting the execution speed corresponding to the second action; acquiring audio data corresponding to the execution speed; the audio device of the target vehicle is controlled to play the audio data.
An embodiment of the present invention provides a computer readable storage medium, where at least one executable instruction is stored, where the executable instruction when executed on a control device of a vehicle tail gate causes the control device of the vehicle tail gate to perform a method according to any of the above method embodiments.
The executable instructions may specifically be used to cause the device to:
Detecting a first action of a user in a preset sensing area of a target vehicle; projecting a projection pattern to a designated area according to a first action; detecting a second action of the user on the projection pattern; and generating a tail gate control instruction based on the second action, and controlling the opening or closing of a vehicle tail gate of the target vehicle according to the control instruction.
In an alternative manner, projecting the projection pattern to the specified area according to the first action includes: detecting the illumination intensity of the environment in which the target vehicle is located; determining a target brightness value matched with the illumination intensity; and controlling the projection lamp of the target vehicle to project the projection pattern to the specified area according to the target brightness value.
In an alternative way, before projecting the projection pattern to the specified area according to the first action, the method further comprises: acquiring the current geographic position of a target vehicle; acquiring landmark information or scenic spot information associated with a geographic position; a projection pattern is generated based on landmark information or sight spot information.
In an alternative way, before projecting the projection pattern to the specified area according to the first action, the method further comprises: acquiring interest preference of a user; projection images associated with interest preferences are obtained from a pattern library.
In an alternative manner, controlling a projection lamp of a target vehicle to project a projection pattern to a specified area in accordance with a target brightness value includes:
Determining a preset projection area based on the designated area; detecting whether a target object that would interfere with the user's stepping exists in the preset projection area; if such a target object exists, scanning other areas of the ground where no target object exists and determining, from those areas, a target projection area that can accommodate the projection pattern; and controlling the projection lamp to project the projection pattern to the target projection area according to the target brightness value.
In an alternative manner, the method further comprises:
If no target projection area accommodating the projection pattern exists in the other areas, acquiring environmental data of the target vehicle; determining the wall closest to the vehicle tail gate based on the environmental data; and controlling the projection lamp to project the projection pattern onto that wall according to the target brightness value.
In an alternative way, after detecting the second action on the projected pattern, the method further comprises: detecting the execution speed corresponding to the second action; acquiring audio data corresponding to the execution speed; the audio device of the target vehicle is controlled to play the audio data.
The algorithms and displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Moreover, embodiments of the present invention are not directed to any particular programming language.
In the description provided herein, numerous specific details are set forth. It will be appreciated, however, that embodiments of the invention may be practiced without such specific details. Similarly, in the above description of exemplary embodiments of the invention, various features of the embodiments are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. The claims following this detailed description are hereby expressly incorporated into the detailed description, with each claim standing on its own as a separate embodiment of the invention.
Those skilled in the art will appreciate that the modules in the apparatus of an embodiment may be adaptively changed and disposed in one or more apparatuses different from that embodiment. The modules, units, or components of the embodiments may be combined into one module, unit, or component, and may furthermore be divided into a plurality of sub-modules, sub-units, or sub-components, except where at least some of such features and/or processes or units are mutually exclusive.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names. The steps in the above embodiments should not be construed as limiting the order of execution unless specifically stated.

Claims (10)

1. A method of controlling a vehicle tailgate, the method comprising:
Detecting a first action of a user in a preset sensing area of a target vehicle;
Projecting a projection pattern to a designated area according to the first action;
Detecting a second action of a user on the projection pattern;
And generating a tail gate control instruction based on the second action, and controlling the opening or closing of a vehicle tail gate of the target vehicle according to the control instruction.
2. The method of claim 1, wherein projecting a projected pattern to a designated area according to the first action comprises:
Detecting the illumination intensity of the environment where the target vehicle is located;
determining a target brightness value matched with the illumination intensity;
and controlling a projection lamp of the target vehicle to project the projection pattern to a designated area according to the target brightness value.
3. The method of claim 1, wherein prior to projecting a projected pattern to a designated area in accordance with the first action, the method further comprises:
acquiring the current geographic position of the target vehicle;
acquiring landmark information or scenic spot information associated with the geographic position;
The projection pattern is generated based on the landmark information or the sight spot information.
4. The method of claim 1, wherein prior to projecting a projected pattern to a designated area in accordance with the first action, the method further comprises:
acquiring interest preferences of the user;
And obtaining projection images related to the interest preference from a pattern library.
5. The method of claim 2, wherein controlling the projection lamp of the target vehicle to project the projection pattern to a specified area in accordance with the target brightness value comprises:
Determining a preset projection area based on the designated area;
detecting whether a target object that would interfere with the user's stepping exists in the preset projection area;
if the target object exists in the preset projection area, scanning other areas of the designated area where the target object does not exist, and determining, from the other areas, a target projection area that can accommodate the projection pattern;
And controlling the projection lamp to project the projection pattern to the target projection area according to the target brightness value.
6. The method of claim 5, wherein the method further comprises:
If the other areas do not have the target projection area meeting the projection pattern, acquiring environment data of the target vehicle;
determining a wall closest to the vehicle tail gate based on the environmental data;
And controlling the projection lamp to project the projection pattern to the wall body according to the target brightness value.
7. The method of claim 1, wherein after detecting the second action on the projected pattern, the method further comprises:
Detecting the execution speed corresponding to the second action;
acquiring audio data corresponding to the execution speed;
And controlling the audio equipment of the target vehicle to play the audio data.
8. A control device for a vehicle tailgate, the device comprising:
The first detection module is used for detecting a first action of a user in a preset sensing area of the target vehicle;
The triggering module is used for projecting a projection pattern to a specified area according to the first action;
a second detection module for detecting a second action of a user on the projection pattern;
And the control module is used for generating a tail gate control instruction based on the second action and controlling the opening or closing of a vehicle tail gate of the target vehicle according to the control instruction.
9. An electronic device, comprising: the device comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete communication with each other through the communication bus;
The memory is configured to store at least one executable instruction that causes the processor to perform the operations of the method of any one of claims 1-7.
10. A computer readable storage medium, characterized in that at least one executable instruction is stored in the storage medium, which executable instruction, when run on a control device of a vehicle tailgate, causes the control device of a vehicle tailgate to perform the operations of the method according to any one of claims 1-7.
CN202410321379.XA 2024-03-20 2024-03-20 Control method and device for vehicle tail gate, electronic equipment and storage medium Pending CN118087999A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410321379.XA CN118087999A (en) 2024-03-20 2024-03-20 Control method and device for vehicle tail gate, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN118087999A true CN118087999A (en) 2024-05-28

Family

ID=91142149

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410321379.XA Pending CN118087999A (en) 2024-03-20 2024-03-20 Control method and device for vehicle tail gate, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN118087999A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination