CN115214535A - Vehicle control method, device, system and medium - Google Patents

Vehicle control method, device, system and medium

Info

Publication number
CN115214535A
Authority
CN
China
Prior art keywords
vehicle
target
projection area
vehicle control
projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210827842.9A
Other languages
Chinese (zh)
Inventor
黄坚材
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Geely Holding Group Co Ltd
Zhejiang Zeekr Intelligent Technology Co Ltd
Original Assignee
Zhejiang Geely Holding Group Co Ltd
Zhejiang Zeekr Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Geely Holding Group Co Ltd, Zhejiang Zeekr Intelligent Technology Co Ltd filed Critical Zhejiang Geely Holding Group Co Ltd
Priority to CN202210827842.9A priority Critical patent/CN115214535A/en
Publication of CN115214535A publication Critical patent/CN115214535A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/20Means to switch the anti-theft system on or off
    • B60R25/2018Central base unlocks or authorises unlocking
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/10Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • H04W4/40Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)

Abstract

The application discloses a vehicle control method, device, system and medium. The method includes: generating a first vehicle control instruction when a wake-up signal for a vehicle is detected; controlling a target projection assembly to project at least one projection area based on the first vehicle control instruction; generating a second vehicle control instruction in response to a selection operation instruction of a target object for a target projection area; and controlling the vehicle component corresponding to the target projection area to execute a target operation based on the second vehicle control instruction. In this way, the operation the vehicle is to execute is identified accurately, misoperation is reduced, and the accuracy of vehicle control is improved.

Description

Vehicle control method, device, system and medium
Technical Field
The present application relates to the field of vehicle technologies, and in particular, to a method, an apparatus, a system, and a medium for controlling a vehicle.
Background
With the increasing intelligence of vehicles, people place higher demands on the operating performance of vehicles.
In the related art, some communication technologies make vehicles easier to operate by automatically triggering the related vehicle control operations. However, because the signal recognition range of these technologies is generally coarse, misoperation is easily caused and the accuracy of vehicle control is low.
Disclosure of Invention
In view of the above, the present application aims to provide a vehicle control method, apparatus, system and medium to solve at least one of the above technical problems. The technical solution is as follows:
in one aspect, the present application provides a vehicle control method, including:
generating a first vehicle control instruction if a wake-up signal for the vehicle is detected; the first vehicle control command is used for starting at least one projection assembly installed on the vehicle;
controlling a target projection assembly to project at least one projection area based on the first vehicle control instruction; the target projection assembly is at least one of the projection assemblies;
generating a second vehicle control instruction in response to the selection operation instruction of the target object to the target projection area; the target projection area is at least one of the projection areas;
and controlling the vehicle component corresponding to the target projection area to execute the target operation based on the second vehicle control command.
In an optional embodiment, before the step of generating the second vehicle control instruction in response to the selection operation instruction of the target object for the target projection area, the method further includes:
detecting position posture data between the target object and each projected projection area;
and determining at least one projection area meeting preset pose conditions as a selected target projection area based on the detection result, and generating a selection operation instruction corresponding to the target projection area.
In an optional embodiment, the position and posture data includes a position distance used to characterize the spacing between the target object and the center point of each projection region;
the determining, based on the detection result, at least one of the projection areas meeting the preset pose condition as the selected target projection area comprises:
and when the detection result indicates that the position distance in the position posture data is smaller than or equal to a preset distance value, determining that the corresponding projection area meets a preset pose condition, and taking at least one determined projection area as a selected target projection area.
In an optional embodiment, the controlling, based on the second vehicle control instruction, the vehicle component corresponding to the target projection area to perform the target operation includes:
based on the second vehicle control instruction, controlling a projection component corresponding to the target projection area to be switched from a first projection state to a second projection state, wherein the second projection state is used for indicating that control response information is displayed in the target projection area;
and controlling the vehicle component corresponding to the target projection area to execute target operation when the preset starting condition is determined to be met.
In an optional embodiment, the control response message includes at least one of a dynamically displayed time response message and a dynamically changed graphic response message;
before the step of, when the preset starting condition is determined to be met, controlling the vehicle component corresponding to the target projection area to execute the target operation, the method further includes:
if it is detected that the time value corresponding to the time response information meets a preset time threshold, determining that the preset starting condition is met; alternatively,
if the current display pattern of the graphic response information is detected to be a preset pattern, determining that the preset starting condition is met.
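The two start conditions above (a time value reaching a preset threshold, or a displayed pattern reaching a preset pattern) can be sketched as a simple either/or check. This is an illustrative sketch only, not text from the patent; the threshold value and pattern identifier are assumed names.

```python
# Illustrative sketch of the "preset starting condition" check described above.
# TIME_THRESHOLD_S and TARGET_PATTERN are assumptions for illustration only.

TIME_THRESHOLD_S = 3.0          # assumed time threshold for the time response info
TARGET_PATTERN = "ring_closed"  # assumed identifier of the fully displayed pattern


def start_condition_met(elapsed_s=None, current_pattern=None):
    """Return True when either the time-based or the pattern-based
    preset starting condition is satisfied."""
    if elapsed_s is not None and elapsed_s >= TIME_THRESHOLD_S:
        return True
    if current_pattern is not None and current_pattern == TARGET_PATTERN:
        return True
    return False
```

Either branch alone is sufficient, matching the "alternatively" wording of the embodiment.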
In an optional embodiment, the controlling the vehicle component corresponding to the target projection area to perform the target operation includes:
controlling a vehicle part corresponding to the target projection area to be switched from a closed state to an open state;
wherein the vehicle component comprises at least one of a door, a window, and a trunk lid.
In an optional embodiment, the generating the first vehicle control instruction in case of detecting the wake-up signal for the vehicle comprises:
detecting whether a wake-up signal exists in an environment within a preset distance from the vehicle in real time;
under the condition that a wake-up signal aiming at the vehicle is detected, carrying out identity verification on target equipment corresponding to the wake-up signal;
and after the identity authentication is legal, establishing communication connection between the vehicle and the target equipment, and generating a first vehicle control instruction.
In another aspect, the present application also provides a vehicle control apparatus including:
the first instruction generation module is used for generating a first vehicle control instruction under the condition that a wake-up signal aiming at the vehicle is detected; the first vehicle control command is used to activate at least one projection assembly mounted on the vehicle;
the first control module is used for controlling the target projection assembly to project at least one projection area based on the first vehicle control instruction; the target projection assembly is at least one of the projection assemblies;
the second instruction generation module is used for responding to a selection operation instruction of the target object to the target projection area and generating a second vehicle control instruction; the target projection area is at least one of the projection areas;
and the second control module is used for controlling the vehicle component corresponding to the target projection area to execute the target operation based on the second vehicle control instruction.
In an alternative embodiment, the apparatus further comprises:
the pose detection module is used for detecting position and attitude data between the target object and each projected projection area;
and the operation instruction generation module is used for determining at least one projection area meeting a preset pose condition as a selected target projection area based on the detection result and generating a selection operation instruction corresponding to the target projection area.
In an alternative embodiment, the position and posture data includes a position distance characterizing the spacing between the target object and the center point of each projection region. The second instruction generation module is specifically configured to:
when the detection result indicates that the position distance in the position posture data is less than or equal to a preset distance value, determine that the corresponding projection area meets the preset pose condition, and take the determined at least one projection area as the selected target projection area.
In an alternative embodiment, the second control module comprises:
the first control submodule is used for controlling the projection component corresponding to the target projection area to be switched from a first projection state to a second projection state based on the second vehicle control instruction, and the second projection state is used for indicating that control response information is displayed in the target projection area;
and the second control submodule is used for controlling the vehicle component corresponding to the target projection area to execute target operation when the preset starting condition is determined to be met.
In an optional embodiment, the control response message includes at least one of a time response message that is dynamically displayed and a graphical response message that is dynamically changed. The second control module further comprises:
the first detection submodule is used for determining that the preset starting condition is met if it is detected that the time value corresponding to the time response information meets a preset time threshold; alternatively,
and the second detection submodule is used for determining that a preset starting condition is met if the current display pattern of the graphical response information is detected to be a preset pattern.
In an optional implementation manner, the second control module is specifically configured to:
controlling a vehicle component corresponding to the target projection area to be switched from a closed state to an open state;
wherein the vehicle component comprises at least one of a door, a window, and a trunk lid.
In an optional implementation manner, the first instruction generating module is specifically configured to:
detecting whether a wake-up signal exists in an environment within a preset distance from the vehicle in real time;
when the wake-up signal aiming at the vehicle is detected, carrying out identity verification on target equipment corresponding to the wake-up signal;
and after the identity authentication is legal, establishing communication connection between the vehicle and the target equipment, and generating a first vehicle control command.
In another aspect, the present application further provides a vehicle control system, including a vehicle and a target device communicatively connected to the vehicle; the vehicle comprises a processor and a memory, wherein at least one instruction or at least one program is stored in the memory, and the at least one instruction or the at least one program is loaded by the processor and executed to realize the vehicle control method.
In another aspect, the present application further provides a computer device, which includes a processor and a memory, where at least one instruction or at least one program is stored in the memory, and the at least one instruction or the at least one program is loaded by the processor and executed to implement the vehicle control method according to any one of the above descriptions.
In another aspect, the present application further provides a computer-readable storage medium, in which at least one instruction or at least one program is stored, and the at least one instruction or the at least one program is loaded and executed by a processor to implement the vehicle control method according to any one of the above-mentioned embodiments.
According to the vehicle control method, device, system and medium provided herein, a first vehicle control instruction is generated when a wake-up signal for the vehicle is detected; the first vehicle control instruction is used to start at least one projection assembly installed on the vehicle; based on the first vehicle control instruction, a target projection assembly, which is at least one of the projection assemblies, is controlled to project at least one projection area; a second vehicle control instruction is generated in response to a selection operation instruction of a target object for a target projection area, the target projection area being at least one of the projection areas; and based on the second vehicle control instruction, the vehicle component corresponding to the target projection area is controlled to execute a target operation. By means of the projection areas projected by the projection assembly installed on the vehicle, and by generating the second vehicle control instruction only in response to a selection operation instruction for the target projection area, the operation the vehicle is to execute is identified accurately, misoperation is reduced, and the accuracy of vehicle control is improved; at the same time, the interactivity of vehicle operation for the target object and the visualization of the interaction process are increased, optimizing vehicle intelligence.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
To illustrate the technical solutions and advantages of the embodiments of the present application or the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic diagram of an implementation environment of a vehicle control method according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart diagram of a vehicle control method provided by an embodiment of the present application;
FIG. 3 is a schematic flow chart illustrating a portion of a vehicle control method according to an embodiment of the present disclosure;
FIG. 4 is a partial schematic flow chart diagram of a vehicle control method provided by an embodiment of the present application;
FIG. 5 is a process diagram of a vehicle control method provided by an embodiment of the present application;
fig. 6 is a block diagram of a vehicle control device provided in an embodiment of the present application;
fig. 7 is a block diagram of another vehicle control device provided in the embodiment of the present application.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and, together with the description, serve to explain the principles of the application.
Detailed Description
To make the objects, technical solutions and advantages of the present application clearer, the present application is further described in detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art based on the embodiments herein without creative effort fall within the protection scope of the present application.
Reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic may be included in at least one implementation of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the accompanying drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or server that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
A vehicle control method, a device, a system, and a medium according to embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Referring to fig. 1, a schematic diagram of an implementation environment provided by an embodiment of the present application is shown. The vehicle control method provided herein can be applied to the vehicle control system 100 shown in fig. 1. The vehicle control system 100 includes a vehicle 110 and a target device 120. The vehicle 110 and the target device 120 communicate wirelessly through a first communication module (not shown) and a second communication module (not shown) of the same type mounted on each, respectively; the vehicle 110 is further fitted with a projector (not shown), and vehicle control interaction for vehicle components is performed through the projector and the target device 120. The vehicle 110 includes, but is not limited to, a fuel vehicle, a new energy vehicle, a dual-drive vehicle, etc., and the target device 120 includes, but is not limited to, various vehicle keys, smart wearable devices, smartphones, tablet computers, etc.
Fig. 2 is a flowchart of a vehicle control method provided in an embodiment of the present application. Taking the application of the method to the vehicle control system as an example, the method may be implemented by software and/or hardware, and referring to fig. 2, the method may include:
s201, generating a first vehicle control instruction under the condition that a wake-up signal aiming at a vehicle is detected; the first vehicle control command is used to activate at least one projection assembly mounted on the vehicle.
The wake-up signal refers to a signal for waking up the vehicle to perform a corresponding operation. The wake-up signal includes but is not limited to being implemented based on bluetooth technology, UWB (Ultra Wide Band) technology, and the like.
The projection assembly may include, but is not limited to, at least one of a DLP (Digital Light Processing) projection, a microlens, and the like.
Optionally, the target device may be triggered to send a wake-up signal to the surrounding environment, and when the vehicle detects the wake-up signal, a first vehicle control instruction is generated to start at least one projection component mounted on the vehicle, so as to implement corresponding vehicle control. The number of the projection assemblies may be one or more, and the projection assemblies may be disposed at least one position of a side of a vehicle, a side sill, a mirror housing, a roof of a vehicle body, a rear of a vehicle, and the like.
In an optional embodiment, the generating the first vehicle control instruction in case of detecting the wake-up signal for the vehicle comprises:
s2011, detecting whether an awakening signal exists in an environment within a preset distance away from the vehicle in real time;
s2013, under the condition that the wake-up signal for the vehicle is detected, performing identity verification on target equipment corresponding to the wake-up signal;
s2015, after the identity authentication is legal, establishing communication connection between the vehicle and the target device, and generating a first vehicle control command.
Optionally, the vehicle detects in real time whether a wake-up signal exists in the environment within a preset distance of the vehicle. For example, this environment may be a monitoring and recognition area 5 to 10 m around the vehicle; when the target object carrying the target device enters the area, the vehicle can be woken up. If no wake-up signal for the vehicle is detected, the surrounding environment continues to be monitored. If a wake-up signal for the vehicle is detected, the identity of the target device that initiated the wake-up signal is verified. If the identity authentication information of the target device matches the legal identity authentication information held by the vehicle, the identity of the target device is determined to be legal. After the identity authentication passes, a communication connection between the vehicle and the target device is established, and a first vehicle control instruction is generated to start at least one projection assembly installed on the vehicle, so as to implement the corresponding vehicle control.
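Steps S2011 to S2015 can be sketched as follows. This is a minimal illustration assuming a simple credential-matching scheme; all names (`WakeUpSignal`, `handle_wakeup`, `MONITOR_RANGE_M`) are hypothetical and not from the patent.

```python
# Illustrative sketch of S2011-S2015: range check, identity verification,
# then generation of the first vehicle control instruction.

from dataclasses import dataclass

MONITOR_RANGE_M = 10.0  # patent example: a 5-10 m monitoring/recognition area


@dataclass
class WakeUpSignal:
    device_id: str
    credential: str
    distance_m: float


def handle_wakeup(signal, authorized):
    """Return a first vehicle control instruction, or None if the signal is
    out of range or the originating device fails identity verification."""
    if signal.distance_m > MONITOR_RANGE_M:
        return None                      # keep monitoring the surroundings
    if authorized.get(signal.device_id) != signal.credential:
        return None                      # identity verification failed
    # a communication connection with the target device would be set up here
    return "FIRST_VEHICLE_CONTROL_INSTRUCTION"
```

Only a signal that is both in range and from a verified device yields the instruction, mirroring the "continue monitoring" branch in the text.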
S203, controlling the target projection component to project at least one projection area based on the first vehicle control instruction; the target projection assembly is at least one of the projection assemblies.
Wherein the projection area may be a spot formed by projection onto the projection carrier. The projection carrier includes, but is not limited to, the ground, a wall, a window, other objects, and the like. Each projection area can display one light spot, and can also display a plurality of light spots, for example, 2, 3, 4, 5, and the like, and the plurality of light spots can be arranged randomly, or can be arranged in a matrix, in a ring, in other specific layout manners, and the like. The shape of each spot may include, but is not limited to, circular, square, triangular, other regular shapes, or irregular shapes.
Optionally, the vehicle controls a target projection assembly mounted on the vehicle to project at least one projection area on a corresponding preset projection carrier based on the first vehicle control instruction, so as to form a plurality of light spots. The target projection assembly is at least one of at least one projection assembly mounted on the host vehicle.
For example, if the number of light spots in each projection area projected by each projection assembly is only one, the target projection assembly refers to a plurality of projection assemblies, and a plurality of light spots can be formed by the target projection assembly. If the number of the light spots projected by each projection component in each projection area can be multiple, the target projection component refers to one or more projection components, and the target projection component can also form multiple light spots.
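The relationship described above, where single-spot components must be combined to form multiple spots while one multi-spot component may suffice alone, could be sketched as follows. The data shapes and function name are illustrative assumptions, not part of the patent.

```python
# Illustrative sketch: choosing which projection components make up the
# "target projection assembly" for a desired number of light spots.


def select_target_assembly(components, spots_needed):
    """Pick component ids until their spot counts cover spots_needed.
    components maps component id -> number of spots it can project."""
    chosen, total = [], 0
    for cid, n_spots in components.items():
        if total >= spots_needed:
            break
        chosen.append(cid)
        total += n_spots
    return chosen
```

With three single-spot components, two are chosen to form two spots; a single four-spot component covers the same request on its own.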
S205, responding to the selection operation instruction of the target object to the target projection area, and generating a second vehicle control instruction; the target projection area is at least one of the projection areas.
Optionally, the target object may perform a selection operation on projection areas projected by the target projection assembly, each projection area corresponds to a human-vehicle interaction area, and each projection area is used for indicating activation of a vehicle component bound to the projection area. And under the condition that the target projection area is determined, generating a selection operation instruction corresponding to the target projection area. At this time, the vehicle may generate, in response to the selection operation instruction, a second vehicle control instruction for the vehicle component bound to the target projection area, the second vehicle control instruction being for controlling the vehicle to perform a corresponding control operation.
In an alternative embodiment, as shown in fig. 3, before the step of generating the second vehicle control instruction in response to the selection operation instruction of the target object for the target projection area, the method further includes:
and S301, detecting position and posture data between the target object and each projected projection area.
The position and posture data can represent the position and posture relation between the target object and the projected projection area. The position and orientation data may include at least one of relative position data and relative orientation data. Wherein the relative position data may comprise a position separation between the target object and a center point of each of the projection regions. The relative pose data may be a pose of a limb portion of the target object relative to the vehicle. The target object may include at least a user object carrying a target device in communication with the vehicle, which may include, but is not limited to, various vehicle keys, smart wearable devices (e.g., smart band, smart watch, etc.), smartphones, tablets, and the like.
Optionally, a positioning sensor, an infrared detector, a ranging sensor, or the like may be used to detect the position and posture data between the target object and each projected projection area in real time, so as to obtain a detection result of whether the preset pose condition is met. Merely as an example, after the vehicle is woken up, the target device (with an embedded UWB antenna) carried by the user object can be monitored in real time through a UWB antenna embedded at the vehicle end; the position and posture data of the user object can then be obtained through engineering calculation, yielding a detection result of whether the preset pose condition is met. The preset pose condition may include at least one of a preset position condition and a preset posture condition.
And S303, determining at least one projection area meeting the preset pose condition as a selected target projection area based on the detection result, and generating a selection operation instruction corresponding to the target projection area.
Optionally, the vehicle may determine, according to the detection result, at least one projection area that meets the preset position condition or the preset posture condition as the selected target projection area, and then generate the selection operation instruction corresponding to that target projection area. If the detection result contains results for both the preset position condition and the preset posture condition, the result with the higher priority may be taken as the final detection result. For example, the preset posture condition may take priority over the preset position condition, or the user object carrying the target device may take priority over other user objects.
In an optional embodiment, where the preset pose condition is a preset position condition, the position and posture data includes a position distance characterizing the spacing between the target object and the center point of each projection region. The center point may be the midpoint of a light spot in the projection area. Determining, based on the detection result, at least one projection area meeting the preset pose condition as the selected target projection area includes:
when the detection result indicates that the position distance in the position posture data is less than or equal to a preset distance value, determining that the corresponding projection area meets the preset pose condition, and taking the determined at least one projection area as the selected target projection area.
Optionally, if the detection result indicates that the position distance in the position posture data is smaller than or equal to the preset distance value, the vehicle determines that the projection area corresponding to the preset position condition meets the preset pose condition, and takes the determined at least one projection area as the selected target projection area, so as to generate the selection operation instruction corresponding to the target projection area. If the detection result indicates that the position distance in the position posture data is larger than the preset distance value, it is determined that the preset pose condition is not met, and no corresponding selection operation instruction is generated.
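As a minimal sketch of this position-condition check (the threshold value and function names are assumptions for illustration, not from the patent):

```python
# Illustrative only: select the projection areas whose position distance
# (from the target object to the area's center point) is within the
# preset distance value.

def select_target_areas(position_distances, preset_distance=0.5):
    """position_distances maps an area id to its measured distance in
    meters; return the areas meeting the preset pose condition."""
    return [area for area, dist in position_distances.items()
            if dist <= preset_distance]
```

For example, a user object standing 0.3 m from area 52b but 2.1 m from area 52c would select only 52b; when no area is within the preset distance, no selection operation instruction is generated.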
Optionally, in a case that the preset pose condition is a preset posture condition, whether the condition is met may be determined by comparing the motion posture of a limb part of the target object in a certain projection area with a preset limb posture: if the motion posture matches the posture trajectory of the preset limb posture, the preset posture condition is determined to be satisfied; otherwise, it is determined not to be satisfied. The preset limb posture may include moving the limb part upwards, moving the limb part left and right, drawing a preset pattern trajectory with the limb part, and the like.
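One way to picture the posture-condition check is to reduce the observed limb motion to a coarse direction sequence and compare it against preset trajectories. This is a hypothetical sketch; the posture names and direction encoding are assumptions, not details from the patent.

```python
# Hypothetical sketch of the preset posture condition: observed limb
# motion, encoded as a direction sequence, is matched against preset
# limb-posture trajectories.

PRESET_POSTURES = {
    "move_up": ["up"],
    "wave_left_right": ["left", "right"],
}

def matched_posture(observed_directions):
    """Return the name of the preset limb posture whose trajectory matches
    the observed direction sequence, or None if no posture matches."""
    for name, trajectory in PRESET_POSTURES.items():
        if observed_directions == trajectory:
            return name
    return None
```

A production implementation would tolerate noise in the trajectory (e.g., via dynamic time warping) rather than require exact equality.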
Alternatively, in order to avoid misoperation and improve the accuracy of control recognition, at least one projection area meeting both the preset position condition and the preset posture condition may be determined as the selected target projection area according to the detection result, and the selection operation instruction corresponding to the target projection area is then generated.
In this embodiment, the position posture data between the target object and each projected projection area is detected, at least one projection area meeting the preset pose condition is determined as the selected target projection area based on the detection result, and the selection operation instruction corresponding to the target projection area is generated. An intuitive and accurate interaction area and a visualized interaction mode are thereby realized, improving the accuracy and convenience of vehicle control.
And S207, controlling the vehicle component corresponding to the target projection area to execute target operation based on the second vehicle control command.
The target operation may include, but is not limited to, an activation operation, a turn-on operation, and the like.
Optionally, the vehicle controls the vehicle component corresponding to the target projection area to switch from a closed state to an open state based on the second vehicle control instruction, wherein the vehicle component comprises at least one of a door, a window, and a trunk lid. Specifically, the doors may include four doors at the left front, left rear, right front, and right rear positions, and the windows may likewise include windows at the same four positions.
In an optional implementation, as shown in fig. 4, the controlling, based on the second vehicle control instruction, the vehicle component corresponding to the target projection area to perform the target operation includes:
s401, based on the second vehicle control instruction, controlling the projection component corresponding to the target projection area to be switched from a first projection state to a second projection state, wherein the second projection state is used for indicating that control response information is displayed in the target projection area.
The first projection state refers to the state indicated when the projection area has not been selected. A projection area in the first projection state may display at least one of selection guidance content (for example, "please select vehicle component 1, vehicle component 2, etc. to turn on") and a first display attribute (for example, a color, shape, or pattern), or may display no content at all. The second projection state is the state indicated when the projection area has been selected, and is used for prompting the user object that the corresponding vehicle component is about to be turned on. Optionally, the second projection state is specifically used to indicate that control response information is presented in the target projection area. The control response information may be embodied by at least one of displayed text content and display attributes.
Optionally, based on the second vehicle control instruction, the projection component corresponding to the target projection area is controlled to switch from the first projection state to the second projection state. The second projection state is generally different from the first projection state, for example in at least one of the displayed text content (e.g., "vehicle component m selected" or "vehicle component n is ready to be turned on") and a second display attribute (e.g., color, shape, or pattern) different from the first display attribute of the first projection state. In a specific implementation, a plurality of different projection display combinations can be presented through DLP (Digital Light Processing) technology, and the second projection state can be rapidly adjusted through OTA (over-the-air) upgrades, thereby realizing flexible and personalized adjustment of the projection component's interactive display effect.
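The state switch can be sketched as below, assuming a simple dictionary of display attributes; this is not the patent's API, and the state contents are illustrative.

```python
# Minimal sketch (assumed state representation) of switching a projection
# component from the first to the second projection state upon the second
# vehicle control instruction.

class ProjectionComponent:
    def __init__(self, area_id):
        self.area_id = area_id
        # First projection state: selection guidance plus a first display attribute.
        self.state = {"text": "please select a vehicle component", "color": "white"}

    def on_second_vehicle_control_instruction(self, component_name):
        # Second projection state: control response information is shown in
        # the target projection area with different text and attribute.
        self.state = {"text": f"{component_name} is ready to be turned on",
                      "color": "green"}
        return self.state
```

In a real system the state would also drive the DLP display output and could be reconfigured via OTA updates, as described above.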
And S403, controlling the vehicle component corresponding to the target projection area to execute target operation when the preset starting condition is met.
Optionally, the vehicle component corresponding to the target projection area is controlled to execute the target operation under the condition that the preset starting condition is met. Taking the turn-on operation as the target operation for example, the preset starting condition may include a condition under which the vehicle component may be automatically turned on. The vehicle component includes, but is not limited to, at least one of a door, a window, and a trunk lid; the doors may include four doors at the left front, left rear, right front, and right rear positions, and the windows may likewise include windows at the same four positions.
In an optional embodiment, the control response information includes at least one of dynamically displayed time response information and dynamically changing graphical response information. Before the step of determining that the preset starting condition is met and controlling the vehicle component corresponding to the target projection area to execute the target operation, the method further comprises:
S4021, if it is detected that the time value corresponding to the time response information meets a preset time threshold, determining that the preset starting condition is met; or,
S4023, if it is detected that the pattern currently displayed by the graphical response information is a preset pattern, determining that the preset starting condition is met.
The time response information may be a displayed starting countdown, for example a dynamically displayed countdown of 10 seconds, 9 seconds, 8 seconds, and so on. If it is detected that the time value corresponding to the time response information reaches the preset time threshold, the control response information is determined to meet the preset starting condition; otherwise, it is determined not to meet the preset starting condition. Different preset time thresholds may be configured for different vehicle components; these thresholds may be set according to the actual situation, and the present disclosure is not particularly limited thereto.
Alternatively, the graphical response information may display different patterns, for example dynamically displaying pattern 1, pattern 2, and pattern 3 in sequence. If it is detected that the pattern currently displayed by the graphical response information is the preset pattern, the control response information is determined to meet the preset starting condition; otherwise, it is determined not to meet the preset starting condition.
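The two alternative preset starting conditions (S4021 and S4023) can be sketched in one predicate. The threshold of 0 and the pattern name are illustrative assumptions only.

```python
# Hedged sketch of the preset starting condition: either the countdown has
# reached the preset time threshold (S4021), or the currently displayed
# pattern is the preset pattern (S4023).

def start_condition_met(countdown=None, current_pattern=None,
                        time_threshold=0, preset_pattern="pattern_3"):
    """Return True if either alternative starting condition is met."""
    if countdown is not None and countdown <= time_threshold:
        return True
    if current_pattern is not None and current_pattern == preset_pattern:
        return True
    return False
```

Per the text above, `time_threshold` could differ per vehicle component (e.g., a longer countdown for doors than for windows).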
For ease of understanding, the technical solution of the present application is described as a whole below with reference to fig. 5. As shown in fig. 5, by way of example only, a user object carrying the target device approaches the vehicle; while the position of the user object is detected to be outside the monitoring identification area 51, the vehicle is not woken up. As the user object continues to approach and its position is detected to fall within the monitoring identification area 51, the vehicle is woken up. The vehicle then activates at least one projection assembly (not shown) mounted on the vehicle to project projection areas, for example the five projection areas 52a, 52b, 52c, 52d, 52e in fig. 5. When the user object then moves to a target projection area (for example, projection area 52 b), the vehicle understands that the user object intends to open the corresponding vehicle door, performs a countdown or pattern change in the target projection area, and through this interactive form prompts that the door corresponding to projection area 52b (that is, the left rear door) is about to be opened; the door corresponding to projection area 52b is then activated, automatically opened, or another corresponding operation is performed.
In an alternative embodiment, the five projection areas 52a, 52b, 52c, 52d, 52e projected in fig. 5 are not limited to the positions shown in the figure; the five projection areas may also be arranged together in a matrix, a ring, a specific layout, or the like. For example, all five projection areas may be laid out on the same side of the vehicle (e.g., left, right, rear, or front), with projection areas 52a, 52b, 52d, 52e arranged in a matrix A of two rows and two columns and projection area 52c arranged on the right side of matrix A, forming an entire projection area P in the new layout. Taking the case where the entire projection area P is distributed on the left side of the vehicle, if the user object moves to a target projection area within P (for example, projection area 52 e), the vehicle understands that the user object intends to open the corresponding vehicle door, performs a countdown or pattern change in the target projection area, and through this interactive form prompts that the door corresponding to projection area 52e (namely, the right rear door) is about to be opened; the door corresponding to projection area 52e is then activated, automatically opened, or another corresponding operation is performed. Arranging all the projection areas together in this way reduces the distance the user object must move to reach the corresponding projection area, improving vehicle control efficiency and convenience.
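The fig. 5 interaction just described can be sketched as a tiny state machine. The zone radius and the area-to-door mapping below are hypothetical values chosen for illustration; the patent does not specify them.

```python
# Illustrative end-to-end sketch of the fig. 5 interaction flow:
# sleep -> wake/project when the user enters area 51 -> open the door
# matching the target projection area the user steps into.

AREA_TO_DOOR = {"52a": "left front", "52b": "left rear",
                "52d": "right front", "52e": "right rear"}

class VehicleInteraction:
    def __init__(self, monitor_radius=5.0):
        self.monitor_radius = monitor_radius  # radius of area 51 (assumed)
        self.state = "sleeping"

    def on_user_distance(self, distance):
        # Wake up and project the areas once the user object enters
        # the monitoring identification area 51.
        if self.state == "sleeping" and distance <= self.monitor_radius:
            self.state = "projecting"
        return self.state

    def on_user_in_area(self, area_id):
        # Countdown / pattern change, then open the corresponding door.
        if self.state == "projecting" and area_id in AREA_TO_DOOR:
            self.state = "opening:" + AREA_TO_DOOR[area_id]
        return self.state
```

A user 8 m away leaves the vehicle asleep; approaching to 3 m triggers projection, and stepping into area 52b leads to the left rear door opening.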
In the above embodiment, the projection component corresponding to the target projection area is controlled to switch from the first projection state to the second projection state based on the second vehicle control instruction, the second projection state being used for indicating that control response information is displayed in the target projection area; and the vehicle component corresponding to the target projection area is controlled to execute the target operation when the preset starting condition is determined to be met. Therefore, when the target object is determined to intend the target operation, the projection assembly gives visualized response feedback, and the switch from the first projection state to the second projection state realizes an intuitive feedback response and prompt, improving the sense of ceremony and the humanization of the interaction.
In the above embodiment, the first vehicle control instruction is generated upon detecting the wake-up signal for the vehicle, the first vehicle control instruction being used for activating at least one projection assembly installed on the vehicle; the target projection assembly is controlled to project at least one projection area based on the first vehicle control instruction, the target projection assembly being at least one of the projection assemblies; the second vehicle control instruction is generated in response to the selection operation instruction of the target object on the target projection area, the target projection area being at least one of the projection areas; and the vehicle component corresponding to the target projection area is controlled to execute the target operation based on the second vehicle control instruction. By means of the at least one projection area projected by the projection assembly arranged on the vehicle, and by generating, in response to the selection operation instruction for the target projection area, the second vehicle control instruction for controlling the corresponding vehicle component to execute the target operation, accurate recognition of the target operation is realized, misoperation is reduced, and the accuracy of vehicle control is improved; meanwhile, the interactivity of the target object's operation of the vehicle and the visualization of the interaction process are increased, optimizing vehicle intelligence.
The following are embodiments of the apparatus of the present application that may be used to perform the method embodiments of the present application described above. For details and advantages not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Referring to fig. 6, a block diagram of a vehicle control device according to an embodiment of the present application is shown. The vehicle control device has the function of implementing the above method examples, and the function may be realized by hardware or by hardware executing corresponding software. The vehicle control apparatus may include:
a first instruction generating module 610, configured to generate a first vehicle control instruction when a wake-up signal for the vehicle is detected; the first vehicle control command is used to activate at least one projection assembly mounted on the vehicle;
a first control module 620 for controlling a target projection assembly to project at least one projection area based on the first vehicle control instruction; the target projection assembly is at least one of the projection assemblies;
a second instruction generating module 630, configured to generate a second vehicle control instruction in response to the selection operation instruction of the target object on the target projection area; the target projection area is at least one of the projection areas;
and the second control module 640 is configured to control the vehicle component corresponding to the target projection area to perform the target operation based on the second vehicle control instruction.
In an alternative embodiment, as shown in fig. 7, the apparatus further comprises:
a pose detection module 650 for detecting position and pose data between the target object and each of the projected projection regions;
an operation instruction generating module 660, configured to determine, based on the detection result, at least one projection area that meets a preset pose condition as a selected target projection area, and generate a selection operation instruction corresponding to the target projection area.
In an alternative embodiment, the position posture data includes a position distance characterizing the distance between the target object and the center point of each projection area. The second instruction generation module is specifically configured to:
and when the detection result indicates that the position distance in the position posture data is smaller than or equal to a preset distance value, determining that the corresponding projection area meets a preset pose condition, and taking at least one determined projection area as a selected target projection area.
In an alternative embodiment, the second control module comprises:
the first control submodule is used for controlling the projection component corresponding to the target projection area to be switched from a first projection state to a second projection state based on the second vehicle control instruction, and the second projection state is used for indicating that control response information is displayed in the target projection area;
and the second control sub-module is used for controlling the vehicle part corresponding to the target projection area to execute the target operation when the preset starting condition is met.
In an optional embodiment, the control response information includes at least one of dynamically displayed time response information and dynamically changing graphical response information. The second control module further comprises:
the first detection submodule is used for determining that a preset starting condition is met if it is detected that the time value corresponding to the time response information meets a preset time threshold; or,
and the second detection submodule is used for determining that a preset starting condition is met if the current display pattern of the graphical response information is detected to be a preset pattern.
In an optional implementation manner, the second control module is specifically configured to:
controlling a vehicle part corresponding to the target projection area to be switched from a closed state to an open state;
wherein the vehicle component comprises at least one of a door, a window, and a trunk lid.
In an optional implementation manner, the first instruction generating module is specifically configured to:
detecting whether a wake-up signal exists in an environment within a preset distance from the vehicle in real time;
under the condition that a wake-up signal aiming at the vehicle is detected, carrying out identity verification on target equipment corresponding to the wake-up signal;
and after the identity verification passes, establishing a communication connection between the vehicle and the target device, and generating the first vehicle control instruction.
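The wake-up flow handled by the first instruction generating module can be sketched as follows. The device registry and the instruction dictionary are hypothetical stand-ins for the actual identity verification and command mechanism, which the patent does not detail.

```python
# Hypothetical sketch of the first-instruction flow: detect the wake-up
# signal, verify the identity of the target device, then generate the
# first vehicle control instruction that activates the projection assemblies.

AUTHORIZED_DEVICES = {"key_fob_01", "phone_7f3a"}  # assumed registry

def handle_wake_signal(device_id):
    """Return the first vehicle control instruction if the target device
    passes identity verification, otherwise None (no instruction)."""
    if device_id not in AUTHORIZED_DEVICES:
        return None  # verification failed: no instruction is generated
    # Communication connection between vehicle and device established here
    # (elided); then the first vehicle control instruction is generated.
    return {"instruction": "first_vehicle_control",
            "activate_projection_assemblies": True}
```

An unknown device therefore never wakes the projection assemblies, matching the verification step described above.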
For details and advantages not disclosed in the embodiments of the system of the present application, reference is made to the above-described embodiments of the present application.
The application also provides a vehicle control system, which comprises a vehicle and a target device in communication connection with the vehicle; the vehicle comprises a processor and a memory, wherein at least one instruction or at least one program is stored in the memory, and the at least one instruction or the at least one program is loaded by the processor and executed to realize the vehicle control method.
The present application also provides a computer device comprising a processor and a memory, wherein at least one instruction or at least one program is stored in the memory, and the at least one instruction or at least one program is loaded and executed by the processor to implement the vehicle control method according to any one of the above.
The present application also provides a computer readable storage medium having at least one instruction or at least one program stored therein, the at least one instruction or at least one program being loaded and executed by a processor to implement a vehicle control method as in any above.
In some embodiments, the computer device (not shown) may include a processor, memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a vehicle control method.
It should be noted that the sequence of the embodiments of the present application is for description only and does not represent the relative merits of the embodiments. Specific embodiments have been described above; other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the device and server embodiments, since they are substantially similar to the method embodiments, the description is simple, and the relevant points can be referred to the partial description of the method embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing is a preferred embodiment of the present application. It should be noted that, for those skilled in the art, various modifications and refinements can be made without departing from the principle of the present application, and these modifications and refinements are also regarded as falling within the protection scope of the present application.

Claims (10)

1. A vehicle control method, characterized by comprising:
generating a first vehicle control instruction if a wake-up signal for the vehicle is detected; the first vehicle control command is used for starting at least one projection assembly installed on the vehicle;
controlling a target projection assembly to project at least one projection area based on the first vehicle control instruction; the target projection assembly is at least one of the projection assemblies;
generating a second vehicle control instruction in response to the selection operation instruction of the target object to the target projection area; the target projection area is at least one of the projection areas;
and controlling the vehicle component corresponding to the target projection area to execute the target operation based on the second vehicle control instruction.
2. The method according to claim 1, wherein before the step of generating a second vehicle control instruction in response to a selection operation instruction of a target object on a target projection area, the method further comprises:
detecting position posture data between the target object and each projected projection area;
and determining at least one projection area meeting preset pose conditions as a selected target projection area based on the detection result, and generating a selection operation instruction corresponding to the target projection area.
3. The method of claim 2, wherein the position posture data comprises a position distance characterizing the distance between the target object and the center point of each of the projection areas;
the determining, based on the detection result, at least one of the projection areas meeting the preset pose condition as the selected target projection area comprises:
and when the detection result indicates that the position distance in the position posture data is smaller than or equal to a preset distance value, determining that the corresponding projection area meets a preset pose condition, and taking at least one determined projection area as a selected target projection area.
4. The method according to any one of claims 1-3, wherein controlling the vehicle component corresponding to the target projection area to perform the target operation based on the second vehicle control command comprises:
based on the second vehicle control instruction, controlling a projection component corresponding to the target projection area to be switched from a first projection state to a second projection state, wherein the second projection state is used for indicating that control response information is displayed in the target projection area;
and controlling the vehicle component corresponding to the target projection area to execute target operation when the preset starting condition is determined to be met.
5. The method of claim 4, wherein the control response message comprises at least one of a dynamically displayed time response message, a dynamically changing graphical response message;
before the step of determining that the preset starting condition is met and controlling the vehicle component corresponding to the target projection area to execute the target operation, the method further comprises:
if it is detected that the time value corresponding to the time response information meets a preset time threshold, determining that a preset starting condition is met; or,
and if the current display pattern of the graphic response information is detected to be a preset pattern, determining that a preset starting condition is met.
6. The method according to any one of claims 1-3, wherein controlling the vehicle component corresponding to the target projection area to perform the target operation comprises:
controlling a vehicle component corresponding to the target projection area to be switched from a closed state to an open state;
wherein the vehicle component comprises at least one of a door, a window, and a trunk lid.
7. The method according to any one of claims 1-3, wherein generating a first vehicle control instruction upon detection of a wake-up signal for a vehicle comprises:
detecting whether a wake-up signal exists in an environment within a preset distance from the vehicle in real time;
under the condition that a wake-up signal aiming at the vehicle is detected, carrying out identity verification on target equipment corresponding to the wake-up signal;
and after the identity verification passes, establishing a communication connection between the vehicle and the target device, and generating the first vehicle control instruction.
8. A vehicle control apparatus, characterized by comprising:
the first instruction generation module is used for generating a first vehicle control instruction under the condition that a wake-up signal aiming at the vehicle is detected; the first vehicle control command is used for starting at least one projection assembly installed on the vehicle;
the first control module is used for controlling the target projection assembly to project at least one projection area based on the first vehicle control instruction; the target projection assembly is at least one of the projection assemblies;
the second instruction generation module is used for responding to a selection operation instruction of the target object to the target projection area and generating a second vehicle control instruction; the target projection area is at least one of the projection areas;
and the second control module is used for controlling the vehicle component corresponding to the target projection area to execute the target operation based on the second vehicle control instruction.
9. A vehicle control system characterized by comprising a vehicle and a target apparatus communicatively connected to the vehicle; the vehicle comprises a processor and a memory, wherein at least one instruction or at least one program is stored in the memory, and the at least one instruction or the at least one program is loaded by the processor and executed to implement the vehicle control method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that at least one instruction or at least one program is stored in the storage medium, which is loaded and executed by a processor to implement the vehicle control method according to any one of claims 1 to 7.
CN202210827842.9A 2022-07-13 2022-07-13 Vehicle control method, device, system and medium Pending CN115214535A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210827842.9A CN115214535A (en) 2022-07-13 2022-07-13 Vehicle control method, device, system and medium


Publications (1)

Publication Number Publication Date
CN115214535A true CN115214535A (en) 2022-10-21

Family

ID=83612370

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210827842.9A Pending CN115214535A (en) 2022-07-13 2022-07-13 Vehicle control method, device, system and medium

Country Status (1)

Country Link
CN (1) CN115214535A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115653442A (en) * 2022-11-23 2023-01-31 中国第一汽车股份有限公司 Vehicle door control method, device, equipment and storage medium
CN116181188A (en) * 2022-12-22 2023-05-30 重庆长安汽车股份有限公司 Control method and system for opening vehicle door and vehicle



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination