CN113247007A - Vehicle control method and vehicle - Google Patents

Vehicle control method and vehicle

Info

Publication number
CN113247007A
Authority
CN
China
Prior art keywords
vehicle
user
projection
control
control operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110692567.XA
Other languages
Chinese (zh)
Inventor
钱文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhaoqing Xiaopeng New Energy Investment Co Ltd
Original Assignee
Zhaoqing Xiaopeng New Energy Investment Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhaoqing Xiaopeng New Energy Investment Co Ltd filed Critical Zhaoqing Xiaopeng New Energy Investment Co Ltd
Priority to CN202110692567.XA
Publication of CN113247007A
Legal status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B29/00Combinations of cameras, projectors or photographic printing apparatus with non-photographic non-optical apparatus, e.g. clocks or weapons; Cameras having the shape of other objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001Details of the control system

Abstract

The invention provides a vehicle control method and a vehicle. The method comprises: projecting, in a projectable area outside the vehicle, a control mode mapped to a vehicle function; and, in response to a control operation issued by a user based on the projection, executing the vehicle function corresponding to the control operation. The user can thus control the vehicle from outside it, which makes vehicle control more convenient and improves the user experience.

Description

Vehicle control method and vehicle
Technical Field
The invention relates to the technical field of vehicles, in particular to a vehicle control method and a vehicle.
Background
In the prior art, a vehicle is usually controlled by operating an in-vehicle display screen or function buttons arranged inside the vehicle, for example to open the trunk, open or close the charging port, or adjust the in-cabin temperature. A user who needs to control the vehicle therefore usually has to enter it first; controlling the vehicle from outside is difficult, which is inconvenient in some situations. For example, if a user who wants to charge the vehicle only discovers after leaving it that the charging port is not open, the user has to return to the interior to open the charging port, walking back and forth between the charging port and the cabin, which makes for a poor user experience.
Disclosure of Invention
In view of the above, the present invention has been made to provide a vehicle control method and a corresponding vehicle that overcome or at least partially solve the above problems.
In order to solve the above problems, the present invention discloses a vehicle control method, including:
projecting a control mode mapped with a vehicle function in a projectable area outside the vehicle;
and responding to the control operation sent by the user based on the projection, and executing the vehicle function corresponding to the control operation.
Optionally, the method further comprises:
and if the state of the vehicle component corresponding to the vehicle function changes, adjusting the projection content corresponding to the vehicle function to prompt the user that the projected vehicle function changes.
Optionally, the step of executing a vehicle function corresponding to the control operation in response to the control operation issued by the user based on the projection includes:
in response to multiple consecutive control operations issued by a user for a vehicle function based on the projection, executing, for each control operation, the vehicle function corresponding to that operation to adjust a vehicle component, while simultaneously adjusting the projection content corresponding to the vehicle function.
Optionally, the step of projecting the control mode mapped with the vehicle function in the projectable area outside the vehicle includes:
the control modes mapped to the vehicle functions are projected in an area where the vehicle component is located, and/or the control modes mapped to the vehicle functions are projected in a projectable area in a user's line of sight direction outside the vehicle.
Optionally, the step of projecting the control mode mapped with the vehicle function in the area where the vehicle component is located includes:
and projecting the control modes mapped with the vehicle functions on the vehicle parts corresponding to the control icons.
Optionally, the step of projecting the control mode mapped with the vehicle function in the same plane according to the distribution of the body positions of the vehicle components around the vehicle includes:
and projecting the control modes mapped by the vehicle functions on the same plane according to the vehicle body position distribution of the vehicle components and the control mode mapped by the vehicle functions corresponding to each vehicle component around the vehicle.
Optionally, the step of projecting the control mode mapped with the vehicle function in the same plane according to the distribution of the body positions of the vehicle components around the vehicle includes:
and projecting the control modes mapped by the vehicle functions on the same plane according to the vehicle body position distribution of the vehicle components and the control mode mapped by the similar adjacent vehicle components corresponding to the vehicle functions around the vehicle.
Optionally, the step of executing a vehicle function corresponding to the control operation in response to the control operation issued by the user based on the projection includes:
and under the condition that the user gestures of one or more users are recognized to respectively point to any projection identifier, determining the control operation corresponding to the pointed projection identifier, and executing the vehicle function corresponding to the control operation.
Optionally, in a case that the user gestures of the one or more users are recognized to be respectively directed to any one of the projection identifiers, the step of determining a control operation corresponding to the directed projection identifier, and executing a vehicle function corresponding to the control operation includes:
and if the user postures of the users are recognized to respectively point to a plurality of projection identifications corresponding to the multi-user control operation, determining the control operation sent by the users as the multi-user control operation, and executing the vehicle function corresponding to the multi-user control operation.
Optionally, the step of executing a vehicle function corresponding to the control operation in response to the control operation issued by the user based on the projection includes:
responding to control operation sent by a user based on projection, and receiving an adjustment instruction sent by the user;
and executing the vehicle function corresponding to the control operation according to the adjustment instruction.
Optionally, the method further comprises:
in response to a projection setting operation issued by a user, a vehicle function allowing projection is set and/or a projection area is set.
Optionally, the method comprises:
and in response to the control operation sent by the user based on the projection, projecting feedback display content in a projectable area outside the vehicle to prompt the user that the vehicle has detected the control operation of the user.
The invention also discloses a vehicle comprising:
the projection module is used for projecting a control mode mapped with the vehicle function in a projectable area outside the vehicle;
and the execution module is used for responding to the control operation sent by the user based on the projection and executing the vehicle function corresponding to the control operation.
Optionally, the vehicle further comprises:
and the projection adjusting module is used for adjusting projection contents corresponding to the vehicle functions to prompt a user that the projected vehicle functions change if the vehicle component states corresponding to the vehicle functions change.
Optionally, the execution module includes:
the continuous execution sub-module is used for responding to multiple consecutive control operations issued by a user for a vehicle function based on the projection, and for each control operation, executing the vehicle function corresponding to that operation to adjust a vehicle component while simultaneously adjusting the projection content corresponding to the vehicle function.
Optionally, the projection module includes:
the vehicle component projection sub-module is used for projecting the control modes mapped with the vehicle functions in the area where the vehicle component is located, and/or the sight line projection sub-module is used for projecting the control modes mapped with the vehicle functions in the projectable area in the user sight line direction outside the vehicle.
Optionally, the vehicle component projection sub-module includes:
the plane projection unit is used for projecting the control modes mapped with the vehicle functions on the same plane according to the vehicle body positions of the vehicle parts around the vehicle, and/or the part projection unit is used for projecting the control modes mapped with the vehicle functions on the vehicle parts corresponding to the control icons.
Optionally, the planar projection unit comprises:
and the corresponding projection subunit is used for projecting the control modes mapped by the vehicle functions on the same plane according to the vehicle body position distribution of the vehicle components and the control mode mapped by the vehicle functions corresponding to each vehicle component around the vehicle.
Optionally, the plane projection unit includes:
and the classified projection subunit is used for projecting the control modes mapped by the vehicle functions on the same plane according to the vehicle body position distribution of the vehicle components and the control mode mapped by the similar adjacent vehicle components corresponding to the vehicle functions around the vehicle.
Optionally, the execution module includes:
and the gesture recognition submodule is used for determining the control operation corresponding to the pointed projection identifier and executing the vehicle function corresponding to the control operation under the condition that the user gestures of one or more users are recognized to point to any projection identifier respectively.
Optionally, the gesture recognition sub-module comprises:
and the multi-user identification unit is used for determining the control operation sent by the user as the multi-user control operation and executing the vehicle function corresponding to the multi-user control operation if the user postures of the users are identified as the projection identifications corresponding to the multi-user control operation respectively.
Optionally, the execution module includes:
the instruction receiving submodule is used for responding to control operation sent by a user based on projection and receiving an adjustment instruction sent by the user;
and the instruction adjusting submodule is used for executing the vehicle function corresponding to the control operation according to the adjusting instruction.
Optionally, the vehicle further comprises:
and the area setting module is used for setting the vehicle function allowing projection and/or setting the projection area in response to the projection setting operation sent by the user.
Optionally, the vehicle comprises:
and the feedback module is used for responding to the control operation sent by the user based on the projection, and projecting feedback display content in a projectable area outside the vehicle so as to prompt the user that the vehicle detects the control operation of the user.
The invention also discloses a vehicle comprising:
one or more processors; and
one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the vehicle to perform one or more methods according to the present disclosure.
Also disclosed are one or more machine-readable media having instructions stored thereon which, when executed by one or more processors, cause the processors to perform one or more methods as described herein.
The invention has the following advantages:
With the vehicle control method, a control mode mapped to a vehicle function is projected in a projectable area outside the vehicle, and in response to a control operation issued by the user based on the projection, the vehicle function corresponding to that operation is executed. The user can thus control the vehicle from outside it, which makes vehicle control more convenient and improves the user experience.
Drawings
FIG. 1 is a flow chart of the steps of an embodiment of a vehicle control method of the present invention;
FIG. 2 is a flow chart of steps in another vehicle control method embodiment of the present invention;
FIG. 3 is a schematic projection view of the present invention;
fig. 4 is a block diagram of a vehicle embodiment of the invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
One of the core ideas of the invention addresses the problem that a user cannot conveniently control the vehicle from outside it: by projecting control modes mapped to vehicle functions in a projectable area outside the vehicle, the user can control the vehicle through control operations issued based on the projection, causing the vehicle to execute different functions. The user can thus control the vehicle from outside without entering it, which further improves the convenience of vehicle operation.
Referring to fig. 1, a flowchart illustrating steps of an embodiment of a vehicle control method of the present invention is shown, which may specifically include the following steps:
step 101, projecting a control mode mapped with vehicle functions in a projectable area outside a vehicle;
in the present invention, in order to facilitate the user to control the vehicle outside the vehicle, the control method mapped with the vehicle function may be projected in a projectable area outside the vehicle.
Specifically, the user may control the vehicle from outside it in a variety of ways, for example with a single foot tap, repeated foot taps, placing a hand on the projection, pointing with a hand, or making hand gestures. The projection can therefore be adapted to the control modes available to the user, so that the user can control the vehicle in the desired way. Meanwhile, each control mode can be mapped to at least one vehicle function, so that the user can control the vehicle to execute that function through the projection.
The projectable area may be any surface on or around the vehicle where a control icon can be clearly displayed, for example the vehicle body, the ground around the vehicle, nearby side walls or ceilings, or devices around the vehicle such as a charging pile or a fuel dispenser.
The content of the projection may be determined according to actual needs, for example, one or a combination of various characters, icons, and animations may be projected, which is not limited by the present invention.
The size and shape of the projected content can likewise be determined according to actual needs. For example, the projected content may be circular, rectangular, or an irregular polygon, and its size may range from a diameter of 1 cm or 10 cm to a diagonal of 50 cm or 1 m; the invention is not limited in this respect.
In a specific implementation, the vehicle may be provided with at least one projection device. A projection device may be arranged on the exterior of the vehicle, for example at the top, side, or bottom, or inside the vehicle, projecting through the windows onto walls and other surfaces so that the user can view and operate the control icons from outside. Projection devices in communication with the vehicle may also be arranged around it, in which case the vehicle communicates with such a device and controls it to project the control modes mapped to the vehicle functions in a projectable area outside the vehicle.
A suitable projectable area can be selected according to actual needs before projecting the control icons. For example, when the projection can be clearly shown on the ground, the control modes mapped to the vehicle functions can be projected there, allowing the user to operate them without using the hands. When standing water or similar conditions prevent the control icons from being clearly displayed on the ground, the control modes can instead be projected on nearby side walls or on the vehicle body, so that the user can clearly view and operate the projection. The control modes can also be projected onto devices around the vehicle, such as a charging pile, so that the user can conveniently operate both the device and, through the projection, the vehicle.
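The fallback logic described above can be sketched as a simple priority scan. This is a minimal illustration, not part of the patent: the surface names and the `conditions` visibility map are assumptions standing in for whatever sensing the vehicle uses to judge whether an icon would show clearly.

```python
# Candidate surfaces in assumed order of preference; all names are illustrative.
CANDIDATE_SURFACES = ["ground", "vehicle_body", "side_wall", "charging_pile"]

def select_projection_surface(conditions):
    """Return the first surface on which the control icons would be clearly
    visible. `conditions` maps surface name -> bool (True = clearly visible,
    e.g. the ground entry would be False when standing water is detected)."""
    for surface in CANDIDATE_SURFACES:
        if conditions.get(surface, False):
            return surface
    return None  # no usable surface found: skip projecting
```

For instance, with standing water on the ground the selection falls back to the vehicle body: `select_projection_surface({"ground": False, "vehicle_body": True})` returns `"vehicle_body"`.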
In a specific implementation, methods such as face recognition or recognition of an electronic device or Bluetooth key carried by a person may be used to detect whether the people around the vehicle include a user of the vehicle, who may be its driver or a regular passenger. When a user of the vehicle is detected nearby, the control modes mapped to the vehicle functions may be projected in a projectable area outside the vehicle to offer the user vehicle control functions operable from outside.
step 102, responding to a control operation issued by the user based on the projection, and executing the vehicle function corresponding to the control operation;
In the invention, after the control mode mapped to the vehicle function has been projected in the projectable area outside the vehicle, the vehicle function corresponding to a control operation can be executed in response to that operation being issued by the user based on the projection, thereby controlling the vehicle through the projection.
Specifically, the user's posture may be detected to determine whether it indicates an intention to point at the projection; if so, the user may be considered to be operating the control icon. For example, if the user's current posture is detected as pointing at the projection, or if the user's foot overlaps the projection, the user may be considered to have issued a control operation based on the projection.
After the control operation sent by the user based on the projection is detected, the control operation sent by the user based on the projection can be determined, and a vehicle control function corresponding to the control operation is executed, so that the vehicle can be controlled outside the vehicle.
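The overlap test described above can be sketched as hit-testing the tracked position of the user's foot or hand against the projected icon regions. This is an illustrative geometry only: treating each icon as a circle, and the coordinate frame and radii, are assumptions not taken from the patent.

```python
import math

def hit_icon(point, icons):
    """Return the name of the projected icon whose circular region contains
    `point`, or None if the point overlaps no icon. `icons` maps icon name ->
    (center_x, center_y, radius), all in metres on the projection surface."""
    x, y = point
    for name, (cx, cy, r) in icons.items():
        if math.hypot(x - cx, y - cy) <= r:
            return name
    return None

# Illustrative layout: two icons projected on the ground behind the vehicle.
icons = {"open_trunk": (0.0, 0.0, 0.25), "close_charge_port": (1.0, 0.0, 0.25)}
```

A foot tracked at `(0.1, 0.1)` would register as operating the `open_trunk` icon, while a position between the icons registers nothing.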
In a specific implementation, each projected control mode may be mapped to at least one vehicle function, so that when a user issues a control operation based on a projection, the operation can be taken as directed at that projection and as a request to execute the vehicle function mapped to its control mode.
The vehicle function may control a component outside or inside the vehicle. For example, functions controlling exterior components may include at least one of a charging port control function, a trunk control function, a front trunk control function, a mirror control function, an air suspension control function, and a door control function. Functions controlling interior components may include at least one of a temperature control function, an audio playback control function, a light control function, a seat control function, and a display screen control function.
Therefore, the vehicle function corresponding to the control operation can be executed in response to the control operation sent by the user based on the projection, and the control of the vehicle internal part or the external part can be realized.
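The mapping from a recognized control operation to an interior or exterior vehicle function can be expressed as a dispatch table. In this hedged sketch the operation names, function names, and stub bodies are all hypothetical; the real actuator interfaces are not specified by the patent.

```python
# Stub handlers standing in for real actuator calls (trunk latch, charge-port
# flap, HVAC). Return values are just markers for illustration.
def open_trunk():
    return "trunk opened"

def open_charge_port():
    return "charging port opened"

def set_cabin_temp():
    return "cabin temperature adjusted"

# Assumed mapping: detected control operation -> vehicle function.
FUNCTION_MAP = {
    "tap_trunk_icon": open_trunk,
    "tap_charge_icon": open_charge_port,
    "tap_temp_icon": set_cabin_temp,
}

def execute(control_operation):
    """Run the vehicle function mapped to the detected control operation."""
    handler = FUNCTION_MAP.get(control_operation)
    return handler() if handler else "unrecognized operation"
```

Keeping the mapping in data rather than in branching code makes it easy to add the other listed functions (mirrors, doors, seats, lights) as further table entries.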
As an example of the invention, a control icon may be projected on the ground outside the vehicle to provide a control mode operated by stepping on it with a foot. When the user's foot is detected touching the control icon for opening and closing the trunk, the vehicle can respond to this control operation by executing the corresponding vehicle function and opening the previously closed trunk, so that the user can easily control the trunk even with both hands occupied by carried items.
As another example of the invention, control icons may be projected onto the ground outside the vehicle to provide a control mode operated by stepping on them. When the user's foot is detected touching the control icon for adjusting the air suspension, the vehicle can respond to this control operation by executing the corresponding vehicle function and adjusting the air suspension, so that the user can observe the suspension from outside the vehicle while adjusting it through the control icon, improving the convenience of operation.
As another example of the invention, a control icon may be projected on the vehicle body to provide a control mode operated by hand contact. When the user's hand is detected touching the control icon for adjusting the in-cabin temperature, the vehicle can respond to this control operation by executing the corresponding vehicle function and adjusting the temperature, so that a user whose vehicle is parked outdoors can adjust the cabin temperature without entering it, avoiding a cabin that is overheated from sunlight or too cold in winter and reducing discomfort.
As another example of the invention, control icons may be projected on a wall outside the vehicle to provide a control mode operated by pointing with a hand. When the user is detected pointing a hand at the control icon for adjusting a vehicle seat, the vehicle can respond to this control operation by executing the corresponding vehicle function and adjusting the seat, so that the user can adjust it before entering the vehicle and avoid the discomfort of a seat position that does not match the user's build.
With the vehicle control method above, a control mode mapped to a vehicle function is projected in a projectable area outside the vehicle, and in response to a control operation issued by the user based on the projection, the vehicle function corresponding to that operation is executed. The user can thus control the vehicle from outside it, which makes vehicle control more convenient and improves the user experience.
Referring to fig. 2, a flowchart illustrating the steps of another embodiment of a vehicle control method of the present invention is shown, which may specifically include the following steps:
step 201, projecting a control mode mapped with a vehicle function in an area where a vehicle component is located, and/or projecting the control mode mapped with the vehicle function in a projectable area in a user sight line direction outside the vehicle;
in the present invention, in order to facilitate the user to control the vehicle outside the vehicle, the control method mapped with the vehicle function may be projected in a projectable area outside the vehicle.
Specifically, the user may control the vehicle from outside it in a variety of ways, for example with a single foot tap, repeated foot taps, placing a hand on the projection, pointing with a hand, or making hand gestures. The projection can therefore be adapted to the control modes available to the user, so that the user can control the vehicle in the desired way. Meanwhile, each control mode can be mapped to at least one vehicle function, so that the user can control the vehicle to execute that function through the projection.
The projectable area may be any surface on or around the vehicle where a control icon can be clearly displayed, for example the vehicle body, the ground around the vehicle, nearby side walls or ceilings, or devices around the vehicle such as a charging pile or a fuel dispenser.
In the present invention, a vehicle function mapped to a projected control mode can generally be used to control a particular vehicle component. For example, a projected control mode may correspond to a trunk opening and closing function for controlling the trunk of the vehicle; to a rear-view mirror adjustment function for controlling a rear-view mirror; to a temperature adjustment function for controlling the air conditioning; or to a navigation adjustment function for controlling how the display screen shows navigation.
Thus, to make it easy for the user to adjust a vehicle component based on the projected control mode, the control mode mapped to the vehicle function may be projected in the area where that component is located. When the user needs to adjust the component, the corresponding projected control can then be found near it, making it convenient to issue a control operation based on the projection, execute the corresponding vehicle function, and control the component.
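Component-anchored placement like this can be sketched as a lookup of the component's body-frame position plus a small offset for the icon. The coordinates, component names, and offset convention below are illustrative assumptions; the patent does not specify a layout.

```python
# Assumed body-frame (x, y) positions of components, in metres;
# the layout is purely illustrative.
COMPONENT_POS = {
    "trunk": (-2.0, 0.0),
    "charge_port": (-1.5, 0.9),
    "left_door": (0.0, -0.9),
}

def icon_anchor(component, offset=(0.0, 0.5)):
    """Projection anchor for a component's control icon: the component's
    position plus a small offset so the icon lands next to the part rather
    than on top of it."""
    x, y = COMPONENT_POS[component]
    dx, dy = offset
    return (x + dx, y + dy)
```

The projection device would then aim each control icon at `icon_anchor(component)`, so the trunk control, for example, appears just behind the trunk.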
As an example of the invention, the air suspension of a vehicle is generally located at its rear, so a control icon corresponding to the air suspension may be projected behind the vehicle to provide a control mode operated by stepping on it, making it convenient for the user to adjust the air suspension while observing it.
In the present invention, since the size of the vehicle is significantly larger than that of a person, in the case where the user is located near the vehicle, it may be difficult for the user to comprehensively observe all areas around the vehicle. For example, with the user at the rear of the vehicle, the user may have difficulty viewing the front as well as the sides of the vehicle. With the user on the right side of the vehicle, the user may have difficulty viewing the left side, front, and rear of the vehicle. In this case, if the control method mapped with the vehicle function is projected in the area where the vehicle component is located, there is a possibility that the user cannot observe the projection of some control methods, so that the user cannot conveniently find the position of the projection of the control method corresponding to the vehicle control function that the user needs to use.
Thus, to make the projected control modes easy to see, they may be projected in a projectable area in the user's line of sight outside the vehicle, that is, an area onto which a projection can be made along the direction the user is looking. The user can then observe the projected control modes without moving and control the vehicle based on them. When the user's line of sight changes, the projectable area in the new direction can be determined accordingly and the control modes mapped to the vehicle functions projected again there.
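The line-of-sight placement above can be sketched as picking the candidate area whose bearing is angularly closest to the user's current gaze, re-running the selection whenever the gaze changes. The area names and bearings are illustrative assumptions; how gaze direction is estimated is left open by the patent.

```python
# Assumed bearing (degrees) of each candidate projectable area as seen from
# the user's position; the layout is purely illustrative.
AREAS = {"rear_ground": 0.0, "right_body": 90.0, "front_wall": 180.0}

def area_for_gaze(gaze_deg):
    """Return the candidate area whose bearing lies angularly closest to the
    user's gaze direction (degrees), wrapping correctly around 360."""
    def angular_dist(a, b):
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    return min(AREAS, key=lambda name: angular_dist(AREAS[name], gaze_deg))
```

Calling `area_for_gaze` again after each gaze update implements the re-projection behaviour described above: when the gaze swings from the rear ground toward the front wall, the selected area changes with it.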
As an example of the present invention, if the user stands on the right side of the vehicle and the sight line direction points from the right side of the vehicle to the left side, the projectable area in the user's sight line direction may include the right side of the vehicle body, the ground on the right side, the side wall to the left of the vehicle that is not blocked by the vehicle, and the ceiling near the left side of the vehicle.
As an example of the present invention, if the user stands at the rear of the vehicle and the sight line direction points from the rear of the vehicle to the front, the projectable area in the user's sight line direction may include the rear of the vehicle body, the ground at the rear, the side wall in front of the vehicle that is not blocked by the vehicle, and the ceiling near the front of the vehicle.
In a specific implementation, the control mode mapped with the vehicle function may be projected in the area where the vehicle component is located and/or in a projectable area in the user's sight line direction outside the vehicle, according to actual needs, which is not limited by the present invention.
As an optional embodiment of the present invention, in a case where the control icon has a strong correlation with a vehicle component, the control mode mapped with the vehicle function may be projected in the area where the vehicle component is located. For example, if the vehicle function mapped by the control mode is to control the opening and closing of the left door, the left door usually only needs to be controlled when the user is located near it, so the control mode may be projected near the left door. In a case where the control icon has a strong correlation with the user's needs, the control mode mapped with the vehicle function may be projected in a projectable area in the user's sight line direction outside the vehicle. For example, if the vehicle function mapped by the control mode is to adjust the vehicle temperature, the user may need to adjust the temperature regardless of where the user is located, so the control mode mapped with the temperature adjustment function may be projected in the user's sight line direction.
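The decision between the two projection locations can be sketched as a simple lookup. The function names and the classification table below are illustrative assumptions; the patent only requires that component-correlated functions project at the component and need-correlated functions project along the sight line.

```python
# Hypothetical classification of vehicle functions.
COMPONENT_BOUND = {"left_door_open_close", "trunk_open_close"}
NEED_BOUND = {"cabin_temperature"}

def projection_target(function, component_area, gaze_area):
    """Pick where to project the control mode mapped with `function`."""
    if function in COMPONENT_BOUND:
        return component_area            # strong correlation with the component
    if function in NEED_BOUND:
        return gaze_area                 # strong correlation with the user's needs
    # Unclassified functions may be projected in both areas as needed.
    return (component_area, gaze_area)
```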
In the present invention, the step of projecting the control mode mapped with the vehicle function in the area where the vehicle component is located includes:
S11, projecting the control modes mapped with the vehicle functions on the same plane according to the distribution of the body positions of the vehicle components around the vehicle, and/or projecting the control modes mapped with the vehicle functions on the vehicle components corresponding to the control icons.
In the embodiment of the present invention, when the control mode mapped with the vehicle function is projected in the area where the vehicle component is located, the control mode may, according to actual needs, be projected around the vehicle on the same plane according to the distribution of the body positions of the vehicle components, and/or projected on the vehicle component corresponding to the control icon.
In particular, the vehicle component can generally have a fixed setting position, whereby the vehicle component can have a fixed position distribution. The control modes mapped with the vehicle functions can be projected on the same plane according to the vehicle body positions of the vehicle parts around the vehicle, so that a user can quickly find the projection of the control mode corresponding to the vehicle part needing to be controlled in the projection by referring to the positions of the vehicle parts. Meanwhile, the control mode mapped with the vehicle function is projected on the same plane, so that a user can conveniently and quickly search the control icon needing to be operated on the same plane.
For example, fig. 3 is a schematic projection diagram of the present invention. The control modes mapped with vehicle functions may be projected on the ground around the vehicle according to the position distribution of the vehicle components, each located in the vicinity of its corresponding vehicle component. Specifically, on the ground of the area where the trunk is located, a projection of the control mode mapped with the trunk opening and closing control function can be set. On the ground of the area where the spare box is located, a projection of the control mode mapped with the spare box opening and closing control function can be set. On the ground of the area where the air suspension is located, a projection of the control mode mapped with the air suspension lifting control function can be set. On the ground of the area where the rearview mirror is located, a projection of the control mode mapped with the rearview mirror adjustment control function can be set. On the ground of the area where the slow charging port is located, a projection of the control mode mapped with the slow charging port opening and closing adjustment control function can be set. On the ground of the area where the quick charging port is located, a projection of the control mode mapped with the quick charging port opening and closing adjustment control function can be set. Therefore, when the user needs to operate a vehicle component, the user can control it by issuing a projection-based control operation at the area where the component is located, without moving back and forth between the interior of the vehicle and the position of the component.
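A minimal sketch of the ground layout described above, assuming each vehicle component has a known body position (coordinates here are hypothetical, in metres from the vehicle centre line): the projection of each mapped control mode is pushed a small offset outward from its component.

```python
# Hypothetical body positions (x: lateral, y: longitudinal).
COMPONENT_POSITIONS = {
    "trunk": (0.0, -2.3),
    "rearview_mirror": (0.9, 1.1),
    "slow_charge_port": (-0.9, -1.5),
    "fast_charge_port": (0.9, -1.5),
}

def ground_projection_layout(offset=0.5):
    """Place each control projection a small offset outward from its
    component so the user can stand next to the part while operating it."""
    layout = {}
    for name, (x, y) in COMPONENT_POSITIONS.items():
        # Push the projection away from the vehicle centre line.
        layout[name] = (x + (offset if x >= 0 else -offset), y)
    return layout
```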
Specifically, the user may need to touch the vehicle component while operating it. For this reason, the control mode mapped with the vehicle function can be projected on the vehicle component corresponding to the control icon, so that the user can touch the vehicle component to operate it while controlling it by touching the projection.
For example, the control mode mapped with the trunk opening and closing function may be projected in the area that a user would normally touch when opening and closing the trunk lid. Therefore, when the trunk needs to be opened, the user can directly touch the trunk lid, thereby touching the projection and issuing a projection-based control operation to unlock and open the trunk. The user can then directly lift the trunk lid without moving the touch position to complete the opening of the trunk. In this way, the user can open the trunk simply by touching the projection, without operating the trunk from inside the vehicle or opening it with a key or the like.
In the present invention, the step of projecting the control mode mapped with the vehicle function in the same plane according to the distribution of the body positions of the vehicle components around the vehicle includes:
S21, projecting the control modes mapped with the vehicle functions on the same plane according to the distribution of the body positions of the vehicle components around the vehicle, with each vehicle component corresponding to one control mode mapped with a vehicle function.
In the present invention, in order to facilitate the user's targeted control of each vehicle component, the control modes mapped with the vehicle functions can be projected on the same plane around the vehicle according to the distribution of the body positions of the vehicle components, with each vehicle component corresponding to one control mode mapped with a vehicle function, so that the user can conveniently control each vehicle component through the projection.
As an example of the present invention, the window may include a left front window, a left rear window, a right front window, and a right rear window. If the control mode mapped by the window lifting function needs to be projected, the control modes mapped by the window control function corresponding to each window can be projected in the same plane according to the vehicle body position distribution of the vehicle parts around the vehicle. Therefore, each window can be respectively provided with the projection of the corresponding control mode, and a user can conveniently operate each window.
The step of projecting the control mode mapped with the vehicle function in the same plane according to the distribution of the body positions of the vehicle components around the vehicle comprises the following steps:
S211, projecting the control modes mapped with the vehicle functions on the same plane according to the distribution of the body positions of the vehicle components around the vehicle, with similar adjacent vehicle components sharing one control mode mapped with a vehicle function.
In the present invention, in order to simplify the control modes contained in the projection so that the user can quickly find the required control mode among the various projected control modes, the control modes mapped with the vehicle functions may also be projected on the same plane according to the distribution of the body positions of the vehicle components, with similar adjacent vehicle components sharing one control mode mapped with a vehicle function. Therefore, adjacent similar vehicle components can share the projection of one control mode, which simplifies the control modes contained in the projection while allowing the user to conveniently operate a plurality of vehicle components simultaneously.
As an example of the present invention, the windows may include a left front window, a left rear window, a right front window, and a right rear window. If the control mode mapped with the window lifting function needs to be projected, the control modes mapped with the window control function corresponding to adjacent windows can be projected in the same plane according to the distribution of the body positions of the vehicle components around the vehicle. In this way, the left front window and the left rear window can jointly correspond to one control mode mapped with the window lifting function, and the right front window and the right rear window can jointly correspond to another, so that the user can operate the left front and left rear windows simultaneously through one control mode, and the right front and right rear windows simultaneously through the other.
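The sharing of one projected control mode among adjacent similar components can be sketched as follows; grouping by vehicle side and the window names are illustrative assumptions.

```python
WINDOWS = ["left_front", "left_rear", "right_front", "right_rear"]

def group_adjacent(components):
    """Group similar components by side so that each side shares one
    projected control mode."""
    groups = {}
    for c in components:
        side = c.split("_")[0]          # "left" or "right"
        groups.setdefault(side, []).append(c)
    return groups

def operate(group, action):
    """One control operation on the shared projection acts on every
    window in the group simultaneously."""
    return {c: action for c in group}
```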
202, in response to a control operation issued by the user based on the projection, executing the vehicle function corresponding to the control operation.
In the invention, after the control mode mapped with the vehicle function is projected by the projectable area outside the vehicle, the vehicle function corresponding to the control operation can be executed in response to the control operation sent by the user based on the projection, so as to control the vehicle through the projection.
Specifically, it may be detected whether there is an overlap of the limb of the user with the projection, and in the case of the overlap, it may be considered that the user has operated on the projection. For example, if the user's foot overlaps the projection, the user may be considered to issue a control operation based on the projection.
The user's gesture may also be detected to determine whether it indicates an intention to point to the projection; in a case where it does, it may be considered that an operation has been performed on the projection. For example, if the user's current posture is detected as pointing at the projection, it may be considered that a control operation has been issued based on the projection.
After the control operation sent by the user based on the projection is detected, the control operation sent by the user based on the projection can be determined, and a vehicle control function corresponding to the control operation is executed, so that the vehicle can be controlled outside the vehicle.
In the present invention, the step of executing a vehicle function corresponding to a control operation issued by a user based on projection includes:
S31, in a case where the user postures of one or more users are recognized as respectively pointing to any projection identifier, determining the control operation corresponding to the pointed projection identifier, and executing the vehicle function corresponding to the control operation.
In the present invention, the projection may contain at least one projection identification. Each projected identity may correspond to a control manner to which the vehicle function is mapped. In the case where the user operates the projection flag in a different control manner, it can be considered that the user has performed a control operation in a control manner mapped with the vehicle function. The projection identifier may include at least one of a text, a picture, and an animation, which is not limited in this disclosure.
In a case where the user needs to control the vehicle through the projection, the user may usually indicate an intention to point to a projection identifier through a physical action. For example, the user may step on the projection identifier, touch it with a hand, or point at it with a hand, and so forth. At this time, the user's posture changes accordingly, so that when the user postures of one or more users are recognized as respectively pointing to any projection identifier, it may be determined that the users need to control the vehicle through the projection; the control operation corresponding to the pointed projection identifier is then determined, and the vehicle function corresponding to the control operation is executed.
In the present invention, in a case that the user gestures of one or more users are recognized as pointing to any projection identifiers respectively, the step of determining a control operation corresponding to the pointed projection identifier and executing a vehicle function corresponding to the control operation includes:
S311, if the user postures of a plurality of users are recognized as respectively pointing to a plurality of projection identifiers corresponding to a multi-user control operation, determining that the control operation issued by the users is the multi-user control operation, and executing the vehicle function corresponding to the multi-user control operation.
In the present invention, a vehicle function may require a plurality of users to operate simultaneously. For example, where the vehicle function is vehicle unlocking, it may be required that a trainee driver and a driving instructor simultaneously control the unlocking. As another example, where the vehicle function is a group photo, it may be necessary for multiple persons to confirm before the photo is taken.
In this case, when a vehicle function corresponding to a multi-user control operation needs to be executed, a plurality of projection identifiers may be set for that vehicle function, so that each user can determine the projection identifier that the user needs to operate. Then, if the user postures of the users are recognized as pointing to the plurality of projection identifiers corresponding to the multi-user control operation, the users can be considered to have each operated a projection identifier and jointly issued a control operation pointing to the multi-user control operation. Therefore, the control operation issued by the users can be determined to be the multi-user control operation, and the vehicle function corresponding to it is executed, so that the vehicle can be controlled by multiple users at the same time.
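A minimal sketch of the multi-user check in step S311, under the assumption that each multi-user function lists the projection identifiers that must all be pointed at before it executes (identifier names are hypothetical):

```python
def multi_user_operation_ready(required_ids, pointed_ids):
    """True only when every projection identifier assigned to the
    multi-user control operation is currently being pointed at."""
    return set(required_ids) <= set(pointed_ids)
```

For the unlocking example, both the trainee's and the instructor's identifiers must appear among the pointed-at identifiers before the unlock executes.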
In a specific implementation, whether the user gesture points to the projection identifier or not can be detected by adopting a camera, an infrared detection device and the like, so as to determine whether the user needs to control the vehicle through the projection identifier or not.
In the present invention, the step of detecting whether the user gesture is directed to the projection identifier comprises:
S41, detecting whether the contact area between the user's hand or foot and the projection identifier is greater than a preset area threshold and whether the contact duration is greater than a first preset duration;
In the present invention, when the user indicates an intention to point to the projection identifier through a body action, the user may generally contact the projection identifier with a hand or foot. However, such contact may be unintentional, or it may reflect an intention to control the vehicle by operating the projection identifier. It is therefore necessary to further determine whether the user actually intends to touch the projection identifier in order to control the vehicle.
Generally, if the user intends to touch the projection identifier, the user will contact it more deliberately and maintain the contact for a longer time. Therefore, whether the contact area between the user's hand or foot and the projection identifier is greater than a preset area threshold and whether the contact duration is greater than the first preset duration can be detected to determine whether the user intends to touch the projection identifier.
The preset area threshold can be determined based on the area of the projection identifier, its projection position, and the user's likely operation mode. For example, if the projection identifier is projected onto the vehicle body and its area is small, the user is more likely to operate it by hand, and the preset area threshold may be set to a fraction at which most of the hand overlaps the projection identifier, such as one quarter, one third, one half, and the like. If the projection identifier is projected on the ground and its area is large, the user is more likely to operate it with a foot, and the preset area threshold may be set to a fraction at which most of the foot overlaps the projection identifier, such as one eighth, one sixth, and the like.
The first preset time duration may be determined according to actual needs, as long as it can be considered that the user has a definite intention to touch the projection identifier. E.g., 2s, 5s, etc.
S42, if the contact area between the user's hand or foot and the projection identifier is greater than the preset area threshold and the contact duration is greater than the first preset duration, determining that the user has touched the projection identifier.
In the present invention, if the contact area between the user's hand or foot and the projection identifier is greater than the preset area threshold and the contact duration is greater than the first preset duration, the user can be considered to have a definite intention to operate the projection identifier rather than having touched it by mistake. It can then be determined that the user has touched the projection identifier, and the vehicle function corresponding to the projection identifier is executed in response to the user's operation.
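The two-threshold check of steps S41 and S42 can be sketched as follows. The default values follow the examples in the text (a fraction of the identifier's area, and a duration of 2s); the measured inputs are assumed to come from a camera or infrared detection device.

```python
def contact_confirmed(contact_area, identifier_area, contact_seconds,
                      area_fraction=0.25, min_seconds=2.0):
    """Treat the touch as intentional only if the overlap covers more than
    `area_fraction` of the projected identifier and lasts longer than
    `min_seconds`; otherwise it is ignored as an accidental touch."""
    return (contact_area > area_fraction * identifier_area
            and contact_seconds > min_seconds)
```

For a ground projection operated by foot, `area_fraction` would be lowered (e.g. to one eighth) as described above.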
In the present invention, the step of detecting whether the user gesture is directed to the projection identifier comprises:
S51, detecting whether the extending direction of the user's hand, foot, or sight line points to the projection identifier and whether the user posture is maintained for longer than a second preset duration;
In the present invention, the projection identifier may be arranged at a place that the user cannot directly touch, such as a side wall or a ceiling. In this case, the user cannot operate the projection identifier by touching it, but can operate it through a specific posture. Specifically, the projection identifier may be operated by extending a hand or foot to point at it, or by keeping the eyes looking at it.
In this case, it may be detected whether the extending direction of the hand or foot or the sight line of the user is directed to a certain projection identifier, and whether the user gesture maintaining time length is greater than a second preset time length, so as to determine whether the projection identifier is being operated through a specific gesture.
The second preset duration may be determined according to actual needs; it is only necessary to determine a duration beyond which the user can be considered to have a clear intention to point to the projection identifier, e.g., 2s, 5s, etc.
S52, if the extending direction of the user's hand, foot, or sight line points to the projection identifier and the user posture is maintained for longer than the second preset duration, determining that the user posture points to the projection identifier.
In the present invention, if the extending direction of the user's hand, foot, or sight line points to the projection identifier and the posture is maintained for longer than the second preset duration, the user can be considered to have a definite intention to operate the projection identifier rather than pointing at it unintentionally. It can then be determined that the user's posture points to the projection identifier, and the vehicle function corresponding to the projection identifier is executed in response to the user's operation.
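A sketch of steps S51 and S52, reduced to 2-D geometry for illustration: a pointing ray from the hand, foot, or gaze must aim at the identifier within an angular tolerance (the 10° value is an assumption) and the posture must be held past the second preset duration.

```python
import math

def pointing_confirmed(origin, direction, identifier_pos,
                       hold_seconds, min_seconds=2.0, tolerance_deg=10.0):
    """True if the ray from `origin` along `direction` points at the
    projection identifier within `tolerance_deg`, and the posture is
    maintained for longer than `min_seconds`."""
    to_id = (identifier_pos[0] - origin[0], identifier_pos[1] - origin[1])
    angle = math.degrees(
        math.atan2(direction[1], direction[0])
        - math.atan2(to_id[1], to_id[0]))
    angle = abs((angle + 180) % 360 - 180)   # wrap difference into [0, 180]
    return angle <= tolerance_deg and hold_seconds > min_seconds
```

In practice the origin and direction would come from the camera or infrared detection mentioned above, extended to 3-D.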
The method further comprises the following steps:
S61, if the state of the vehicle component corresponding to the vehicle function changes, adjusting the projection content corresponding to the vehicle function to prompt the user that the projected vehicle function has changed.
In the present invention, the vehicle function mapped by the projected control mode may remain corresponding to one vehicle component, but that vehicle function may correspond to different available operations when the vehicle component is in different states. In this case, if the state of the vehicle component changes, the projected vehicle function may change accordingly. In order to prompt the user that the projected vehicle function has changed, the projection of the control mode mapped with the vehicle function may be maintained while the projection content corresponding to the vehicle function is adjusted. The user can thus learn from the change in the projection content that the projected vehicle function has changed.
Specifically, the projected content corresponding to the vehicle function may be matched to the state of the vehicle component. For example, in the case that the rear view mirror is in the folded state, the projection content corresponding to the vehicle function may accordingly show the pattern of the rear view mirror in the folded state.
Alternatively, the projected content corresponding to the vehicle function may be matched to the operation that may be performed by the vehicle component. For example, when the trunk is in a closed state, the projection content corresponding to the vehicle function may correspondingly show a pattern that the trunk can be opened.
If the state of the vehicle component changes, the projection content corresponding to the vehicle function can be adjusted to change either the state of the vehicle component shown by the projection content or the operation that can currently be performed on the vehicle component as shown by the projection content, so that the user can clearly know that the operation available through the control icon has changed.
As an example of the present invention, the projection may be used to perform the trunk opening and closing function. When the trunk is in the closed state, the projection content corresponding to the trunk opening and closing function can show a pattern for the trunk opening operation, so that the user knows that the trunk can currently be opened through the control icon. After the user touches the control icon with a foot to control the trunk to open, the trunk is in the opened state, and the projection content can change to a pattern showing the trunk closing operation, prompting the user that the trunk opening and closing function has changed and that the projection can now be used to control the trunk to close.
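The trunk example of step S61 can be sketched as a small state machine in which the projected content always shows the operation the trunk currently supports; the state and content strings are illustrative.

```python
class TrunkProjection:
    """Keeps the projected content in sync with the trunk's state."""

    def __init__(self):
        self.state = "closed"

    @property
    def content(self):
        # Show the operation that can currently be performed.
        return "open_trunk" if self.state == "closed" else "close_trunk"

    def operate(self):
        """Execute the projected function, then return the updated content."""
        self.state = "open" if self.state == "closed" else "closed"
        return self.content
```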
In the present invention, the step of executing a vehicle function corresponding to a control operation issued by a user based on projection includes:
S611, in response to a plurality of continuous control operations issued by the user for a vehicle function based on the projection, for each control operation, executing the vehicle function corresponding to the control operation to adjust the vehicle component, while adjusting the projection content corresponding to the vehicle function.
In the invention, a user can issue a plurality of times of continuous control operations for a vehicle function based on projection, so that vehicle components corresponding to the vehicle function can be continuously switched in different states. At this time, in response to a plurality of times of continuous control operations issued by a user to a vehicle function based on projection, for each control operation, the vehicle function corresponding to the control operation is executed to adjust a vehicle component, and meanwhile, projection content corresponding to the vehicle function is adjusted, so that the user can know how the vehicle component is adjusted correspondingly by each executed control operation based on projection.
For example, when the vehicle function corresponding to the projection is the window adjustment function, multiple continuous control operations sent by a user to the window adjustment function based on the projection may be responded, and for each control operation, the window adjustment function corresponding to the control operation may be executed to adjust the switching state of the window between closed, half-opened, and fully-opened.
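The cyclic switching in the window example can be sketched as follows; the state list and wrap-around order mirror the closed, half-opened, fully-opened states named above.

```python
WINDOW_STATES = ["closed", "half_open", "fully_open"]

def next_window_state(current):
    """Each successive control operation advances the window to the next
    state, wrapping back to the first state at the end."""
    i = WINDOW_STATES.index(current)
    return WINDOW_STATES[(i + 1) % len(WINDOW_STATES)]
```

After each transition the projection content would be updated (as in step S611) so the user sees which adjustment the next operation will perform.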
In the present invention, the step of executing a vehicle function corresponding to a control operation issued by a user based on projection includes:
S71, in response to the control operation issued by the user based on the projection, receiving an adjustment instruction issued by the user;
In the present invention, the projection may correspond to a vehicle function having at least two possible operations. For example, the air suspension can be controlled to rise or fall through the projection; the temperature in the vehicle can be raised or lowered; the lighting can be turned on or off; the rearview mirror angle can be increased or decreased; and the vehicle-mounted display screen can be controlled to display a navigation interface, a music playing interface, a video playing interface, and the like.
In this case, in order to determine how the user specifically needs the vehicle function to be performed, an adjustment instruction issued by the user through a gesture, voice, an electronic device held by the user, or the like may be further received in response to the control operation issued based on the projection, so as to determine how to control the vehicle through the vehicle function.
Specifically, the user may form a specific gesture by swinging the hand to issue the adjustment instruction. For example, while the user touches the projection corresponding to the air suspension control function with the foot, the user may also swing the hand up and down to issue a gesture for controlling the air suspension to ascend or descend, thereby forming an adjustment instruction for controlling the air suspension to ascend or descend.
The user may also issue the adjustment instruction directly by voice. For example, the user may say "mirror angle increase" while pointing a hand at the projection corresponding to the rearview mirror control function, thereby forming an adjustment instruction to increase the rearview mirror angle.
The user may also issue the adjustment instruction through an electronic device in his possession. For example, a user can touch the projection corresponding to the control function of the vehicle-mounted display screen through the foot part, and simultaneously select a music song desired to be played through the electronic device, so as to form an instruction for controlling the vehicle-mounted display screen to play the music.
S72, executing the vehicle function corresponding to the control operation according to the adjustment instruction.
In the embodiment of the invention, after the adjustment instruction is received, the vehicle function corresponding to the control operation can be executed according to the adjustment instruction, so that the vehicle is controlled through the projection to execute the vehicle function and adjust the vehicle component, with the manner of operating the vehicle component determined by the adjustment instruction.
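Steps S71 and S72 can be sketched as a dispatch in which the control operation selects the vehicle function and the adjustment instruction selects the direction; the function and instruction names are hypothetical.

```python
# Hypothetical table of adjustable functions and their valid instructions.
ADJUSTABLE = {
    "air_suspension": {"raise", "lower"},
    "mirror_angle": {"increase", "decrease"},
}

def execute_with_adjustment(function, instruction):
    """Execute `function` only in the direction named by `instruction`;
    an unrecognised instruction performs nothing."""
    allowed = ADJUSTABLE.get(function, set())
    if instruction not in allowed:
        return None
    return (function, instruction)
```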
In the present invention, the method further comprises:
s81, in response to the projection setting operation issued by the user, setting a vehicle function allowing projection and/or setting a projection area.
In the present invention, the user can set which vehicle functions are allowed to be projected and/or set the projection area according to the user's own needs, so that the user can control the vehicle from outside the vehicle according to the user's preferences.
Specifically, the user may select a vehicle function preferred to be used by the user from a plurality of preset vehicle functions through interaction with the vehicle-mounted display screen or through an electronic device in communication connection with the vehicle.
And then, the user can set the projection area according to actual needs. For example, the projection area is set as the ground of the corresponding area where the vehicle component is located, the projection area is set as a wall, the projection area is set as the vehicle component, and the like, which is not limited in the present invention.
In the present invention, the method comprises:
S91, in response to the control operation issued by the user based on the projection, projecting feedback display content in a projectable area outside the vehicle to prompt the user that the vehicle has detected the user's control operation.
In the present invention, the vehicle control function corresponding to the projection may be used to control a small vehicle component. For example, if the vehicle control function is used to control the opening and closing of the slow charging port, the user may not clearly notice that the slow charging port has opened or closed even after the vehicle has responded to the control operation. It is therefore difficult for the user to know whether the control operation issued based on the projection has been detected by the vehicle and whether the corresponding vehicle control function will be executed.
Therefore, after responding to the control operation sent by the user based on the projection, the feedback display content can be projected in the projectable area outside the vehicle to prompt the user that the vehicle has detected the control operation sent by the user based on the projection, and the vehicle can execute the vehicle control function corresponding to the control icon.
Specifically, the feedback display content may be displayed at the projection pointed to by the user, or in part or all of the projectable area, which is not limited in the present invention.
The feedback display content may include a light flash, a light color change, the display of designated text, the display of a designated pattern, and the like, which is not limited in the present invention.
As an example of the present invention, a user touches a projection on the ground with a foot. After determining that the control operation issued by the user based on the projection has been detected, the vehicle can project continuously flickering projection content in response, to prompt the user that the vehicle has detected the control operation issued based on the projection.
As another example of the present invention, a user points with a hand at a control icon located on a side wall. After determining that the control operation issued by the user based on the projection has been detected, the vehicle may project the text "xx function is being executed" in the side-wall area in response, to prompt the user that the vehicle has detected the control operation issued based on the projection.
As another example of the present invention, after detecting that the user's gaze is directed at the projection and determining that the control operation issued by the user based on the projection has been detected, the vehicle may display the feedback display content on the projection on the ceiling in response, dynamically changing the projected pattern to another pattern to prompt the user that the vehicle has detected the control operation issued based on the projection.
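The feedback behaviors in the examples above (flicker, text, pattern change) amount to a simple dispatch from feedback type to projector command. The following is a hedged sketch; the `Projector` class and the command strings are illustrative assumptions, not the patent's actual interface.

```python
class Projector:
    """Toy stand-in for the projection hardware driver."""

    def __init__(self):
        self.commands = []  # record of commands sent to the projector

    def project_feedback(self, feedback_type, detail=None):
        # Map each feedback display type named in the description
        # to a projector command.
        if feedback_type == "flash":
            cmd = "flash_projection"
        elif feedback_type == "color_change":
            cmd = f"set_color:{detail}"
        elif feedback_type == "text":
            cmd = f"show_text:{detail}"
        elif feedback_type == "pattern":
            cmd = f"show_pattern:{detail}"
        else:
            raise ValueError(f"unknown feedback type: {feedback_type}")
        self.commands.append(cmd)
        return cmd


# After detecting a control operation, the vehicle projects the
# side-wall text feedback, then the flickering feedback.
projector = Projector()
projector.project_feedback("text", "xx function is being executed")
projector.project_feedback("flash")
```

The dispatch keeps the feedback type configurable, matching the description's statement that the feedback form is not limited.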
By the vehicle control method, the control modes mapped to the vehicle functions are projected in the area where a vehicle component is located, and/or in a projectable area in the user's line-of-sight direction outside the vehicle; and in response to a control operation issued by the user based on the projection, the vehicle function corresponding to the control operation is executed. Therefore, the user can control the vehicle from outside the vehicle, which makes vehicle control more convenient and improves the user experience.
It is noted that, for simplicity of explanation, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will appreciate that the present invention is not limited by the order of acts, as some steps may, in accordance with the present invention, occur in other orders and/or concurrently. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 4, a block diagram of a vehicle embodiment of the present invention is shown, which may specifically include the following modules:
a projection module 401 for projecting a control manner mapped with a vehicle function in a projectable region outside the vehicle;
and the executing module 402 is used for responding to the control operation sent by the user based on the projection and executing the vehicle function corresponding to the control operation.
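The two modules of Fig. 4 can be sketched as a pair of cooperating classes: the projection module (401) exposes the control icons mapped to vehicle functions, and the execution module (402) runs the function matching the user's control operation. All class, icon, and function names below are illustrative assumptions.

```python
class ProjectionModule:
    """Sketch of module 401: projects control icons mapped to functions."""

    def __init__(self, mapping):
        self.mapping = mapping  # control icon -> vehicle function

    def projected_icons(self):
        # The icons that would be projected in the projectable area.
        return list(self.mapping)


class ExecutionModule:
    """Sketch of module 402: executes the function for a control operation."""

    def __init__(self, mapping):
        self.mapping = mapping
        self.executed = []  # history of executed vehicle functions

    def on_control_operation(self, icon):
        function = self.mapping[icon]
        self.executed.append(function)
        return function


# Both modules share one icon-to-function mapping.
mapping = {"trunk_icon": "open_trunk", "window_icon": "open_window"}
projection = ProjectionModule(mapping)
execution = ExecutionModule(mapping)
executed = execution.on_control_operation("trunk_icon")
```

The shared mapping is the key design point: the projected icon and the executed function stay consistent because both modules read the same table.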
Optionally, the apparatus further comprises:
and the projection adjusting module is used for adjusting the projection content corresponding to a vehicle function if the state of the vehicle component corresponding to the vehicle function changes, so as to prompt the user that the projected vehicle function has changed.
Optionally, the execution module includes:
the continuous execution sub-module is used for, in response to a plurality of consecutive control operations issued by a user for a vehicle function based on the projection, executing, for each control operation, the vehicle function corresponding to the control operation to adjust the vehicle component while adjusting the projection content corresponding to the vehicle function.
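The continuous execution sub-module can be sketched as follows: each repeated control operation both adjusts the vehicle component and refreshes the projected content so the projection tracks the component state. The `WindowController` name and the 25% step size are assumptions for illustration.

```python
class WindowController:
    """Sketch of the continuous execution sub-module for a window."""

    def __init__(self):
        self.opening = 0  # window opening, in percent
        self.projected_label = "window 0%"

    def handle_control_operation(self):
        # One control operation opens the window a further 25%,
        # then refreshes the projection content to match the new state.
        self.opening = min(100, self.opening + 25)
        self.projected_label = f"window {self.opening}%"


# Three consecutive control operations issued based on the projection.
window = WindowController()
for _ in range(3):
    window.handle_control_operation()
```

Updating the projection inside the same handler that moves the component is what keeps the projected content synchronized across consecutive operations.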
Optionally, the execution module includes:
the vehicle component projection sub-module is used for projecting the control modes mapped to the vehicle functions in the area where the vehicle component is located, and/or the sight line projection sub-module is used for projecting the control modes mapped to the vehicle functions in a projectable area in the user's line-of-sight direction outside the vehicle.
Optionally, the region projection sub-module includes:
the plane projection unit is used for projecting the control modes mapped to the vehicle functions on the same plane according to the vehicle body positions of the vehicle components around the vehicle, and/or the component projection unit is used for projecting the control modes mapped to the vehicle functions on the vehicle components corresponding to the control icons.
Optionally, the planar projection unit comprises:
and the corresponding projection subunit is used for projecting the control modes mapped by the vehicle functions on the same plane according to the vehicle body position distribution of the vehicle components and the control mode mapped by the vehicle functions corresponding to each vehicle component around the vehicle.
Optionally, the planar projection subunit includes:
and the classified projection subunit is used for projecting the control modes mapped by the vehicle functions on the same plane according to the vehicle body position distribution of the vehicle components and the control mode mapped by the similar adjacent vehicle components corresponding to the vehicle functions around the vehicle.
Optionally, the execution module includes:
and the gesture recognition sub-module is used for, in the case that the user gestures of one or more users are recognized as pointing to any projection identifier, determining the control operation corresponding to the pointed-to projection identifier and executing the vehicle function corresponding to the control operation.
Optionally, the gesture recognition sub-module comprises:
and the multi-user identification unit is used for, if the user gestures of a plurality of users are recognized as pointing respectively to the plurality of projection identifiers corresponding to a multi-user control operation, determining the control operation issued by the users as the multi-user control operation and executing the vehicle function corresponding to the multi-user control operation.
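The multi-user identification logic can be sketched as a lookup: a multi-user control operation is defined by the set of projection identifiers that must be pointed at simultaneously, and when users' gestures cover exactly that set, the operation resolves to a single vehicle function. The names `MULTI_USER_OPERATIONS` and `resolve_operation`, and the identifiers themselves, are assumptions.

```python
# A multi-user operation is keyed by the (frozen) set of projection
# identifiers that must be pointed at at the same time.
MULTI_USER_OPERATIONS = {
    frozenset({"tailgate_left", "tailgate_right"}): "open_tailgate",
}


def resolve_operation(pointed_identifiers):
    """Return the vehicle function for the identifiers users point at."""
    pointed = frozenset(pointed_identifiers)
    if pointed in MULTI_USER_OPERATIONS:
        # All identifiers of a multi-user operation are covered:
        # execute it once, as a single multi-user control operation.
        return MULTI_USER_OPERATIONS[pointed]
    if len(pointed) == 1:
        # Single-user case: one icon maps to one function.
        return f"function_for_{next(iter(pointed))}"
    return None  # no matching control operation


# Two users point at the left and right tailgate identifiers.
result = resolve_operation(["tailgate_left", "tailgate_right"])
```

Using a frozenset as the key makes the match order-independent, which fits the description: it does not matter which user points at which identifier.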
Optionally, the execution module includes:
the instruction receiving submodule is used for responding to control operation sent by a user based on projection and receiving an adjustment instruction sent by the user;
and the instruction adjusting submodule is used for executing the vehicle function corresponding to the control operation according to the adjusting instruction.
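The instruction-receiving and instruction-adjusting sub-modules together form a two-step flow: the control operation selects a vehicle function, and a follow-up adjustment instruction from the user parameterizes how it is executed. The function name, the instruction format, and the `open_to_` prefix below are illustrative assumptions (requires Python 3.9+ for `str.removeprefix`).

```python
def execute_with_adjustment(control_operation, adjustment_instruction):
    """Execute the vehicle function selected by the control operation,
    adjusted according to the user's follow-up instruction."""
    if control_operation == "open_window":
        # The adjustment instruction supplies the target opening,
        # e.g. "open_to_50" means open the window to 50%.
        target = int(adjustment_instruction.removeprefix("open_to_"))
        return ("window", target)
    raise ValueError(f"unsupported control operation: {control_operation}")


# User issues the projection-based control operation, then refines it.
action = execute_with_adjustment("open_window", "open_to_50")
```

Separating selection from adjustment mirrors the two sub-modules: the first receives the instruction, the second executes the function with it.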
Optionally, the apparatus further comprises:
and the area setting module is used for setting the vehicle function allowing projection and/or setting the projection area in response to the projection setting operation sent by the user.
Optionally, the apparatus comprises:
and the feedback module is used for, in response to a control operation issued by the user based on the projection, projecting feedback display content in a projectable area outside the vehicle so as to prompt the user that the vehicle has detected the user's control operation.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The present invention also provides a vehicle comprising:
one or more processors; and
one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the vehicle to perform the method of the present invention.
The present invention also provides one or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause the processors to perform the methods of the present invention.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention has been described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The above detailed description is provided for a vehicle control method and a vehicle, and the principle and the embodiment of the present invention are explained by applying specific examples, and the description of the above embodiments is only used to help understanding the method and the core idea of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (15)

1. A vehicle control method characterized by comprising:
projecting a control mode mapped with a vehicle function in a projectable area outside the vehicle;
and responding to the control operation sent by the user based on the projection, and executing the vehicle function corresponding to the control operation.
2. The method of claim 1, further comprising:
and if the state of the vehicle component corresponding to the vehicle function changes, adjusting the projection content corresponding to the vehicle function to prompt the user that the projected vehicle function changes.
3. The method of claim 2, wherein the step of executing the vehicle function corresponding to the control operation in response to the control operation issued by the user based on the projection comprises:
responding to a plurality of times of continuous control operation sent by a user to a vehicle function based on projection, and executing the vehicle function corresponding to the control operation to adjust a vehicle component and simultaneously adjusting projection content corresponding to the vehicle function for each time of the control operation.
4. The method of claim 1, wherein the step of projecting the control pattern mapped to the vehicle function at the projectable region outside the vehicle comprises:
the control modes mapped to the vehicle functions are projected in an area where the vehicle component is located, and/or the control modes mapped to the vehicle functions are projected in a projectable area in a user's line of sight direction outside the vehicle.
5. The method of claim 4, wherein the step of projecting the control pattern mapped to the vehicle function in the area of the vehicle component comprises:
and projecting the control modes mapped with the vehicle functions on the vehicle parts corresponding to the control icons.
6. The method of claim 5, wherein the step of projecting the control pattern mapped to the vehicle function around the vehicle according to the body position distribution of the vehicle component in the same plane comprises:
and projecting the control modes mapped by the vehicle functions on the same plane according to the vehicle body position distribution of the vehicle components and the control mode mapped by the vehicle functions corresponding to each vehicle component around the vehicle.
7. The method of claim 5, wherein the step of projecting the control pattern mapped to the vehicle function around the vehicle according to the body position distribution of the vehicle component in the same plane comprises:
and projecting the control modes mapped by the vehicle functions on the same plane according to the vehicle body position distribution of the vehicle components and the control mode mapped by the similar adjacent vehicle components corresponding to the vehicle functions around the vehicle.
8. The method of claim 1, wherein the step of executing the vehicle function corresponding to the control operation in response to the control operation issued by the user based on the projection comprises:
and under the condition that the user gestures of one or more users are recognized to respectively point to any projection identifier, determining the control operation corresponding to the pointed projection identifier, and executing the vehicle function corresponding to the control operation.
9. The method according to claim 8, wherein in the case that the user gestures of one or more users are recognized to be respectively directed to any one of the projected identifiers, the step of determining the control operation corresponding to the directed projected identifier, and executing the vehicle function corresponding to the control operation comprises:
and if the user postures of the users are recognized to respectively point to a plurality of projection identifications corresponding to the multi-user control operation, determining the control operation sent by the users as the multi-user control operation, and executing the vehicle function corresponding to the multi-user control operation.
10. The method of claim 1, wherein the step of executing the vehicle function corresponding to the control operation in response to the control operation issued by the user based on the projection comprises:
responding to control operation sent by a user based on projection, and receiving an adjustment instruction sent by the user;
and executing the vehicle function corresponding to the control operation according to the adjustment instruction.
11. The method of claim 1, further comprising:
in response to a projection setting operation issued by a user, a vehicle function allowing projection is set and/or a projection area is set.
12. The method according to claim 1, characterized in that it comprises:
and in response to the control operation sent by the user based on the projection, projecting feedback display content in a projectable area outside the vehicle to prompt the user that the vehicle has detected the control operation of the user.
13. A vehicle, characterized by comprising:
the projection module is used for projecting a control mode mapped with the vehicle function in a projectable area outside the vehicle;
and the execution module is used for responding to the control operation sent by the user based on the projection and executing the vehicle function corresponding to the control operation.
14. A vehicle, characterized by comprising:
one or more processors; and
one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the vehicle to perform the method of any of claims 1-12.
15. One or more machine readable media having instructions stored thereon that, when executed by one or more processors, cause the processors to perform the method of any of claims 1-12.
CN202110692567.XA 2021-06-22 2021-06-22 Vehicle control method and vehicle Pending CN113247007A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110692567.XA CN113247007A (en) 2021-06-22 2021-06-22 Vehicle control method and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110692567.XA CN113247007A (en) 2021-06-22 2021-06-22 Vehicle control method and vehicle

Publications (1)

Publication Number Publication Date
CN113247007A true CN113247007A (en) 2021-08-13

Family

ID=77189161

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110692567.XA Pending CN113247007A (en) 2021-06-22 2021-06-22 Vehicle control method and vehicle

Country Status (1)

Country Link
CN (1) CN113247007A (en)

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060145825A1 (en) * 2005-01-05 2006-07-06 Mccall Clark E Virtual keypad for vehicle entry control
DE102008021989A1 (en) * 2008-05-02 2008-12-18 Daimler Ag Device for locking and unlocking unit of electronic conditional access system secured vehicle door for vehicle, has unlocking sensor that is formed as project control panel for unlocking vehicle door
CN101519935A (en) * 2007-06-01 2009-09-02 通用汽车环球科技运作公司 Arms full vehicle closure activation apparatus and method
JP2016029532A (en) * 2014-07-25 2016-03-03 小島プレス工業株式会社 User interface
CN105625857A (en) * 2015-12-25 2016-06-01 北京新能源汽车股份有限公司 Control device and method for vehicle window glass and vehicle
CN205971284U (en) * 2016-08-05 2017-02-22 威马汽车技术有限公司 Virtual button carries out device controlled to vehicle through operation
CN106437396A (en) * 2016-09-23 2017-02-22 奇瑞汽车股份有限公司 Automobile window glass lifting control system and automobile
US9616802B1 (en) * 2015-11-02 2017-04-11 AISIN Technical Center of America, Inc. Apparatus and method to visually communicate with a vehicle
US20170106836A1 (en) * 2014-03-26 2017-04-20 Magna Mirrors Of America, Inc. Vehicle function control system using sensing and icon display module
WO2017084286A1 (en) * 2015-11-20 2017-05-26 广景视睿科技(深圳)有限公司 Multi-interactive projection system and method
CN108204187A (en) * 2016-12-19 2018-06-26 大众汽车(中国)投资有限公司 For the method and apparatus of the boot of unlocking vehicle
CN108399636A (en) * 2017-02-08 2018-08-14 现代自动车株式会社 Projection orientation amendment system for vehicle
CN108482386A (en) * 2018-03-30 2018-09-04 北京新能源汽车股份有限公司 Vehicular interaction system and method for vehicle control
US20190164344A1 (en) * 2016-08-18 2019-05-30 Apple Inc. System and method for interactive scene projection
US20190217715A1 (en) * 2018-01-15 2019-07-18 Ford Global Technologies, Llc Ev charging projector
US20200055397A1 (en) * 2016-11-03 2020-02-20 Visteon Global Technologies, Inc. User interface and method for the input and output of information in a vehicle
CN110825271A (en) * 2019-11-13 2020-02-21 一汽轿车股份有限公司 Vehicle-mounted AR holographic projection interaction device
US20210070221A1 (en) * 2016-10-20 2021-03-11 Google Llc Automated pacing of vehicle operator content interaction
CN112837407A (en) * 2021-01-22 2021-05-25 中汽创智科技有限公司 Intelligent cabin holographic projection system and interaction method thereof
CN112959945A (en) * 2021-03-02 2021-06-15 广州小鹏汽车科技有限公司 Vehicle window control method and device, vehicle and storage medium

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060145825A1 (en) * 2005-01-05 2006-07-06 Mccall Clark E Virtual keypad for vehicle entry control
CN101519935A (en) * 2007-06-01 2009-09-02 通用汽车环球科技运作公司 Arms full vehicle closure activation apparatus and method
DE102008021989A1 (en) * 2008-05-02 2008-12-18 Daimler Ag Device for locking and unlocking unit of electronic conditional access system secured vehicle door for vehicle, has unlocking sensor that is formed as project control panel for unlocking vehicle door
US20170106836A1 (en) * 2014-03-26 2017-04-20 Magna Mirrors Of America, Inc. Vehicle function control system using sensing and icon display module
JP2016029532A (en) * 2014-07-25 2016-03-03 小島プレス工業株式会社 User interface
US20170120798A1 (en) * 2015-11-02 2017-05-04 AISIN Technical Center of America, Inc. Apparatus and method to visually communicate with a vehicle
US9616802B1 (en) * 2015-11-02 2017-04-11 AISIN Technical Center of America, Inc. Apparatus and method to visually communicate with a vehicle
WO2017084286A1 (en) * 2015-11-20 2017-05-26 广景视睿科技(深圳)有限公司 Multi-interactive projection system and method
CN105625857A (en) * 2015-12-25 2016-06-01 北京新能源汽车股份有限公司 Control device and method for vehicle window glass and vehicle
CN205971284U (en) * 2016-08-05 2017-02-22 威马汽车技术有限公司 Virtual button carries out device controlled to vehicle through operation
US20190164344A1 (en) * 2016-08-18 2019-05-30 Apple Inc. System and method for interactive scene projection
CN106437396A (en) * 2016-09-23 2017-02-22 奇瑞汽车股份有限公司 Automobile window glass lifting control system and automobile
US20210070221A1 (en) * 2016-10-20 2021-03-11 Google Llc Automated pacing of vehicle operator content interaction
US20200055397A1 (en) * 2016-11-03 2020-02-20 Visteon Global Technologies, Inc. User interface and method for the input and output of information in a vehicle
CN108204187A (en) * 2016-12-19 2018-06-26 大众汽车(中国)投资有限公司 For the method and apparatus of the boot of unlocking vehicle
CN108399636A (en) * 2017-02-08 2018-08-14 现代自动车株式会社 Projection orientation amendment system for vehicle
US20190217715A1 (en) * 2018-01-15 2019-07-18 Ford Global Technologies, Llc Ev charging projector
CN108482386A (en) * 2018-03-30 2018-09-04 北京新能源汽车股份有限公司 Vehicular interaction system and method for vehicle control
CN110825271A (en) * 2019-11-13 2020-02-21 一汽轿车股份有限公司 Vehicle-mounted AR holographic projection interaction device
CN112837407A (en) * 2021-01-22 2021-05-25 中汽创智科技有限公司 Intelligent cabin holographic projection system and interaction method thereof
CN112959945A (en) * 2021-03-02 2021-06-15 广州小鹏汽车科技有限公司 Vehicle window control method and device, vehicle and storage medium

Similar Documents

Publication Publication Date Title
US11366513B2 (en) Systems and methods for user indication recognition
CN104204729B (en) User terminal apparatus and its control method
JP5666264B2 (en) In-vehicle device control system and method using augmented reality
JP4333697B2 (en) Vehicle display device
KR102029842B1 (en) System and control method for gesture recognition of vehicle
CN208207729U (en) A kind of vehicle operating system and a kind of electric vehicle for operating vehicle
EP3260331A1 (en) Information processing device
CN111002946B (en) Vehicle control method and control system
US10649587B2 (en) Terminal, for gesture recognition and operation command determination, vehicle having the same and method for controlling the same
EP2881878A2 (en) Vehicle control by means of gestural input on external or internal surface
EP2876529A1 (en) Unlocking mobile device with various patterns on black screen
EP3395600A1 (en) In-vehicle device
US9052750B2 (en) System and method for manipulating user interface by 2D camera
KR20160024899A (en) robot cleaner
KR20150028152A (en) robot cleaner system and a control method of the same
KR20140079162A (en) System and method for providing a user interface using finger start points shape recognition in a vehicle
CN104881117B (en) A kind of apparatus and method that speech control module is activated by gesture identification
US11052761B2 (en) Vehicle display system using gaze interactions with multiple display areas
US20180316911A1 (en) Information processing apparatus
US20140379212A1 (en) Blind control system for vehicle
WO2022062491A1 (en) Vehicle-mounted smart hardware control method based on smart cockpit, and smart cockpit
KR20180036556A (en) Operating apparatus for vehicle
KR20200093091A (en) Terminal device, vehicle having the same and method for controlling the same
WO2022267354A1 (en) Human-computer interaction method and apparatus, and electronic device and storage medium
US20210087869A1 (en) Method and Device for the Non-Mechanical Activation of a Closure Arrangement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210813
