CN109969125B - Human-vehicle interaction method and system during vehicle locking and vehicle


Info

Publication number: CN109969125B (granted from application CN201910267504.2A)
Authority: CN (China)
Prior art keywords: real-time, vehicle, intensity, light
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN109969125A
Inventors: 陆宇哲, 叶奕航, 任建华
Current Assignee: Guangzhou Xiaopeng Motors Technology Co Ltd
Original Assignee: Guangzhou Xiaopeng Motors Technology Co Ltd
Application filed by Guangzhou Xiaopeng Motors Technology Co Ltd
Priority to CN201910267504.2A
Publication of CN109969125A
Application granted
Publication of CN109969125B

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00 - Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/10 - Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device
    • B60R25/1004 - Alarm systems characterised by the type of sensor, e.g. current sensing means

Abstract

A human-vehicle interaction method and system during vehicle locking, and a vehicle, are provided. The method comprises the following steps: detecting a real-time distance change between a vehicle and a pedestrian when a vehicle locking instruction is detected; and controlling the vehicle to output a target light effect and a target sound effect in response to the real-time distance change. By implementing the embodiments of the invention, the user can be prompted that the vehicle is locked through continuous light and sound effects and does not need to confirm again, so the operations the user performs to confirm locking are reduced and the user experience is improved.

Description

Human-vehicle interaction method and system during vehicle locking and vehicle
Technical Field
The invention relates to the technical field of intelligent automobiles, in particular to a human-vehicle interaction method and system during vehicle locking and a vehicle.
Background
With the growth of car ownership, vehicles have become a common means of daily transportation. To keep the vehicle secure, the driver usually performs a locking operation after parking: the door locks are triggered by a key so that the doors cannot be opened from outside the vehicle.
In practice, however, because the locking operation is so simple, the driver easily forgets whether it has actually been performed. For safety, most drivers walk back to the door and pull the door handle to confirm that the vehicle is locked, or press the lock button on the key again. This brings unnecessary trouble to the driver, makes the locking procedure cumbersome, and degrades the user experience.
Disclosure of Invention
The embodiments of the invention disclose a human-vehicle interaction method and system during vehicle locking, and a vehicle, which can reduce the operations a user performs to confirm that the vehicle is locked and improve the user experience.
A first aspect of the embodiments of the invention discloses a human-vehicle interaction method during vehicle locking, which comprises the following steps:
detecting a real-time distance change between a vehicle and a pedestrian when a vehicle locking instruction is detected;
controlling the vehicle to output a target light effect and a target sound effect in response to the real-time distance change.
As an alternative implementation, in the first aspect of the embodiments of the present invention, the controlling the vehicle to output a target light effect and a target sound effect in response to the real-time distance change includes:
controlling the vehicle to output the target light effect and the target sound effect when a change in the real-time distance between the vehicle and the pedestrian is detected.
As an alternative implementation, in the first aspect of the embodiments of the present invention, the controlling the vehicle to output a target light effect and a target sound effect in response to the real-time distance change includes:
determining real-time light intensity corresponding to the real-time distance change and real-time sound intensity corresponding to the real-time distance change;
controlling the vehicle to output a target light effect according to the real-time light intensity;
controlling the vehicle to output a target sound effect according to the real-time sound intensity;
wherein, when the real-time distance increases, the real-time light intensity and the real-time sound intensity both decrease;
or, when the real-time distance increases, the real-time light intensity increases and the real-time sound intensity decreases;
or, when the real-time distance increases, the real-time light intensity decreases and the real-time sound intensity increases;
or, when the real-time distance increases, the real-time light intensity and the real-time sound intensity both increase;
or, when the real-time distance decreases, the real-time light intensity and the real-time sound intensity both increase;
or, when the real-time distance decreases, the real-time light intensity decreases and the real-time sound intensity increases;
or, when the real-time distance decreases, the real-time light intensity increases and the real-time sound intensity decreases;
or, when the real-time distance decreases, the real-time light intensity and the real-time sound intensity both decrease.
As an alternative implementation, in the first aspect of the embodiments of the present invention, the front of the vehicle is provided with a through-type light strip; and the controlling the vehicle to output the target light effect according to the real-time light intensity includes:
controlling the through-type light strip to output the target light effect according to the real-time light intensity.
As an alternative implementation, in the first aspect of the embodiment of the present invention, before the controlling the vehicle to output a target light effect and a target sound effect in response to the real-time distance change, the method further includes:
judging whether the current date is a preset specific date or not;
if yes, determining the light effect corresponding to the specific date as a target light effect, and determining the sound effect corresponding to the specific date as a target sound effect;
if not, determining the light effect corresponding to the vehicle locking instruction as a target light effect, and determining the sound effect corresponding to the vehicle locking instruction as a target sound effect;
the light effect corresponding to the specific date is different from the light effect corresponding to the vehicle locking instruction; the sound effect corresponding to the specific date is different from the sound effect corresponding to the vehicle locking instruction.
As an alternative implementation, in the first aspect of the embodiment of the present invention, the detecting a real-time distance change between the vehicle and the pedestrian includes:
detecting a wireless signal transmitted by an electronic key of a vehicle;
determining real-time distance change between the vehicle and the pedestrian according to the strength of the wireless signal;
alternatively, the detecting a real-time distance change between the vehicle and the pedestrian includes:
detecting the illumination intensity of the environment where the vehicle is located, and judging whether the illumination intensity is higher than a preset light intensity threshold value or not;
if the illumination intensity is higher than the light intensity threshold, controlling a camera of the vehicle to capture an image, and identifying the real-time distance change between the vehicle and the pedestrian according to the image;
and if the illumination intensity is not higher than the light intensity threshold, detecting the real-time distance change between the vehicle and the pedestrian by using a ranging sensor of the vehicle.
As an alternative implementation, in the first aspect of the embodiment of the present invention, after the controlling the vehicle to output a target light effect and a target sound effect in response to the real-time distance change, the method further includes:
and if the real-time distance change indicates that the distance between the pedestrian and the vehicle is reduced, controlling the vehicle to stop outputting the target light effect and the target sound effect.
The second aspect of the embodiments of the present invention discloses a human-vehicle interaction system, which includes:
the detection unit is used for detecting the real-time distance change between the vehicle and the pedestrian when the vehicle locking instruction is detected;
a control unit for controlling the vehicle to output a target light effect and a target sound effect in response to the real-time distance change.
As an alternative implementation, in the second aspect of the embodiments of the present invention:
the control unit is specifically configured to control the vehicle to output the target light effect and the target sound effect when a change in the real-time distance between the vehicle and the pedestrian is detected.
As an optional implementation manner, in a second aspect of the embodiment of the present invention, the control unit includes:
the first determining subunit is configured to determine the real-time light intensity corresponding to the real-time distance change and the real-time sound intensity corresponding to the real-time distance change;
the first control subunit is configured to control the vehicle to output the target light effect according to the real-time light intensity;
the second control subunit is configured to control the vehicle to output the target sound effect according to the real-time sound intensity;
wherein, when the real-time distance increases, the real-time light intensity and the real-time sound intensity both decrease;
or, when the real-time distance increases, the real-time light intensity increases and the real-time sound intensity decreases;
or, when the real-time distance increases, the real-time light intensity decreases and the real-time sound intensity increases;
or, when the real-time distance increases, the real-time light intensity and the real-time sound intensity both increase;
or, when the real-time distance decreases, the real-time light intensity and the real-time sound intensity both increase;
or, when the real-time distance decreases, the real-time light intensity decreases and the real-time sound intensity increases;
or, when the real-time distance decreases, the real-time light intensity increases and the real-time sound intensity decreases;
or, when the real-time distance decreases, the real-time light intensity and the real-time sound intensity both decrease.
As an alternative implementation, in the second aspect of the embodiments of the present invention, the vehicle is provided with a through-type light strip;
and the first control subunit is specifically configured to control the through-type light strip to output the target light effect according to the real-time light intensity.
As an optional implementation manner, in the second aspect of the embodiment of the present invention, the system further includes:
the second judging unit is used for judging whether the current date is a preset specific date or not;
an effect determination unit configured to determine, when the second determination unit determines that the current date is the specific date, a light effect corresponding to the specific date as a target light effect, and determine a sound effect corresponding to the specific date as a target sound effect; when the second judging unit judges that the current date is not the specific date, determining the light effect corresponding to the vehicle locking instruction as a target light effect and determining the sound effect corresponding to the vehicle locking instruction as a target sound effect;
the light effect corresponding to the specific date is different from the light effect corresponding to the vehicle locking instruction; the sound effect corresponding to the specific date is different from the sound effect corresponding to the vehicle locking instruction.
As an optional implementation manner, in the second aspect of the embodiment of the present invention, the manner of detecting the real-time distance change between the vehicle and the pedestrian by the detection unit is specifically:
the detection unit is used for detecting a wireless signal sent by an electronic key of a vehicle and determining real-time distance change between the vehicle and a pedestrian according to the intensity of the detected wireless signal;
or the detection unit is used for detecting the illumination intensity of the environment where the vehicle is located and judging whether the illumination intensity is higher than a preset light intensity threshold value or not; when the illumination intensity is judged to be higher than the light intensity threshold value, controlling a camera of the vehicle to shoot an image, and identifying real-time distance change between the vehicle and the pedestrian according to the image; and when the illumination intensity is judged not to be higher than the light intensity threshold value, detecting the real-time distance change between the vehicle and the pedestrian by using the distance measuring sensor of the vehicle.
As an alternative implementation, in the second aspect of the embodiment of the present invention, the control unit is further configured to, after controlling the vehicle to output a target light effect and a target sound effect in response to the real-time distance change, control the vehicle to stop outputting the target light effect and the target sound effect if the real-time distance change indicates that the distance between the pedestrian and the vehicle decreases.
The third aspect of the embodiment of the invention discloses a vehicle which comprises any one of the human-vehicle interaction systems disclosed in the second aspect of the embodiment of the invention.
A fourth aspect of the present invention discloses a computer-readable storage medium storing a computer program, wherein the computer program causes a computer to execute any one of the methods disclosed in the first aspect of the embodiments of the present invention.
A fifth aspect of the embodiments of the present invention discloses a computer program product, which, when running on a computer, causes the computer to execute any one of the methods disclosed in the first aspect of the embodiments of the present invention.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
according to the embodiment of the invention, when the vehicle locking instruction is detected, the light effect and the sound effect are output in response to the real-time distance change between the vehicle and the person. That is, the light effect and sound effect output by the vehicle are varied with the distance. As the user moves away from the vehicle, there may be a course of changes in light and sound effects rather than an instantaneous alert effect. Therefore, the vehicle can prompt the user that the vehicle is locked through continuous light effect and sound effect, and the user does not need to confirm again, so that the operation of confirming the vehicle locking by the user can be reduced, and the user experience is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic structural diagram of a human-vehicle interaction system disclosed by an embodiment of the invention;
FIG. 2 is a schematic flow chart of a human-vehicle interaction method during vehicle locking according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating exemplary control logic for a light effect according to an embodiment of the present invention;
FIG. 4 is a flow chart illustrating another method for human-vehicle interaction when a vehicle is locked according to an embodiment of the present invention;
fig. 5 is an exemplary diagram of a through-type light strip according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of another human-vehicle interaction system disclosed by the embodiment of the invention;
fig. 7 is a schematic structural diagram of another human-vehicle interaction system disclosed in the embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It is to be noted that the terms "comprises" and "comprising" and any variations thereof in the embodiments and drawings of the present invention are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
The embodiments of the invention disclose a human-vehicle interaction method and system during vehicle locking, and a vehicle, which can reduce the operations a user performs to confirm that the vehicle is locked and improve the user experience. The details are described below.
In order to better understand the human-vehicle interaction method, system and vehicle disclosed in the embodiments of the present invention, a system architecture applicable to the embodiments is described first. Referring to fig. 1, fig. 1 is a schematic structural diagram of a human-vehicle interaction system according to an embodiment of the present invention. As shown in fig. 1, the system may include: a Central Gateway (CGW), a body controller (Body Control Module, BCM), an Acoustic Vehicle Alerting System (AVAS) controller for low-speed driving alerts, a lamp controller, a vehicle-mounted central control screen controller (Central Display Unit, CDU), a remote control module (Telematics Box, T-Box), a camera, an in-vehicle intelligent unit, and the like.
Each of the above controllers may be an Electronic Control Unit (ECU) provided in the vehicle. The central gateway CGW connects the controllers and sensors described above, establishing interconnections between the various networks within the vehicle and transmitting data between them. The body controller BCM controls electrically actuated equipment in the vehicle, such as the wipers, door locks, electric windows and sunroof; the AVAS controller controls the loudspeaker to play sound; the lamp controller controls the lamps to light up. The AVAS controller, the body controller BCM and the lamp controller may be connected to the central gateway CGW via a body control bus (Body CAN, BCAN).
The vehicle-mounted central control screen controller CDU performs human-machine interaction with the user so that the user can control functions such as the air conditioner and the sound system in the vehicle. A mobile communication module can be arranged in the remote control module T-BOX; the module can support communication networks such as 4G, 5G and Bluetooth, so that mobile terminals such as smartphones can remotely control the vehicle through the T-BOX. The in-vehicle intelligent unit performs data computation to realize functions such as image recognition. The vehicle-mounted central control screen controller CDU, the remote control module T-BOX and the in-vehicle intelligent unit can be connected to the central gateway CGW through an ICAN bus (independent CAN-bus Application Protocol, ICAN); the camera, which is used for capturing images, can also be connected to the central gateway CGW.
Example one
Referring to fig. 2, fig. 2 is a schematic flow chart illustrating a human-vehicle interaction method during vehicle locking according to an embodiment of the present invention. The method shown in fig. 2 is suitable for the human-vehicle interaction system shown in fig. 1. As shown in fig. 2, the human-vehicle interaction method during vehicle locking may include the following steps:
201. the human-vehicle interaction system detects real-time distance change between a vehicle and a pedestrian when detecting a vehicle locking instruction.
In the embodiment of the present invention, the vehicle locking instruction may be triggered by a user, and the ways of triggering it include, but are not limited to, the following: 1. the user triggers the locking instruction by pressing the lock button of the electronic key; 2. the user triggers the locking instruction through an application (APP) installed on a mobile device such as a smartphone or tablet; 3. when the vehicle is detected to be switched off and in a parking state (P gear engaged), the user triggers the locking instruction automatically by closing the vehicle door.
Further, as an optional implementation manner, the manner of detecting the real-time distance change between the vehicle and the pedestrian may specifically be:
detecting a wireless signal transmitted by the electronic key of the vehicle, and determining the real-time distance between the vehicle and the pedestrian according to the strength of the wireless signal, so as to determine the real-time distance change between them. The electronic key and the vehicle may exchange wireless signals based on Radio Frequency (RF) or Near Field Communication (NFC); the embodiments of the present invention are not limited in this respect. From the propagation characteristics of wireless signals, the strength of the key's signal received by the vehicle decreases as the distance between the vehicle and the electronic key increases. Specifically, the real-time distance between the vehicle and the electronic key can be calculated from a propagation model of the wireless signal in air and the signal strength received in real time. The pedestrian carrying the electronic key is generally the driver and the person who most needs the locking prompt, so the real-time distance between the vehicle and the pedestrian can be approximated by the real-time distance between the vehicle and the electronic key.
Or, as another optional implementation, the manner of detecting the real-time distance change between the vehicle and the pedestrian may specifically be:
and controlling a camera of the vehicle to shoot an image, and identifying the real-time distance between the vehicle and the pedestrian according to the image so as to determine the real-time distance change between the vehicle and the pedestrian. The camera can be a camera which is positioned on a front windshield of the vehicle and is arranged towards the driving direction, and can also be a camera arranged on the roof of the vehicle; the camera arranged at the top of the vehicle is connected with the roof supporting device, the roof supporting device comprises a mechanical arm, and the mechanical arm can drive the camera to lift and rotate, so that the warp taking range of the camera can comprise the environment around the vehicle. Specifically, a certain target pedestrian can be tracked in a multi-frame image obtained by continuous shooting, and the depth information of the target pedestrian from the vehicle can be recovered from the image, so that the depth information can be used as the real-time distance between the vehicle and the pedestrian. Optionally, when the image includes a plurality of pedestrians, the identity of the driver of the vehicle may be obtained, the driver of the vehicle is identified from the plurality of pedestrians by performing identity recognition on the plurality of pedestrians in the image, and the real-time distance between the vehicle and the driver of the vehicle is detected.
Or, as another alternative, the manner of detecting the real-time distance change between the vehicle and the pedestrian may be:
the real-time distance between the vehicle and the pedestrian is detected by a ranging sensor of the vehicle, so that the real-time distance change between the vehicle and the pedestrian is determined. The distance measuring sensor of the vehicle may include an infrared distance measuring sensor, a thermal imaging distance measuring sensor, an ultrasonic distance measuring sensor, a laser distance measuring sensor, and the like, and the embodiment of the present invention is not limited. Specifically, the ranging sensor may detect a distance of an obstacle in the periphery of the vehicle with respect to the vehicle, wherein if there is a constantly changing distance data among the plurality of detected distance data, the constantly changing distance data may be determined as a real-time distance between the vehicle and the pedestrian.
In the above embodiments, on the one hand, the ranging sensor can only return the distance of some obstacle from the vehicle and can hardly determine whether that obstacle is a pedestrian; on the other hand, the camera can locate the pedestrian accurately through image recognition but is easily affected by ambient light, and it is difficult to capture clear images at night or in scenes such as underground parking lots, which makes image recognition much harder. Preferably, the ranging sensor and the camera can therefore be fused to detect the real-time distance between the vehicle and the pedestrian. Specifically, this may be done as follows:
detecting the illumination intensity of the environment where the vehicle is located, and judging whether the detected illumination intensity is higher than a preset light intensity threshold; if it is higher, the ambient light is considered sufficient, so the vehicle's camera is controlled to capture an image and the real-time distance change between the vehicle and the pedestrian is identified from the image (detection by the camera); if it is not higher, the ambient light is considered dim, so the ranging sensor of the vehicle is used to detect the real-time distance change between the vehicle and the pedestrian (detection by the ranging sensor).
Further optionally, among the three detection modes (electronic key, camera and ranging sensor), the real-time distance change between the vehicle and the pedestrian can be detected preferentially from the signal strength of the electronic key: this mode consumes little power, is barely affected by light and other environmental factors, and is relatively reliable. If no wireless signal transmitted by the electronic key can be received, the illumination intensity of the environment where the vehicle is located can be detected, and depending on that illumination intensity either the camera or the ranging sensor is used to detect the real-time distance change between the vehicle and the pedestrian. A sketch of this selection logic follows.
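A minimal sketch of that selection order is given below, assuming a hypothetical `vehicle` facade whose methods stand in for the body controller, the light sensor, the camera pipeline and the ranging sensor; the 50 lux threshold is likewise only an assumption.

```python
LIGHT_INTENSITY_THRESHOLD_LUX = 50.0       # assumed preset light intensity threshold


def detect_distance_change(vehicle):
    """Dispatch among the three detection modes in the order described above:
    electronic-key signal first, then camera or ranging sensor depending on
    the ambient illumination."""
    rssi = vehicle.read_key_rssi()                     # None if no key signal is received
    if rssi is not None:
        return vehicle.distance_change_from_rssi(rssi)

    if vehicle.read_ambient_light_lux() > LIGHT_INTENSITY_THRESHOLD_LUX:
        image = vehicle.capture_image()                # ambient light sufficient: use the camera
        return vehicle.distance_change_from_image(image)

    return vehicle.distance_change_from_ranging()      # dim environment: use the ranging sensor
```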
202. The human-vehicle interaction system controls the vehicle to output a target light effect and a target sound effect in response to the real-time distance change.
In the embodiment of the invention, the lamp controller can be used to control the vehicle's lamps to output the target light effect, and the AVAS controller can be used to control the vehicle's loudspeaker to output the target sound effect. The loudspeaker may be an exterior loudspeaker capable of playing digital audio at up to about 75 dB.
The target light effect and the target sound effect may be preset, stored light and sound effects bound to the vehicle locking instruction. Specifically, the human-vehicle interaction system can acquire the target light effect and target sound effect bound to the locking instruction through Over-the-Air (OTA) technology; or acquire, through the T-BOX, a target light effect and target sound effect customized by the user in the APP on the smartphone; or acquire a target light effect and target sound effect selected by the user from several light and sound effects presented on the vehicle-mounted central control screen controller CDU.
For example, referring to fig. 3, fig. 3 is a diagram illustrating exemplary control logic for a light effect according to an embodiment of the present invention. Fig. 3 shows the control logic for the lamps arranged at the front of the vehicle; these lamps may be formed by combining a plurality of LED light particles and may specifically be divided into a right headlight (AR1-AR21 in fig. 3), a left headlight (AL1-AL21 in fig. 3) and a middle headlight (BM1-BM48 in fig. 3). In fig. 3 the horizontal axis represents the light particles and the vertical axis represents the time sequence; each step on the vertical axis corresponds to t milliseconds, where t can be set manually (for example, 20). A blank cell in fig. 3 means the corresponding light particle is off at that instant, and the remaining cells mean it is on. Furthermore, the light particles may be lit at different brightness levels and in different colors (for example white, red and yellow). It is understood that the brightness levels shown in fig. 3 are only one possible example; in other embodiments the brightness of the light particles may also be 100%, 50%, 30% and so on, set manually, and the embodiments of the present invention are not limited in this respect. When the control logic shown in fig. 3 is executed, the user sees a light effect that gradually contracts from both ends toward the center.
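Purely to illustrate how such a timeline could be driven, the sketch below steps a row of light particles inward from both ends once every t milliseconds. The `set_particle` callback, the particle count and the one-particle-per-side-per-step pattern are assumptions and only one possible reading of fig. 3.

```python
import time


def shrink_to_center_effect(set_particle, num_particles=48, step_ms=20, brightness=1.0):
    """Illustrative timing loop for a 'shrink from both ends to the center' effect
    on a row of light particles (e.g. BM1-BM48 in fig. 3).  Each step lasts
    `step_ms` (the patent's t, e.g. 20 ms); `set_particle(i, level)` is a
    hypothetical lamp-controller callback that sets particle i to a brightness."""
    left, right = 0, num_particles - 1
    while left <= right:
        set_particle(left, brightness)
        set_particle(right, brightness)
        time.sleep(step_ms / 1000.0)
        set_particle(left, 0.0)                # extinguish before moving one step inward
        set_particle(right, 0.0)
        left, right = left + 1, right - 1


# Example with a stub driver that just logs the commands for a 6-particle strip.
shrink_to_center_effect(lambda i, level: print(f"BM{i + 1} -> {level:.0%}"), num_particles=6)
```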
Assume the light effect corresponding to the vehicle locking instruction is the one shown in fig. 3, which the human-vehicle interaction system can retrieve from the pre-stored light effects when the locking instruction is detected in step 201. Taking the light effect of fig. 3 as the target light effect, how to control the vehicle to output it in response to the real-time distance change between the vehicle and the pedestrian is described below:
as an alternative embodiment, the vehicle may be controlled to output the target light effect and the target sound effect when a change in the real-time distance between the vehicle and the pedestrian is detected. That is, when it is detected that the real-time distance change between the vehicle and the pedestrian is zero and the pedestrian is relatively stationary with respect to the vehicle, the light effect is not output. When the real-time distance change between the vehicle and the pedestrian is detected to be not zero, and the pedestrian moves relative to the vehicle, the light effect of the target lamp is triggered and output.
Further, taking t = 20 ms as an example, the time required to output the light effect shown in fig. 3 completely is at least 1.22 s. Optionally, assuming the light effect shown in fig. 3 is the target light effect, output may stop after the target light effect has been output completely once; or the target light effect may be output in a continuous loop until the number of loops exceeds a specified count, after which output stops.
As another optional implementation manner, the corresponding real-time light intensity may be determined from the real-time distance change, and the vehicle controlled to output the target light effect at that real-time light intensity. For example, at time T1 in fig. 3 and at each subsequent time step, the real-time distance between the vehicle and the pedestrian is detected to determine the real-time distance change, the corresponding real-time light intensity is determined from that change, and the corresponding light particles are lit at that intensity. Assume the initial light intensity of the target light effect is 100% and the real-time distance between the vehicle and the pedestrian detected at time T1 in fig. 3 is 0.5 m; then light particles AR1 and AL1 are lit at 100% intensity at time T1. Assume further that the real-time distance detected at time T62 in fig. 3 is 2 m; since the real-time distance has increased, the real-time light intensity decreases, for example to 30%, and light particles BM24 and BM25 are then lit at 30% intensity at time T62.
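The embodiment fixes only the direction of the change, not the exact curve; the sketch below therefore uses an assumed linear ramp from full intensity next to the vehicle down to zero at an assumed maximum distance.

```python
def real_time_light_intensity(distance_m, initial_intensity=1.0, max_distance_m=30.0):
    """Map the real-time vehicle-pedestrian distance to a light intensity in [0, 1],
    decreasing linearly so the effect dims as the pedestrian walks away.
    `max_distance_m` stands in for the farthest distance at which the vehicle can
    still receive the key's wireless signal (an assumed value)."""
    if distance_m >= max_distance_m:
        return 0.0
    return initial_intensity * (1.0 - distance_m / max_distance_m)


# Close to the vehicle the particles are lit near full intensity; the intensity
# falls off as the distance grows and reaches zero at the maximum distance.
for d in (0.5, 2.0, 15.0, 30.0):
    print(f"{d:5.1f} m -> {real_time_light_intensity(d):.0%}")
```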
In the above example the real-time light intensity decreases as the real-time distance between the vehicle and the pedestrian increases; in other words, the farther the pedestrian is from the vehicle, the dimmer the target light effect becomes, which creates for the user the visual impression that the vehicle is gradually shutting down and simulates the vehicle falling asleep. The real-time distance at which the real-time light intensity reaches 0 can be set to the maximum distance at which the vehicle can still receive the wireless signal from the electronic key. As the user walks away, the light effect gradually dims; once the distance exceeds that maximum, the lights go out, the user can see at a glance that the vehicle can no longer be remotely controlled by the electronic key, and the user must come closer if other key operations are needed.
Alternatively, the real-time light intensity may increase as the real-time distance between the vehicle and the pedestrian increases; that is, the farther the pedestrian is from the vehicle, the brighter the target light effect becomes, so that the user can still see it from farther away, which strengthens its prompting function.
It will be appreciated that controlling the vehicle to output the target sound effect in response to the real-time distance between the vehicle and the pedestrian is implemented similarly to outputting the target light effect: the target sound effect is output when the real-time distance between the vehicle and the pedestrian changes; or the real-time sound intensity corresponding to the real-time distance change is determined and the vehicle is controlled to output the target sound effect at that intensity. The possible combinations of real-time distance change, real-time light intensity and real-time sound intensity are as follows:
when the real-time distance is increased, the real-time light intensity and the real-time sound intensity are both reduced; namely, as the pedestrian is far away from the vehicle (the real-time distance is increased), the brightness of the target light effect becomes dark, the volume of the target sound effect becomes small, and therefore the effect that the vehicle stops working is created through the visual sense and the auditory sense; or when the real-time distance is reduced, the real-time light intensity and the real-time sound intensity are increased; that is, as the pedestrian approaches the vehicle (the real-time distance decreases), the brightness of the target light effect becomes strong, and the volume of the target sound effect becomes large.
When the real-time distance increases, the real-time light intensity increases and the real-time sound intensity decreases; that is, as the pedestrian moves away from the vehicle, the target light effect brightens while the target sound effect becomes quieter, which strengthens the light prompt so the user can still confirm from a distance that the vehicle is locked, while the volume of the target sound effect is reduced to limit noise pollution. This implementation is better suited to enclosed environments with dim light and strong echoes, such as underground parking lots. Or, when the real-time distance decreases, the real-time light intensity decreases and the real-time sound intensity increases; that is, as the pedestrian approaches the vehicle, the target light effect dims and the target sound effect becomes louder.
When the real-time distance increases, the real-time light intensity decreases and the real-time sound intensity increases; that is, as the pedestrian moves away from the vehicle, the target light effect dims while the target sound effect becomes louder, which strengthens the sound prompt so the user can confirm by sound that the vehicle is locked even from a distance. This implementation is better suited to open, well-lit scenes such as an outdoor parking lot in daytime. Or, when the real-time distance decreases, the real-time light intensity increases and the real-time sound intensity decreases; that is, as the pedestrian approaches the vehicle, the target light effect brightens and the target sound effect becomes quieter.
When the real-time distance increases, the real-time light intensity and the real-time sound intensity both increase; that is, as the pedestrian moves away from the vehicle, the target light effect brightens and the target sound effect becomes louder, so the light and sound prompts are strengthened together and the user can combine them to confirm from a distance that the vehicle is locked. Or, when the real-time distance decreases, both the real-time light intensity and the real-time sound intensity decrease; that is, as the pedestrian approaches the vehicle, the target light effect dims and the target sound effect also becomes quieter.
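The four pairings above can be summarized as a direction of change per channel. The sketch below encodes them under assumed names and an assumed linear ramp, since the embodiment again fixes only the direction of change, not the curve.

```python
# Direction of change as the pedestrian-vehicle distance grows:
# +1 means the intensity increases with distance, -1 means it decreases.
STRATEGIES = {
    "sleep":            (-1, -1),  # both fade out, e.g. the "falling asleep" effect
    "quiet_garage":     (+1, -1),  # stronger light, quieter sound (underground parking lot)
    "open_lot_daytime": (-1, +1),  # dimmer light, louder sound (bright open space)
    "long_range":       (+1, +1),  # both grow, maximum prompting at a distance
}


def intensities(distance_m, strategy="sleep", max_distance_m=30.0):
    """Return (light_intensity, sound_intensity), each in [0, 1], for the given
    distance under the chosen strategy; linear ramps are an assumption."""
    ratio = min(max(distance_m / max_distance_m, 0.0), 1.0)
    light_dir, sound_dir = STRATEGIES[strategy]
    light = ratio if light_dir > 0 else 1.0 - ratio
    sound = ratio if sound_dir > 0 else 1.0 - ratio
    return light, sound


light, sound = intensities(24.0, "quiet_garage")
print(f"light {light:.0%}, sound {sound:.0%}")     # light 80%, sound 20%
```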
In summary, referring to fig. 1, based on the human-vehicle interaction system shown in fig. 1, a vehicle locking instruction triggered by the user through the electronic key can be detected by the body controller BCM, or a locking instruction triggered by the user through the APP on a smartphone can be received via the T-BOX. The BCM can then detect the real-time distance between the vehicle and the pedestrian from the received strength of the wireless signal emitted by the electronic key; or the camera transmits the captured images to the in-vehicle intelligent unit, which identifies the real-time distance between the vehicle and the pedestrian from the images. When a locking instruction and a real-time distance change between the vehicle and the pedestrian are detected, the BCM can, according to preset signal logic, send the control logic corresponding to the target light effect and the target sound effect directly to the lamp controller and the AVAS controller over the BCAN, whereupon the lamp controller outputs the target light effect and the AVAS controller outputs the target sound effect. Alternatively, the in-vehicle intelligent unit transmits the identified real-time distance to the vehicle-mounted central control screen controller CDU, the CDU determines the control logic corresponding to the target light effect and the target sound effect from that distance and sends it over the ICAN, through the central gateway CGW, and over the BCAN to the lamp controller and the AVAS controller, whereupon the lamp controller outputs the target light effect and the AVAS controller outputs the target sound effect.
It can therefore be seen that by implementing the method described in fig. 2, the light and sound effects output by the vehicle change with the real-time distance between the vehicle and the pedestrian, so the user is prompted that the vehicle is locked by continuous light and sound effects while walking away, with no need to confirm again; the operations a user performs to confirm locking are reduced and the user experience is improved. In addition, because the light and sound effects change with distance, they give the user a more interactive experience and make the vehicle's lights and sounds feel more intelligent and entertaining, which can improve user stickiness.
Example two
Referring to fig. 4, fig. 4 is a schematic flow chart of another human-vehicle interaction method during vehicle locking according to an embodiment of the present invention. As shown in fig. 4, the human-vehicle interaction method when the vehicle is locked may include:
401. the human-vehicle interaction system detects real-time distance change between a vehicle and a pedestrian when detecting a vehicle locking instruction.
402. The human-vehicle interaction system judges whether the vehicle meets a preset operating condition; if so, step 403 is executed, and if not, the flow ends.
In the embodiment of the present invention, the preset operating conditions may include, but are not limited to: 1. the vehicle is switched off and stationary; 2. the vehicle is in P gear and not in a start-ready state, the ignition is off, and basic lighting functions of the vehicle such as the daytime running lights or position lights are monitored to be off; 3. the vehicle is in P gear and in a start-ready state, and the BCM turns off basic lighting functions such as the daytime running lights and position lights.
It is understood that in the embodiment of the present invention, the basic lighting functions such as the daytime running lights or position lights and the target light effect can be output by the same set of lamps, for example through a through-type light strip (as shown in fig. 5). The through-type light strip can be arranged at the front of the vehicle, at the rear, or at any other position on the outside of the body; the embodiments of the present invention are not limited in this respect. Taking a through-type light strip arranged at the front of the vehicle as an example, the left front and left rear lamps lit simultaneously in white can serve as daytime running lights, and the same lamps lit simultaneously in yellow can serve as position lights.
In the embodiment of the invention, the vehicle continues to output the target light effect only while one of the above operating conditions is met; in other words, the basic lighting functions take priority over the target light effect, which keeps the basic signaling functions of the vehicle lights working normally and ensures driving safety. Accordingly, step 402 can monitor whether a basic lighting function of the vehicle is on. If a basic lighting function is on and the vehicle is detected to be stopped in P gear and in a start-ready state (i.e. the control system is working), the body controller BCM can turn that lighting function off; if a basic lighting function is on but the vehicle's control system is not working and the function cannot be turned off, the flow ends. In addition, as follows from the operating conditions above, if the vehicle is detected to be in any of the R (reverse), D (drive) or N (neutral) gears, the flow ends and the vehicle does not output the target light effect. Further, while steps 403 to 406 below are being executed, the human-vehicle interaction system may interrupt the output of the target light effect if it detects that the vehicle no longer meets any of the operating conditions.
403. The human-vehicle interaction system judges whether the current date is a preset specific date; if so, step 404 is executed, and if not, step 405 is executed.
It is understood that the foregoing steps 402 and 403 have no necessary precedence in logic, and in other possible embodiments, the step 403 may be executed first and then the step 402 may be executed.
404. The human-vehicle interaction system determines the light effect corresponding to the specific date as the target light effect and performs step 406.
In the embodiment of the invention, the specific date can be set by the operator of the human-vehicle interaction system, for example a car-purchase anniversary or a public holiday such as Spring Festival or Christmas; it can also be set by the user in the APP on a mobile device such as a smartphone and synchronized to the human-vehicle interaction system by the APP's backend server, for example a wedding anniversary or a birthday. If the current date is such a specific date, then when the vehicle locking instruction is detected the vehicle can be controlled to output a special light effect (the light effect corresponding to the specific date) that differs from the usual locking light effect, giving the user a pleasant surprise. Moreover, different specific dates can have different light effects, providing different interactive experiences on different dates. The light effects corresponding to specific dates can be updated over the network, for example downloaded from the backend server to the vehicle in advance via OTA.
405. The human-vehicle interaction system determines the breathing light effect as the target light effect and executes step 406.
In the embodiment of the present invention, the breathing light effect is the light effect corresponding to the vehicle locking instruction, i.e. the usual light effect when the vehicle is locked. If step 403 determines that the current date is not a specific date, this usual light effect is output as normal. It is to be understood that in other embodiments the light effect corresponding to the locking instruction may also be, for example, a flashing light or a marquee (flowing-water) light effect.
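A minimal sketch of the selection in steps 403 to 405 follows, with hypothetical effect identifiers and example dates; in the real system these would be pre-stored resources delivered via OTA or configured through the APP.

```python
from datetime import date

# Hypothetical identifiers for pre-stored light/sound effect resources.
SPECIAL_DATE_EFFECTS = {
    (12, 25): ("christmas_light_effect", "christmas_sound_effect"),
    (6, 1):   ("purchase_anniversary_light_effect", "purchase_anniversary_sound_effect"),
}
DEFAULT_LOCK_EFFECTS = ("breathing_light_effect", "lock_sound_effect")


def select_target_effects(today=None):
    """Steps 403-405 as a sketch: on a preset specific date use that date's effects,
    otherwise fall back to the effects bound to the vehicle locking instruction."""
    today = today or date.today()
    return SPECIAL_DATE_EFFECTS.get((today.month, today.day), DEFAULT_LOCK_EFFECTS)


print(select_target_effects(date(2019, 12, 25)))   # special-date effects
print(select_target_effects(date(2019, 4, 3)))     # breathing light + lock sound effect
```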
406. The human-vehicle interaction system controls the through-type light strip to output the target light effect at the real-time light intensity corresponding to the real-time distance change.
In the embodiment of the present invention, if through-type light strips are arranged both at the front and at the rear of the vehicle, the human-vehicle interaction system may control only the front strip, only the rear strip, or both strips simultaneously to output the target light effect.
In addition, in the embodiment of the invention, the human-vehicle interaction system adjusts the real-time light intensity of the target light effect according to the real-time distance change between the vehicle and the pedestrian. As an alternative embodiment, the real-time light intensity may be reduced as the real-time distance between the vehicle and the pedestrian increases. For example, if the target light effect is the breathing light effect, the real-time light intensity can be used as the maximum brightness output by the through-type light strip in each brightness-variation period of the breathing light. Suppose the currently set brightness-variation period is 1 s, the real-time distance detected at the initial time T1 is 0.5 m, and the maximum brightness output by the breathing light is 100%: during this period the brightness of the LED light particles of the through-type light strip gradually rises from 0 to 100% intensity and then falls back from 100% to 0. If the real-time distance detected at time T1 + 1 s, when that period ends, is 1 m, the distance has increased and the real-time light intensity decreases, so the maximum brightness of the breathing light can be reduced to, say, 80%; in the second period the brightness of the LED light particles rises from 0 to 80% intensity and then falls back from 80% to 0. This continues until the maximum brightness output by the through-type light strip in some period is 0, at which point the LED light particles are completely extinguished and the light effect ends. In this example the breathing light personifies the vehicle through its lighting, and as the user walks away the breathing light gradually dims, creating the visual effect of the vehicle gradually falling asleep and making the light effect more engaging.
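The breathing behavior just described can be sketched as follows; the distance and lamp-strip callbacks, the cycle timing and the linear fade of the per-cycle peak are all assumptions for illustration.

```python
import time


def breathing_lock_effect(get_distance_m, set_strip_brightness,
                          cycle_s=1.0, steps_per_half=10, max_distance_m=30.0):
    """Sketch of the breathing light described above: each cycle ramps the strip
    up to a peak and back down, and the peak shrinks as the pedestrian walks
    away until it reaches 0 and the effect ends.  `get_distance_m` and
    `set_strip_brightness` are hypothetical callbacks into the detection unit
    and the lamp controller."""
    dwell = cycle_s / (2 * steps_per_half)
    while True:
        peak = max(0.0, 1.0 - get_distance_m() / max_distance_m)   # assumed linear fade
        if peak <= 0.0:
            set_strip_brightness(0.0)          # all light particles off, effect ends
            return
        levels = [peak * i / steps_per_half for i in range(1, steps_per_half + 1)]
        for level in levels + levels[::-1]:    # rise to the peak, then fall back toward 0
            set_strip_brightness(level)
            time.sleep(dwell)


# Example with mock callbacks: the pedestrian walks 3 m farther away per cycle.
walked = iter(range(0, 33, 3))
breathing_lock_effect(lambda: next(walked), lambda b: print(f"strip at {b:.0%}"),
                      cycle_s=0.02, steps_per_half=2)
```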
It is understood that steps 404 to 406 above show only one possible implementation of outputting the target light effect. Correspondingly, the vehicle can output the target sound effect at the same time as the target light effect. That is, if step 403 determines that the current date is a specific date, the sound effect corresponding to that date can also be determined as the target sound effect; if it determines that the current date is not a specific date, the sound effect corresponding to the vehicle locking instruction can be determined as the target sound effect; the human-vehicle interaction system then outputs the target sound effect in response to the real-time distance change between the vehicle and the pedestrian. The target sound effect can complement the visual impression created by the target light effect. Taking the breathing light effect as an example, the target sound effect can be an anthropomorphic snoring sound, with the real-time light intensity decreasing and the real-time sound intensity increasing as the real-time distance between the vehicle and the pedestrian increases: as the user walks away, the breathing light gradually dims while the snoring played by the loudspeaker gradually becomes louder, conveying both visually and audibly the effect of the vehicle falling asleep.
In addition, as an alternative embodiment, after step 406 is executed, if the human-vehicle interaction system detects that the real-time distance change between the vehicle and the pedestrian indicates that the distance between the pedestrian and the vehicle is decreased, the human-vehicle interaction system may control the vehicle to stop outputting the target light effect and the target sound effect. That is, after the user locks the vehicle, the vehicle outputs the light effect and the sound effect in response to the vehicle locking instruction, and at this time, if the user returns to the vehicle from far to near again, the vehicle interrupts the output of the light effect and the sound effect.
It can be seen that in the method described in fig. 4, the human-vehicle interaction system can change the light effect and the sound effect according to the real-time distance change between the vehicle and the pedestrian so as to achieve the effect of continuous prompt. Furthermore, in the method described in fig. 4, before outputting the light effect and the sound effect, the human-vehicle interaction system first determines whether the vehicle meets a preset working condition, so as to ensure the basic lighting function and the driving safety; in addition, special light effects different from conventional light effects can be output according to specific dates, surprise is provided for users, and interactivity of a human-vehicle interaction system is enhanced.
Example three
Referring to fig. 6, fig. 6 is a schematic structural diagram of another human-vehicle interaction system disclosed in the embodiment of the present invention. As shown in fig. 6, the human-vehicle interaction system may include:
the detecting unit 601 is configured to detect a real-time distance change between the vehicle and the pedestrian when the vehicle locking instruction is detected.
The manner in which the detection unit 601 detects the vehicle locking instruction may include, but is not limited to: 1. detecting a vehicle locking instruction triggered by the user pressing the lock button of the electronic key; 2. detecting a vehicle locking instruction triggered through an APP installed on a mobile device such as a smartphone or tablet; 3. when it is detected that the vehicle has been shut off and has entered the parking state (P gear engaged), detecting that the user closes the vehicle door, which automatically triggers a vehicle locking instruction.
In addition, the way that the detection unit 601 detects the real-time distance change between the vehicle and the pedestrian may specifically be:
detecting a wireless signal sent by an electronic key of the vehicle, and determining the real-time distance change between the vehicle and the pedestrian according to the strength of the wireless signal;
or controlling a camera of the vehicle to shoot an image, and identifying real-time distance change between the vehicle and the pedestrian according to the image;
alternatively, real-time distance changes between the vehicle and the pedestrian are detected using a ranging sensor of the vehicle.
Further, the detection unit 601 may also combine the above three specific embodiments to detect the real-time distance change between the vehicle and the pedestrian. A specific implementation may be as follows:
detecting the illumination intensity of the environment where the vehicle is located, and judging whether the detected illumination intensity is higher than a preset light intensity threshold value; if the illumination intensity is higher than the threshold value, controlling a camera of the vehicle to capture an image, and identifying the real-time distance change between the vehicle and the pedestrian from the image (detection by camera); if not, detecting the real-time distance change between the vehicle and the pedestrian by using a ranging sensor of the vehicle (detection by ranging sensor);
alternatively, detecting whether a wireless signal sent by the electronic key is received; if the signal is received, determining the real-time distance change between the vehicle and the pedestrian according to the strength of the wireless signal; if not, detecting the illumination intensity of the environment where the vehicle is located, and deciding according to the illumination intensity whether the camera or the ranging sensor is used to detect the real-time distance change between the vehicle and the pedestrian.
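For illustration only, the Python sketch below combines the above strategies into a single selection cascade: prefer the electronic-key signal, then choose between the camera and the ranging sensor according to ambient illumination. The helper callables, the 50 lux threshold, and the RSSI-to-distance conversion are assumptions introduced for the example and are not specified in the patent.

```python
from typing import Callable, Optional

LIGHT_INTENSITY_THRESHOLD_LUX = 50.0   # assumed preset light-intensity threshold

def rssi_to_distance(rssi_dbm: float) -> float:
    """Placeholder path-loss conversion; the patent only states that distance is
    derived from the wireless signal strength, not the exact formula."""
    return max(0.0, (-40.0 - rssi_dbm) / 20.0)

def real_time_distance(
    read_key_rssi: Callable[[], Optional[float]],      # returns None if no key signal is received
    read_illumination_lux: Callable[[], float],
    distance_from_camera: Callable[[], float],
    distance_from_ranging_sensor: Callable[[], float],
) -> float:
    """Combined detection strategy for the real-time distance between vehicle and pedestrian."""
    rssi = read_key_rssi()
    if rssi is not None:
        return rssi_to_distance(rssi)                  # electronic-key signal available
    if read_illumination_lux() > LIGHT_INTENSITY_THRESHOLD_LUX:
        return distance_from_camera()                  # bright enough for image-based detection
    return distance_from_ranging_sensor()              # too dark: fall back to the ranging sensor
```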
A first control unit 602 for controlling the vehicle to output a target light effect and a target sound effect in response to the real-time distance change.
As an alternative embodiment, the way that the first control unit 602 controls the vehicle to output the target light effect and the target sound effect in response to the real-time distance change may specifically be:
a first control unit 602, configured to control the vehicle to output a target light effect and a target sound effect when it is detected that there is a change in the real-time distance between the vehicle and the pedestrian.
As another alternative, the first control unit 602 may also specifically include the following sub-units to control the vehicle to output the target lamp light effect and the target sound effect in response to the real-time distance change:
a first determining subunit 6021, configured to determine a real-time light intensity corresponding to the real-time distance change and a real-time sound intensity corresponding to the real-time distance change;
a first control subunit 6022, configured to control the vehicle to output the target light effect according to the real-time light intensity determined by the first determining subunit 6021;
a second control subunit 6023 for controlling the vehicle to output the target sound effect in accordance with the real-time sound intensity determined by the first determination subunit 6021;
wherein, when the real-time distance is increased, the real-time light intensity and the real-time sound intensity are both reduced; or,
when the real-time distance is increased, the real-time light intensity is increased and the real-time sound intensity is reduced; or,
when the real-time distance is increased, the real-time light intensity is reduced and the real-time sound intensity is increased; or,
when the real-time distance is increased, the real-time light intensity and the real-time sound intensity are both increased; or,
when the real-time distance is reduced, the real-time light intensity and the real-time sound intensity are both increased; or,
when the real-time distance is reduced, the real-time light intensity is reduced and the real-time sound intensity is increased; or,
when the real-time distance is reduced, the real-time light intensity is increased and the real-time sound intensity is reduced; or,
when the real-time distance is reduced, the real-time light intensity and the real-time sound intensity are both reduced.
In addition, after the first control unit 602 has output the target light effect and the target sound effect once in full, the first control unit 602 may stop outputting them; alternatively, the first control unit 602 may stop outputting once the target light effect and the target sound effect have been cyclically output more than a specified number of times.
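The stop conditions above (a capped number of replays, plus interruption when the pedestrian approaches again, as described further below) could be combined as in the following sketch; MAX_CYCLES, get_distance and play_one_cycle are placeholders invented for the example.

```python
MAX_CYCLES = 3   # assumed cap on how many times the light + sound effect is replayed

def play_lock_effects(get_distance, play_one_cycle):
    """play_one_cycle(distance) is assumed to output one full light + sound cycle."""
    last_distance = get_distance()
    for _ in range(MAX_CYCLES):
        distance = get_distance()
        if distance < last_distance:   # pedestrian approaching again: interrupt the output
            break
        play_one_cycle(distance)
        last_distance = distance
```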
By implementing the human-vehicle interaction system shown in fig. 6, the light effect and the sound effect output by the vehicle change with the real-time distance between the vehicle and the pedestrian, so that the user is prompted, through the continuous light effect and sound effect, that the vehicle is locked as the user walks away from it. The user does not need to confirm again, which reduces the operations required to confirm that the vehicle is locked and improves the user experience. In addition, because the light effect and the sound effect vary with distance, the interaction feels richer, and the vehicle's light and sound effects become more intelligent and entertaining, thereby improving user stickiness.
Example four
Referring to fig. 7, fig. 7 is a schematic structural diagram of another human-vehicle interaction system according to an embodiment of the present invention. The human-vehicle interaction system shown in fig. 7 is obtained by optimizing the human-vehicle interaction system shown in fig. 6. In the human-vehicle interaction system shown in fig. 7:
the first control subunit 6022 is specifically configured to control the through-type light strip to output a target light effect according to the real-time light intensity corresponding to the real-time distance change; for example, assuming that the target lamp light effect is a breathing lamp effect, in each brightness change period corresponding to the breathing lamp effect, the maximum brightness output by the penetrating lamp strip is the real-time light intensity determined according to the real-time distance change between the vehicle and the pedestrian. It can be understood that the target light effect may also include light effects such as flashing, running light, and the like, and the embodiment of the present invention is not limited.
In an embodiment of the invention, the through-type light strip outputs light effects under the control of the first control subunit 6022. The through-type light strip can be arranged at any position on the outside of the vehicle body. Taking the case in which through-type light strips are arranged at the front and at the rear of the vehicle as an example, the first control subunit 6022 may control only the light strip at the front to output the target light effect, may control only the light strip at the rear to output the target light effect, or may control both light strips to output the target light effect at the same time.
Optionally, in the human-vehicle interaction system shown in fig. 7, the human-vehicle interaction system may further include:
a second judging unit 604, configured to judge whether the current date is a preset specific date;
an effect determination unit 605, configured to determine the light effect corresponding to the specific date as the target light effect and the sound effect corresponding to the specific date as the target sound effect when the second judging unit 604 judges that the current date is the specific date; and to determine the light effect corresponding to the vehicle locking instruction as the target light effect and the sound effect corresponding to the vehicle locking instruction as the target sound effect when the second judging unit 604 judges that the current date is not the specific date. That is, the target light effect and the target sound effect that the first control unit 602 controls the vehicle to output are determined by the effect determination unit 605.
The light effect corresponding to the specific date is different from the light effect corresponding to the vehicle locking instruction, and the sound effect corresponding to the specific date is different from the sound effect corresponding to the vehicle locking instruction.
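As an illustrative sketch of this date-based selection, the lookup below returns the effect pair for the current date; the calendar entries and effect names are placeholders and are not disclosed in the patent.

```python
import datetime

# Placeholder calendar of preset "specific dates" mapped to special effect assets.
SPECIAL_DATE_EFFECTS = {
    (1, 1):   ("new_year_light_show", "new_year_chime"),
    (12, 25): ("holiday_light_show", "holiday_melody"),
}
DEFAULT_LOCK_EFFECTS = ("lock_breathing_light", "lock_snore_sound")

def select_target_effects(today=None):
    """Return (target_light_effect, target_sound_effect) for the given or current date."""
    today = today or datetime.date.today()
    return SPECIAL_DATE_EFFECTS.get((today.month, today.day), DEFAULT_LOCK_EFFECTS)
```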
Further optionally, the human-vehicle interaction system shown in fig. 7 may further include:
a second control unit 603, configured to control the vehicle to stop outputting the target light effect and the target sound effect if, after the first control unit 602 controls the vehicle to output the target light effect and the target sound effect in response to the real-time distance change, the real-time distance change detected by the detection unit 601 indicates that the distance between the pedestrian and the vehicle is decreasing. That is, the second control unit 603 may interrupt the light effect and the sound effect output by the first control unit 602 when the user turns around and approaches the vehicle again.
Therefore, by implementing the human-vehicle interaction system shown in fig. 7, the light effect and the sound effect can be varied according to the real-time distance change between the vehicle and the pedestrian, achieving continuous prompting and reducing the operations required for the user to confirm that the vehicle is locked. Furthermore, before the light effect and the sound effect are output, whether the vehicle meets a preset working condition can first be judged, safeguarding the basic lighting function and driving safety; in addition, special light effects different from the conventional light effect can be output on specific dates, offering a pleasant surprise to the user and enhancing the interactivity of the human-vehicle interaction system.
The embodiment of the invention discloses a computer-readable storage medium storing a computer program, wherein the computer program causes a computer to execute the human-vehicle interaction method during vehicle locking shown in fig. 2 or fig. 4.
The embodiment of the invention discloses a computer program product comprising a non-transitory computer-readable storage medium storing a computer program, wherein the computer program is operable to cause a computer to execute the human-vehicle interaction method during vehicle locking shown in fig. 2 or fig. 4.
The embodiment of the invention discloses a vehicle, which comprises a human-vehicle interaction system shown in any one of figure 1, figure 5 or figure 6.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification do not necessarily all refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Those skilled in the art should also appreciate that the embodiments described in this specification are exemplary embodiments, and that the acts and modules involved are not necessarily required to practice the invention.
In various embodiments of the present invention, it should be understood that the sequence numbers of the above-mentioned processes do not imply an inevitable order of execution, and the execution order of the processes should be determined by their functions and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated units, if implemented as software functional units and sold or used as stand-alone products, may be stored in a computer-accessible memory. Based on such understanding, the technical solution of the present invention in essence, or the part thereof contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like, and may specifically be a processor in the computer device) to execute all or part of the steps of the methods of the embodiments of the present invention.
It will be understood by those skilled in the art that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing the relevant hardware, and the program may be stored in a computer-readable storage medium. The storage medium includes Read-Only Memory (ROM), Random Access Memory (RAM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), One-time Programmable Read-Only Memory (OTPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM) or other optical disc memory, magnetic disk memory, magnetic tape memory, or any other computer-readable medium that can be used to carry or store data.
The human-vehicle interaction method and system during vehicle locking and the vehicle disclosed in the embodiments of the present invention are described above in detail. Specific examples are used herein to explain the principles and implementations of the present invention, and the description of the above embodiments is intended only to help in understanding the method and its core ideas. Meanwhile, a person skilled in the art may, following the ideas of the present invention, vary the specific embodiments and the application scope; in summary, the content of this specification should not be construed as limiting the present invention.

Claims (11)

1. A human-vehicle interaction method during vehicle locking is characterized by comprising the following steps:
detecting real-time distance change between a vehicle and a pedestrian when a vehicle locking command is detected;
determining real-time light intensity corresponding to the real-time distance change and real-time sound intensity corresponding to the real-time distance change;
controlling the vehicle to output a target light effect according to the real-time light intensity;
controlling the vehicle to output a target sound effect according to the real-time sound intensity;
wherein,
when the real-time distance is increased, the real-time light intensity and the real-time sound intensity are both reduced; or,
when the real-time distance is increased, the real-time light intensity is increased and the real-time sound intensity is reduced; or,
when the real-time distance is increased, the real-time light intensity is decreased and the real-time sound intensity is increased; or,
when the real-time distance is increased, the real-time light intensity and the real-time sound intensity are increased; or,
when the real-time distance is reduced, the real-time light intensity and the real-time sound intensity are increased; or,
when the real-time distance is decreased, the real-time light intensity is decreased and the real-time sound intensity is increased; or,
when the real-time distance is decreased, the real-time light intensity is increased and the real-time sound intensity is decreased; or,
when the real-time distance is decreased, both the real-time light intensity and the real-time sound intensity are decreased.
2. The method of claim 1, wherein the front of the vehicle is provided with a through-type light strip, and the controlling the vehicle to output a target light effect according to the real-time light intensity comprises:
controlling the through-type light strip to output the target light effect according to the real-time light intensity.
3. The method of claim 1, wherein prior to the controlling the vehicle to output a target light effect and a target sound effect in response to the real-time distance change, the method further comprises:
judging whether the current date is a preset specific date or not;
if yes, determining the light effect corresponding to the specific date as a target light effect, and determining the sound effect corresponding to the specific date as a target sound effect;
if not, determining the light effect corresponding to the vehicle locking instruction as a target light effect, and determining the sound effect corresponding to the vehicle locking instruction as a target sound effect;
the light effect corresponding to the specific date is different from the light effect corresponding to the vehicle locking instruction; the sound effect corresponding to the specific date is different from the sound effect corresponding to the vehicle locking instruction.
4. The method of claim 1, wherein detecting a real-time distance change between a vehicle and a pedestrian comprises:
detecting a wireless signal transmitted by an electronic key of a vehicle;
determining real-time distance change between the vehicle and the pedestrian according to the strength of the wireless signal;
alternatively, the detecting a real-time distance change between the vehicle and the pedestrian includes:
detecting the illumination intensity of the environment where the vehicle is located, and judging whether the illumination intensity is higher than a preset light intensity threshold value or not;
if the illumination intensity is higher than the light intensity threshold value, controlling a camera of the vehicle to capture an image, and identifying the real-time distance change between the vehicle and the pedestrian from the image;
and if the illumination intensity is not higher than the light intensity threshold value, detecting the real-time distance change between the vehicle and the pedestrian by using a ranging sensor of the vehicle.
5. The method of claim 1, wherein after the controlling the vehicle to output a target light effect and a target sound effect in response to the real-time distance change, the method further comprises:
and if the real-time distance change indicates that the distance between the pedestrian and the vehicle is reduced, controlling the vehicle to stop outputting the target light effect and the target sound effect.
6. A human-vehicle interaction system, comprising:
the detection unit is used for detecting the real-time distance change between the vehicle and the pedestrian when the vehicle locking instruction is detected;
a first control unit for controlling the vehicle to output a target light effect and a target sound effect in response to the real-time distance change;
the first control unit is specifically configured to control the vehicle to output a target light effect and a target sound effect when it is detected that the real-time distance between the vehicle and the pedestrian changes;
the first control unit includes:
the first determining subunit is used for determining the real-time lamplight intensity corresponding to the real-time distance change and the real-time sound intensity corresponding to the real-time distance change;
the first control subunit is used for controlling the vehicle to output a target lamp light effect according to the real-time light intensity;
the second control subunit is used for controlling the vehicle to output a target sound effect according to the real-time sound intensity;
when the real-time distance is increased, the real-time light intensity and the real-time sound intensity are both reduced; or,
when the real-time distance is increased, the real-time light intensity is increased and the real-time sound intensity is reduced; or,
when the real-time distance is increased, the real-time light intensity is decreased and the real-time sound intensity is increased; or,
when the real-time distance is increased, the real-time light intensity and the real-time sound intensity are increased; or,
when the real-time distance is reduced, the real-time light intensity and the real-time sound intensity are increased; or,
when the real-time distance is decreased, the real-time light intensity is decreased and the real-time sound intensity is increased; or,
when the real-time distance is decreased, the real-time light intensity is increased and the real-time sound intensity is decreased; or,
when the real-time distance is decreased, both the real-time light intensity and the real-time sound intensity are decreased.
7. The system of claim 6, wherein the front of the vehicle is provided with a through-type light strip;
and the first control subunit is specifically configured to control the through-type light strip to output the target light effect according to the real-time light intensity.
8. The system of claim 6, further comprising:
the second judging unit is used for judging whether the current date is a preset specific date or not;
an effect determination unit configured to determine, when the second determination unit determines that the current date is the specific date, a light effect corresponding to the specific date as a target light effect, and determine a sound effect corresponding to the specific date as a target sound effect; when the second judging unit judges that the current date is not the specific date, determining the light effect corresponding to the vehicle locking instruction as a target light effect and determining the sound effect corresponding to the vehicle locking instruction as a target sound effect;
the light effect corresponding to the specific date is different from the light effect corresponding to the vehicle locking instruction; the sound effect corresponding to the specific date is different from the sound effect corresponding to the vehicle locking instruction.
9. The system according to claim 6, wherein the detection unit is configured to detect the change in the real-time distance between the vehicle and the pedestrian by:
the detection unit is used for detecting a wireless signal sent by an electronic key of a vehicle and determining real-time distance change between the vehicle and a pedestrian according to the intensity of the detected wireless signal;
or the detection unit is used for detecting the illumination intensity of the environment where the vehicle is located and judging whether the illumination intensity is higher than a preset light intensity threshold value or not; when the illumination intensity is judged to be higher than the light intensity threshold value, controlling a camera of the vehicle to shoot an image, and identifying real-time distance change between the vehicle and the pedestrian according to the image; and when the illumination intensity is judged not to be higher than the light intensity threshold value, detecting the real-time distance change between the vehicle and the pedestrian by using the distance measuring sensor of the vehicle.
10. The system of claim 6, further comprising:
a second control unit for controlling the vehicle to stop outputting the target light effect and the target sound effect if the real-time distance change indicates that the distance between the pedestrian and the vehicle decreases after controlling the vehicle to output the target light effect and the target sound effect in response to the real-time distance change.
11. A vehicle, characterized in that the vehicle comprises a human-vehicle interaction system according to any one of claims 6 to 10.
CN201910267504.2A 2019-04-03 2019-04-03 Human-vehicle interaction method and system during vehicle locking and vehicle Active CN109969125B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910267504.2A CN109969125B (en) 2019-04-03 2019-04-03 Human-vehicle interaction method and system during vehicle locking and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910267504.2A CN109969125B (en) 2019-04-03 2019-04-03 Human-vehicle interaction method and system during vehicle locking and vehicle

Publications (2)

Publication Number Publication Date
CN109969125A CN109969125A (en) 2019-07-05
CN109969125B true CN109969125B (en) 2020-12-15

Family

ID=67082929

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910267504.2A Active CN109969125B (en) 2019-04-03 2019-04-03 Human-vehicle interaction method and system during vehicle locking and vehicle

Country Status (1)

Country Link
CN (1) CN109969125B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109819563B (en) * 2019-01-24 2021-10-26 吉利汽车研究院(宁波)有限公司 Vehicle lamp control method, device and equipment
CN112615983A (en) * 2020-12-09 2021-04-06 广州橙行智动汽车科技有限公司 Vehicle locking method and device, vehicle and storage medium
CN113570952A (en) * 2021-06-28 2021-10-29 深圳市越疆科技有限公司 Portable display device based on electronic skin

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204775126U (en) * 2015-07-14 2015-11-18 安徽江淮汽车股份有限公司 Automobile lock car reminder system
CN107235004A (en) * 2017-05-11 2017-10-10 汪超 The method and automobile of a kind of sound volume of automobile horn control
CN107580279A (en) * 2017-08-10 2018-01-12 深圳腾芈技术服务有限公司 Vehicle whistle control method, device and computer-readable recording medium

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3630712B2 (en) * 1994-02-03 2005-03-23 キヤノン株式会社 Gesture input method and apparatus
JP4600444B2 (en) * 2007-07-17 2010-12-15 株式会社デンソー Voice guidance system
US7827835B2 (en) * 2007-08-23 2010-11-09 Eduardo F. D'Angelo Universal steering wheel lock
JP5012546B2 (en) * 2008-02-06 2012-08-29 トヨタ自動車株式会社 Operating device
CN202364181U (en) * 2011-11-28 2012-08-01 浙江吉利汽车研究院有限公司 Automobile horn volume control device
US9218698B2 (en) * 2012-03-14 2015-12-22 Autoconnect Holdings Llc Vehicle damage detection and indication
CN203766708U (en) * 2014-03-13 2014-08-13 深圳市粤霖伟业科技有限公司 Intelligent automobile door opening theft prevention alarm system
KR20160028916A (en) * 2014-09-04 2016-03-14 김서희 car's multi-key
US9597971B2 (en) * 2014-09-16 2017-03-21 Qualcomm Incorporated Methods and systems for compatible operation between a wireless power transfer system and vehicle remote entry systems
CN106846563B (en) * 2016-12-28 2020-02-07 智车优行科技(北京)有限公司 Vehicle control method and system, vehicle key, vehicle and vehicle system
CN107235001A (en) * 2017-06-20 2017-10-10 同致电子科技(昆山)有限公司 It is a kind of that vehicle lamp brightness system is automatically adjusted based on microwave radar

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204775126U (en) * 2015-07-14 2015-11-18 安徽江淮汽车股份有限公司 Automobile lock car reminder system
CN107235004A (en) * 2017-05-11 2017-10-10 汪超 The method and automobile of a kind of sound volume of automobile horn control
CN107580279A (en) * 2017-08-10 2018-01-12 深圳腾芈技术服务有限公司 Vehicle whistle control method, device and computer-readable recording medium

Also Published As

Publication number Publication date
CN109969125A (en) 2019-07-05

Similar Documents

Publication Publication Date Title
CN110053583B (en) Human-vehicle interaction method and system during vehicle unlocking and vehicle
CN109969125B (en) Human-vehicle interaction method and system during vehicle locking and vehicle
EP3515736B1 (en) Vehicle control device mounted on vehicle and method for controlling the vehicle
CN110015308B (en) Human-vehicle interaction method and system and vehicle
US10787177B2 (en) Exterior rearview device with illumination functions and autonomous vehicles using same
CN110085022B (en) Method and system for interaction between motorcade vehicles and vehicle
EP3456598B1 (en) Vehicle control device
US10160380B1 (en) Enhanced informational vehicle puddle lamp assemblies
US10793004B2 (en) Vehicle control device and vehicle including the same
CN106415686A (en) Trainable transceiver and camera systems and methods
US10723363B2 (en) Vehicle control device and vehicle equipped with vehicle control device
EP3597502A2 (en) Vehicle control device
CN109941182B (en) Lamp effect control method applied to automobile, automobile lamp system and automobile
US20210362597A1 (en) Vehicle control device and vehicle including the same
CN104823122A (en) Imaging system and method for detecting bright city condition
JP2015521136A (en) System and method for controlling a vehicle device responsive to multi-stage settlement detection
CN110641358A (en) Vehicle, vehicle-mounted equipment and vehicle-mounted light control method based on scene information
CN109969075B (en) Interaction method and system between vehicles and vehicle
CN110001510B (en) Vehicle and pedestrian interaction method and system and vehicle
CN108068688A (en) Vehicle-mounted light source Automatic adjustment method, device, system and electronic equipment
CN112002140A (en) Vehicle searching method, device, medium, vehicle-mounted terminal and mobile terminal for parking lot
CN114537269A (en) Control method for light in automobile and automobile
CN116963938A (en) Motor vehicle comprising at least one lighting device and at least one acoustic device, and method for operating a motor vehicle
CN112835351B (en) Control method and related equipment
CN110450846A (en) A kind of signal based reminding method, device and the electric car of electric car

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CB02 Change of applicant information

Address after: 510000 No.8 Songgang street, Cencun, Tianhe District, Guangzhou City, Guangdong Province

Applicant after: GUANGZHOU XPENG AUTOMOBILE TECHNOLOGY Co.,Ltd.

Address before: 510555 245, room nine, Jianshe Road 333, Guangzhou knowledge city, Guangzhou, Guangdong.

Applicant before: GUANGZHOU XPENG AUTOMOBILE TECHNOLOGY Co.,Ltd.