CN110706496A - Acoustic-based environment sensing method and system - Google Patents

Acoustic-based environment sensing method and system

Info

Publication number
CN110706496A
CN110706496A
Authority
CN
China
Prior art keywords
vehicle
acoustic signal
environment
surrounding environment
motion state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910483907.0A
Other languages
Chinese (zh)
Inventor
张若天
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yunke Technology Co Ltd
Original Assignee
Beijing Yunke Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Yunke Technology Co Ltd filed Critical Beijing Yunke Technology Co Ltd
Publication of CN110706496A

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G08G1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 - Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766 - Systems involving transmission of highway information, where the system is characterised by the origin of the information transmission
    • G08G1/096791 - Systems involving transmission of highway information, where the origin of the information is another vehicle

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiments of the present application provide an acoustic-based environment sensing method and system, belonging to the technical field of environment sensing and autonomous control of mobile vehicles. The method comprises the following steps: after a first vehicle environment perception acoustic signal is transmitted into the surrounding environment, receiving a first vehicle environment feedback acoustic signal fed back by objects in the surrounding environment, the objects including a second vehicle; receiving a second vehicle motion state acoustic signal transmitted by the second vehicle, the signal characterizing the motion state of the second vehicle; and determining the positional relationship and motion relationship of the objects in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal. Because the two signals verify and adjust each other during fusion, the relationship of objects in the surrounding environment to the first vehicle can be determined comprehensively, and the accuracy with which the first vehicle identifies the state of those objects is improved.

Description

Acoustic-based environment sensing method and system
Technical Field
The application relates to the technical field of environmental perception and autonomous control of mobile vehicles, in particular to an environmental perception method and system based on acoustics.
Background
Vehicles are among the most important production and living tools of human society. With the progress of science and technology and the growing demands for comfort, convenience and diversity in work and life, ever higher requirements are placed on vehicles. As the technologies of intelligent automobiles and other vehicles continue to develop, various automatic driving technologies are advancing as well. In the development of automatic driving technology, timely identifying the positions of other vehicles, and enabling other vehicles in the same traffic environment to timely perceive one's own position, is a basic guarantee for improving the safety of assisted driving and automatic driving.
In the known prior art, target position sensing for assisted driving and automatic driving mainly relies on laser radar (lidar), visual imaging, ultrasonic radar and similar technologies. Lidar offers a long measuring distance and high precision, but its viewing angle is relatively narrow, and its use is limited under low-visibility meteorological conditions such as haze; visual imaging can distinguish moving and static targets but has no ranging function; and ultrasonic radar only works at close range. Since these sensing devices are strongly affected by the natural environment, particularly meteorological conditions, and are expensive, the traditional approach suffers from low environmental adaptability and high cost.
Disclosure of Invention
In view of the above, an object of the embodiments of the present application is to provide an acoustic-based environment sensing method and system, so as to solve the problems of low system environment adaptability and high cost in the prior art.
The embodiment of the application provides an environment sensing method based on acoustics, which is applied to a first carrier and comprises the following steps: after a first vehicle environment perception acoustic signal is transmitted to the surrounding environment, a first vehicle environment feedback acoustic signal fed back by at least one object in the surrounding environment is received, wherein the at least one object comprises a second vehicle; receiving a second vehicle motion state acoustic signal transmitted by the second vehicle, wherein the second vehicle motion state acoustic signal is used for representing the motion state of the second vehicle; and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal.
In the above implementation, the positional relationship and motion relationship of objects in the surrounding environment relative to the first vehicle are determined by fusing the first vehicle environment feedback acoustic signal with the second vehicle motion state acoustic signal sent by the second vehicle. Because the two signals verify and adjust each other during fusion, the accuracy and precision with which the first vehicle identifies the state of objects in the surrounding environment are improved, as is the reliability of position and motion-state measurement.
Further, the first vehicle environmental perception acoustic signal includes a first vehicle acoustic signal and a first vehicle ultrasonic signal, the first vehicle environmental feedback acoustic signal includes a first vehicle acoustic feedback signal and a first vehicle ultrasonic feedback signal, and after the first vehicle environmental perception acoustic signal is transmitted to the surrounding environment, the first vehicle environmental feedback acoustic signal fed back by at least one object in the surrounding environment is received, including: transmitting the first vehicle acoustic signal and the first vehicle ultrasonic signal to a surrounding environment; receiving the first vehicle acoustic feedback signal and the first vehicle ultrasonic feedback signal reflected back by the at least one object in the surrounding environment.
In the above implementation, the first vehicle acoustic feedback signal and the first vehicle ultrasonic feedback signal are received simultaneously for subsequent calculation. Based on the different transmission characteristics of sound waves at different frequencies, information about environmental targets far from the first vehicle is obtained from the lower-frequency acoustic feedback signal, and information about targets close to the first vehicle is obtained from the higher-frequency ultrasonic feedback signal, improving the accuracy of the acquired target information.
Further, before the determining the positional relationship and the kinematic relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal, the method further comprises: receiving a second environment perception acoustic signal and environment noise transmitted by the second vehicle; the determining the position relationship and the motion relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal comprises: and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal, the second vehicle motion state acoustic signal, the second environment perception acoustic signal and the environmental noise.
In the implementation process, the second environment perception acoustic signal emitted by the second carrier and the environmental noise of the surrounding environment are integrated to identify the objects such as the carriers in the surrounding environment, so that the identification accuracy of the position relation and the motion relation is improved.
Further, before the determining the positional relationship and the kinematic relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal, the method further comprises: collecting image information of surrounding environment; the determining the position relationship and the motion relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal comprises: and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first carrier based on the first carrier environment feedback acoustic signal, the second carrier motion state acoustic signal and the image information.
In the implementation process, on the basis of the original acoustic signal, the image information is used as reference information to identify objects such as vehicles in the surrounding environment, and the identification accuracy of the position relation and the motion relation is improved.
Further, the determining the position relationship and the motion relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal includes: obtaining a difference value of the first vehicle environment perception acoustic signal and the first vehicle environment feedback acoustic signal on a characteristic, wherein the characteristic comprises at least one of time, space, frequency and amplitude; and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the difference value and the acoustic signal of the motion state of the second vehicle.
In the implementation process, the position and the relative motion state of the object in the surrounding environment are determined based on the difference values of the first vehicle environment perception acoustic signal and the first vehicle environment feedback acoustic signal in time, space, frequency and amplitude, so that the identification accuracy is further improved.
Further, after the determining the positional relationship and the kinematic relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal, the method further comprises: and generating a vehicle running environment simulation image based on the position relation and the motion relation, and displaying the vehicle running environment simulation image.
In the implementation process, the vehicle driving environment simulation image is displayed according to the relevant information, so that a driver can know the current surrounding environment more conveniently and clearly, and the driving safety is improved.
Further, after the determining the positional relationship and the kinematic relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal, the method further comprises: judging whether at least one object in the surrounding environment causes an obstacle to the safe driving of the first vehicle or not based on the position relation and the motion relation; determining the direction and the distance of a target object relative to the first vehicle when the target object causing an obstacle to the safe driving of the first vehicle exists in at least one object in the surrounding environment; and sending prompt information for prompting the existence of the target object and the direction and distance of the target object through a corresponding visual display module and/or a sound prompting module.
In the implementation process, whether an obstacle obstructing safe driving exists or not is automatically judged, and when the obstacle of the type exists, the obstacle is prompted to a driver through images or sound, so that the driving safety of the vehicle is improved.
Further, after the determining the positional relationship and the kinematic relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal, the method further comprises: and generating a safe driving plan based on the position relation and the motion relation, and performing auxiliary driving or automatic driving according to the safe driving plan.
Further, after the positional relationship and motion relationship of the at least one object relative to the first vehicle are determined based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal, whether the first vehicle transmits a state information acoustic signal into the environment is decided according to the distance of the at least one object from the first vehicle.
In the implementation process, the vehicle is controlled to carry out auxiliary driving or automatic driving based on the determined relative position relation and motion relation, so that the safety and the automation degree of the auxiliary driving or the automatic driving of the vehicle are improved.
An embodiment of the present application further provides an environment sensing system based on acoustics, and the system includes: the loudspeaker array module is used for transmitting a first vehicle environment perception acoustic signal to the surrounding environment; the microphone array module is used for receiving a first vehicle environment feedback acoustic signal fed back by at least one object in the surrounding environment, wherein the at least one object comprises a second vehicle; the microphone array module is further configured to receive a second vehicle motion state acoustic signal emitted by the second vehicle, where the second vehicle motion state acoustic signal is used to characterize a motion state of the second vehicle; and the data processing module is used for determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal.
Further, the first vehicle environmental perception acoustic signal includes a first vehicle acoustic signal and a first vehicle ultrasonic signal, the first vehicle environmental feedback acoustic signal includes a first vehicle acoustic feedback signal and a first vehicle ultrasonic feedback signal, the acoustic-based environmental perception system further includes an ultrasonic sensor array module, and the speaker array is specifically configured to: transmitting the first vehicle acoustic signal to a surrounding environment; the ultrasonic sensor array module is used for transmitting the first vehicle ultrasonic signal to the surrounding environment; the microphone array module is specifically configured to: receiving the first vehicle acoustic feedback signal reflected back by the at least one object in the surrounding environment; the ultrasonic sensor array module is further configured to receive the first vehicle ultrasonic feedback signal reflected back by the at least one object in the surrounding environment.
Further, the microphone array module is further configured to: and receiving a second environment perception acoustic signal and environment noise transmitted by the second vehicle. The data processing module is specifically configured to: and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal, the second vehicle motion state acoustic signal, the second environment perception acoustic signal and the environmental noise.
Further, the system further comprises an image acquisition module for acquiring image information of the surrounding environment. The data processing module is specifically configured to: and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first carrier based on the first carrier environment feedback acoustic signal, the second carrier motion state acoustic signal and the image information.
Further, the data processing module comprises: a difference value calculation unit, configured to obtain a difference value between the first vehicle environment sensing acoustic signal and the first vehicle environment feedback acoustic signal in a feature, where the feature includes at least one of time, space, frequency, and amplitude; a relative relationship determination unit, configured to determine a positional relationship and a motion relationship of the at least one object in a surrounding environment with respect to the first vehicle based on the difference value and the acoustic signal of the motion state of the second vehicle.
Further, the system further comprises an information display module, and the data processing module is further used for generating a vehicle running environment simulation image based on the position relation and the motion relation; the information display module is used for displaying the vehicle running environment simulation image.
Further, the data processing module is further configured to determine, based on the positional relationship and the motion relationship, whether objects in the surrounding environment obstruct the safe driving of the first vehicle, and, when a target object obstructing safe driving exists among the at least one object, to generate prompt information indicating the existence, direction and distance of the target object; the information display module is further configured to issue this prompt information through a corresponding visual display module and/or a sound prompting module.
Further, the data processing module is further configured to generate a safe driving plan based on the position relationship and the motion relationship, and perform auxiliary driving or automatic driving according to the safe driving plan.
The embodiment of the application further provides a carrier, the carrier includes the above-mentioned acoustics-based environment perception system, the environment perception system is used for assisting the carrier to carry out assistant driving or automatic driving.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the embodiments will be briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting the scope; those skilled in the art can obtain other related drawings from these drawings without inventive effort.
Fig. 1 is a schematic flowchart of an acoustic-based environment sensing method according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a position relationship and motion relationship determining step according to an embodiment of the present disclosure;
fig. 3 is a flowchart of target location identification according to an embodiment of the present application;
fig. 4 is a schematic flowchart of a prompt step provided in the embodiment of the present application;
fig. 5 is a schematic flow chart of obstacle determination according to an embodiment of the present application;
fig. 6 is a block diagram of an acoustic-based environmental perception system according to an embodiment of the present disclosure.
Reference numerals: 20 - acoustic-based environment perception system; 21 - speaker array module; 22 - microphone array module; 23 - data processing module; 24 - ultrasonic sensor array module; 25 - image acquisition module; 26 - information display module; 27 - monitoring microphone module.
Detailed Description
The technical solution in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
The applicant has found through research that, as intelligent-automobile technologies such as Advanced Driver Assistance Systems (ADAS), partial automation, high automation and fully driverless operation continue to develop, the requirements on a vehicle for locating objects in the surrounding environment and judging their motion states grow ever higher. In the prior art, a passive acoustic detection system is generally used to judge the positions of obstacles and other vehicles around a vehicle, while related information such as their motion states is calculated from radar information. However, the prior art performs environment sensing only through the sound waves the vehicle emits and the reflected sound waves it receives; when the vehicle's radar or related components fail, an accurate environment sensing result cannot be obtained, so accuracy and reliability are low, and the accuracy may not meet current automatic driving requirements.
To solve the above problems, embodiments of the present application provide an acoustic-based environment sensing method. The execution subject of the method may be a computing device such as a computer, a mobile terminal or a cloud processor; optionally, it may be an integrated processing device disposed on a vehicle.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an acoustic-based environmental sensing method according to an embodiment of the present disclosure. The specific steps of the environment sensing method can be as follows:
step S12: after a first vehicle environment perception acoustic signal is transmitted to the surrounding environment, a first vehicle environment feedback acoustic signal fed back by at least one object in the surrounding environment is received, and the at least one object comprises a second vehicle.
The first vehicle environment perception acoustic signal in this embodiment may be an acoustic signal in the audible band (all or part of the band from 50 Hz to 20 kHz), or it may be ultrasonic (with a frequency from 20 kHz to 80 kHz); the first vehicle environment feedback acoustic signal is the acoustic signal reflected back by objects around the first vehicle.
Considering that objects within a certain range around the first vehicle may obstruct its driving, the at least one object in the surrounding environment includes not only the second vehicle and other vehicles around the first vehicle, but also all objects near the first vehicle, such as stones, fences and lampposts.
Step S14: and receiving a second vehicle motion state acoustic signal transmitted by the second vehicle, wherein the second vehicle motion state acoustic signal is used for representing the motion state of the second vehicle.
The second vehicle motion state acoustic signal in this embodiment may represent different motion states of the vehicle through different characteristics of the acoustic signal in time delay, spatial distribution, frequency range and amplitude variation. For example, and without limitation: a single-frequency long pulse may represent uniform motion; a frequency-modulated pulse sweeping from low to high, accelerated motion; a frequency-modulated pulse sweeping from high to low, decelerated motion; a low-high-low pulse, a left turn; a high-low-high pulse, a right turn; and a train of single-frequency short pulses, reversing.
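As an illustrative sketch of the pulse scheme described above (the sample rate, pulse durations and the 6-10 kHz band are all hypothetical choices for the example, not values taken from this application), the motion-state waveforms could be generated as follows:

```python
import numpy as np

FS = 48_000  # sample rate in Hz; an assumption for illustration only

def chirp_pulse(f_start, f_end, duration=0.1):
    """One motion-state pulse: a linear chirp from f_start to f_end (Hz)."""
    t = np.arange(int(FS * duration)) / FS
    freq = np.linspace(f_start, f_end, t.size)       # instantaneous frequency
    phase = 2 * np.pi * np.cumsum(freq) / FS         # integrate frequency -> phase
    return np.sin(phase)

# Hypothetical mapping of motion states to pulse shapes, mirroring the scheme
# in the text: single tone = uniform, up-chirp = accelerating, and so on.
MOTION_PULSES = {
    "uniform":    chirp_pulse(8_000, 8_000, duration=0.3),   # single-frequency long pulse
    "accelerate": chirp_pulse(6_000, 10_000),                # low-to-high FM pulse
    "decelerate": chirp_pulse(10_000, 6_000),                # high-to-low FM pulse
    "left_turn":  np.concatenate([chirp_pulse(6_000, 6_000, 0.05),
                                  chirp_pulse(10_000, 10_000, 0.05),
                                  chirp_pulse(6_000, 6_000, 0.05)]),   # low-high-low
    "right_turn": np.concatenate([chirp_pulse(10_000, 10_000, 0.05),
                                  chirp_pulse(6_000, 6_000, 0.05),
                                  chirp_pulse(10_000, 10_000, 0.05)]), # high-low-high
}
```

Each entry is a ready-to-play waveform; an actual system would additionally shape, window and schedule these pulses according to the ambient conditions discussed later.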
The specific motion state information of the second vehicle may be extracted from a vehicle control system of the second vehicle, which may be, but is not limited to: start/stop, forward/reverse, acceleration/deceleration, left/right turn, and speed information at that time.
Alternatively, the present embodiment may add the motion state information of the second vehicle or other related information to the sound wave signal by using FSK (Frequency-shift Keying), PSK (Phase shift Keying), OFDM (Orthogonal Frequency division Multiplexing), or other pulse coding techniques.
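To make the FSK option above concrete, here is a minimal binary-FSK sketch; the 7/9 kHz tone pair, baud rate and 4-bit payload are invented for the example and are not specified by this application:

```python
import numpy as np

FS = 48_000            # sample rate (Hz), assumed
BAUD = 100             # symbols per second, assumed
F0, F1 = 7_000, 9_000  # assumed tone frequencies for bit 0 / bit 1

def fsk_modulate(bits):
    """Binary FSK: each bit becomes a short tone at F0 (for 0) or F1 (for 1)."""
    n = FS // BAUD
    t = np.arange(n) / FS
    return np.concatenate([np.sin(2 * np.pi * (F1 if b else F0) * t) for b in bits])

def fsk_demodulate(signal):
    """Recover bits by comparing per-symbol correlation energy at F0 vs F1."""
    n = FS // BAUD
    t = np.arange(n) / FS
    bits = []
    for i in range(0, len(signal), n):
        chunk = signal[i:i + n]
        e0 = abs(np.sum(chunk * np.exp(-2j * np.pi * F0 * t[:len(chunk)])))
        e1 = abs(np.sum(chunk * np.exp(-2j * np.pi * F1 * t[:len(chunk)])))
        bits.append(1 if e1 > e0 else 0)
    return bits

# A hypothetical 4-bit state word survives the modulate/demodulate round trip.
payload = [0, 1, 0, 1]
recovered = fsk_demodulate(fsk_modulate(payload))
```

PSK or OFDM would replace the tone mapping with phase shifts or parallel subcarriers, but the encode/decode structure is analogous.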
Specifically, the other related information may include remaining fuel amount of the second vehicle, tire pressure, engine temperature, and other information related to any vehicle.
It should be understood that the first vehicle may also send the first vehicle motion state acoustic signal carrying the position or motion state information of the first vehicle to the surrounding environment during the driving process.
Step S16: and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal.
The first vehicle can determine the distance and relative direction of an object from the changes in characteristic values such as amplitude and frequency of the first vehicle environment feedback acoustic signal reflected by that object, relative to the first environment perception acoustic signal. It can further determine the object's motion state, such as speed and acceleration, by combining its own speed with the changes in the object's distance and relative direction over time, thereby determining the motion relationship of the object relative to the first vehicle.
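The two basic quantities in the paragraph above, range from echo delay and radial speed from the Doppler shift of the echo, can be sketched as follows (the 343 m/s speed of sound and the example numbers are standard textbook values, not figures from this application):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C; varies with temperature

def echo_distance(round_trip_s):
    """Range from pulse-to-echo delay: the signal travels out and back."""
    return SPEED_OF_SOUND * round_trip_s / 2

def doppler_radial_speed(f_emitted, f_received):
    """Closing speed of a reflector from the Doppler shift of its echo.
    A target moving toward the emitter shifts the echo upward in frequency:
    f_received = f_emitted * (c + v) / (c - v), solved here for v."""
    return SPEED_OF_SOUND * (f_received - f_emitted) / (f_received + f_emitted)

# Example: an echo arriving 58.3 ms after emission puts the object about 10 m away.
d = echo_distance(0.0583)
```

Repeating these estimates over successive pulses gives the change of distance and direction with time from which acceleration is inferred.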
Furthermore, the first vehicle may analyze and obtain motion state related information, such as a position, a speed, an acceleration, and the like of the second vehicle carried by the second vehicle motion state acoustic signal, and determine the position and the motion state of the second vehicle relative to the motion state of the first vehicle.
It should be understood that, in addition to the second vehicle, the first vehicle may also determine the relative position and motion state of a plurality of other vehicles in the surrounding environment based on the first vehicle environmental feedback acoustic signals and other motion state acoustic signals reflected back by the other vehicles.
In this embodiment, the first vehicle determines the positional relationship and motion relationship of objects in the surrounding environment relative to itself by fusing the first vehicle environment feedback acoustic signal with the second vehicle motion state acoustic signal sent by the second vehicle. Because the two signals verify and adjust each other during fusion, the accuracy and precision of identifying the state of surrounding objects are improved, as is the reliability of the position and motion-state measurement.
For step S12, the first vehicle environment perception acoustic signal may include a first vehicle acoustic signal and a first vehicle ultrasonic signal, and correspondingly, the first vehicle environment feedback acoustic signal includes a first vehicle acoustic feedback signal and a first vehicle ultrasonic feedback signal, and after the first vehicle environment perception acoustic signal is transmitted to the surrounding environment, the first vehicle environment feedback acoustic signal fed back by the object in the surrounding environment is received, which may include: transmitting the first vehicle acoustic signal and the first vehicle ultrasonic signal to a surrounding environment; receiving the first vehicle acoustic feedback signal and the first vehicle ultrasonic feedback signal reflected back by the at least one object in the surrounding environment.
Further, the first vehicle acoustic signal may be an acoustic signal audible to the human ear, so that it can also serve as a reminder to pedestrians.
In this embodiment, the first vehicle acoustic feedback signal and the first vehicle ultrasonic feedback signal are received simultaneously for subsequent calculation. Based on the different transmission characteristics of sound waves at different frequencies, information about environmental targets far from the first vehicle is obtained from the lower-frequency acoustic feedback signal, and information about targets close to the first vehicle is obtained from the higher-frequency ultrasonic feedback signal, which improves the accuracy of the acquired target information and extends the environment sensing range.
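The reason the audible band reaches farther than ultrasound is that atmospheric absorption grows steeply with frequency. A rough back-of-the-envelope sketch (the absorption figures are order-of-magnitude illustrations for roughly 20 degrees C and 50 % humidity, not data from this application):

```python
# Approximate atmospheric absorption in dB per metre; illustrative values only.
ABSORPTION_DB_PER_M = {
    1_000:  0.005,   # 1 kHz audible tone: absorption is tiny, signal carries far
    10_000: 0.1,     # 10 kHz: noticeably lossier
    40_000: 1.3,     # 40 kHz ultrasound: fades within tens of metres
}

def usable_range(freq_hz, budget_db=60.0):
    """One-way distance at which absorption alone consumes the link budget."""
    return budget_db / ABSORPTION_DB_PER_M[freq_hz]
```

With a 60 dB budget, the 1 kHz tone nominally survives kilometres of absorption while the 40 kHz tone is limited to roughly 45 m, which is why the lower band is used for distant targets and ultrasound for near ones.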
In step S14, when the second vehicle generates the second vehicle motion state acoustic signal, it may perform time-frequency analysis of the acoustic signals collected from its surrounding environment in different directions (such as front, rear, left and right) to obtain how the amplitude at different frequencies varies over time in each direction. It then selects, in each direction, a narrow frequency band whose amplitude remains small over time, and generates the second vehicle motion state acoustic signal within that band.
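The band-selection step described above could be sketched as follows. This is an illustrative sketch only: the framing, Hann window and equal-width bands are our assumptions, since the embodiment only requires picking a narrow band whose amplitude stays small over time.

```python
import numpy as np

def quietest_band(samples, n_bands=8, frame=1024):
    """Return the index of the frequency band whose amplitude stays
    smallest over time, as a candidate band for the second vehicle
    motion state acoustic signal (sketch; samples from one direction)."""
    n_frames = len(samples) // frame
    frames = samples[: n_frames * frame].reshape(n_frames, frame)
    # short-time spectra: magnitude of the windowed FFT of each frame
    spectra = np.abs(np.fft.rfft(frames * np.hanning(frame), axis=1))
    edges = np.linspace(0, spectra.shape[1], n_bands + 1, dtype=int)
    band_power = [spectra[:, edges[i]:edges[i + 1]].mean()
                  for i in range(n_bands)]
    return int(np.argmin(band_power))
```

A second vehicle would run this per direction and place its motion state signal inside the returned band, where it competes least with ambient sound.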
Further, the second vehicle motion state acoustic signal needs to propagate to the first vehicle and be reliably detected by it. For example, the specific criterion may be as follows: the second vehicle calculates the current ambient noise level as x0 dB from the collected ambient noise; given a distance r0 between the first vehicle and the second vehicle, and requiring that the signal-to-noise ratio of the second vehicle motion state acoustic signal detected at the first vehicle be no lower than z0 dB, the emission sound level of the second vehicle motion state acoustic signal is x0 + z0 + 20 × lg(r0).
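The emission-level criterion above can be sketched as a short calculation. The function name is ours, and the 20 × lg(r0) term is read here as a free-field spherical-spreading loss, which the patent text implies but does not state.

```python
import math

def required_emission_level(ambient_noise_db, distance_m, min_snr_db):
    """Emission sound level (dB) needed so that the second vehicle's
    motion state acoustic signal still reaches the required SNR at the
    first vehicle, using the x0 + z0 + 20*lg(r0) criterion above."""
    spreading_loss_db = 20 * math.log10(distance_m)
    return ambient_noise_db + min_snr_db + spreading_loss_db

# e.g. x0 = 60 dB noise, r0 = 10 m, z0 = 6 dB -> 60 + 6 + 20 = 86 dB
```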
In a daily driving environment, besides the first vehicle environment feedback acoustic signal, the first vehicle also receives environmental noise and signals from other vehicles, such as a second vehicle environment perception acoustic signal emitted by the second vehicle. Taking the frequency and amplitude characteristics of the environmental noise and the second vehicle environment perception acoustic signal in space and time as references, the first vehicle can generate an acoustic signal that adapts in real time to these non-first-vehicle signals, differs clearly from them in frequency and amplitude over space and time, and at the same time meets the applicable noise regulations and standards; this signal serves as the first vehicle environment perception acoustic signal.
Specifically, before step S16, the method further includes: and receiving a second vehicle environment perception acoustic signal and environmental noise transmitted by the second vehicle. Next, step S16 includes: and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal, the second vehicle motion state acoustic signal, the second vehicle environment perception acoustic signal and the environmental noise.
The environmental noise may be wind noise, construction noise, human voices or other noise; the first vehicle can identify the environmental noise and determine the relative position relationship and motion relationship between its sound source and the first vehicle.
Further, in addition to the acoustic signals, this embodiment may also acquire image information of the surrounding environment of the first vehicle and use it as reference data when calculating the position and motion state of objects in the surrounding environment, thereby further improving the calculation accuracy and reliability. Specifically, before step S16, the method further includes: acquiring image information of the surrounding environment. Accordingly, step S16 includes: determining the position relationship and the motion relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal, the second vehicle motion state acoustic signal and the image information.
The image information may be a black-and-white or color digital image acquired by the first vehicle, with a resolution meeting the identification requirement.
It should be understood that, when the first vehicle in the embodiment calculates the position and the motion state of the object in the surrounding environment, the reference data may include at least one of the first vehicle environment feedback acoustic signal, the second vehicle motion state acoustic signal, the second vehicle environment perception acoustic signal, the environmental noise, the image information, and the like.
In this embodiment, on the basis of the original acoustic signals, objects such as vehicles in the surrounding environment are identified using the image information as reference information, which improves the accuracy of identifying the position relationship and the motion relationship.
Regarding step S16, please refer to fig. 2, which is a schematic flow chart of the step of determining the position relationship and the motion relationship according to an embodiment of the present application.
Step S161: obtaining a difference value of the first vehicle environment perception acoustic signal and the first vehicle environment feedback acoustic signal in a characteristic, wherein the characteristic comprises at least one of time, space, frequency and amplitude.
It should be understood that the above features may be other types of features besides time, space, frequency, amplitude, etc., and may also be features calculated and transformed based on time, space, frequency, and amplitude.
Step S162: and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the difference value and the acoustic signal of the motion state of the second vehicle.
As the propagation distance increases, the time from emitting the acoustic signal to receiving its reflection at the first vehicle grows longer and the amplitude attenuation grows larger; likewise, the greater the speed of the second vehicle relative to the first vehicle, the more pronounced the frequency shift of the reflected first vehicle environment feedback acoustic signal. Therefore, the position relationship and the motion relationship of objects in the surrounding environment relative to the first vehicle can be calculated from the differences between the first vehicle environment perception acoustic signal and the first vehicle environment feedback acoustic signal in time, space, frequency and amplitude. In this embodiment, determining the position and relative motion state of objects in the surrounding environment from these differences further improves the recognition accuracy.
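A minimal sketch of turning the time and frequency differences into range and radial speed (with the amplitude and spatial differences serving as cross-checks) might look like this; the two-way Doppler approximation, default sound speed and function name are our assumptions, not terms from the patent.

```python
def range_and_speed(t_emit, t_echo, f_emit, f_echo, c=343.0):
    """Estimate target range (m) and radial speed (m/s) from the echo
    delay and Doppler shift between the first vehicle environment
    perception acoustic signal and its environment feedback signal."""
    delay = t_echo - t_emit
    distance = c * delay / 2.0  # signal travels to the target and back
    # two-way Doppler for a reflecting target: f_echo ~= f_emit * (1 + 2*v/c)
    speed = c * (f_echo - f_emit) / (2.0 * f_emit)  # > 0: approaching
    return distance, speed
```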
Further, please refer to fig. 3, which is a flowchart of target positioning and identification according to an embodiment of the present application. In this embodiment, the process of target positioning and identification based on the first vehicle acoustic signal, the first vehicle ultrasonic signal and the image information may be:
(1) emitting the first vehicle acoustic signal, obtaining the first vehicle acoustic feedback signal corresponding to each direction around the vehicle using the delay-and-sum method, matching the feedback signal in each direction against the emitted signal, and obtaining features such as echo arrival delay, sound pressure amplitude attenuation and frequency offset of the first vehicle acoustic feedback signal relative to the first vehicle acoustic signal in each direction;
(2) emitting the first vehicle ultrasonic signal, obtaining the first vehicle ultrasonic feedback signal corresponding to each direction around the vehicle using the delay-and-sum method, matching the feedback signal in each direction against the emitted signal, and obtaining features such as echo arrival delay, sound pressure amplitude attenuation and frequency offset of the first vehicle ultrasonic feedback signal relative to the first vehicle ultrasonic signal in each direction;
(3) acquiring a real-time panoramic image of the environment surrounding the vehicle, and determining the spatial distribution of the shape features and texture features of the image at different times and in different directions;
(4) correlating the features of the first vehicle acoustic feedback signal, the features of the first vehicle ultrasonic feedback signal and the temporal evolution of the spatial distribution of the image, to obtain joint acoustic-image features of each environmental target;
(5) classifying and identifying the environmental targets by feature similarity, based on the joint acoustic-image features and an environmental target feature sample library, which may be a pre-established database containing environmental target feature samples;
(6) for environmental target 1, close to the vehicle, calculating its position, motion speed and motion direction relative to the vehicle from the features of the first vehicle ultrasonic feedback signal in its direction; for environmental target 2, far from the vehicle, calculating its position, motion speed and motion direction relative to the vehicle from the features of the first vehicle acoustic feedback signal in its direction.
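The per-direction beamforming used in (1) and (2) can be sketched minimally as below; whole-sample steering delays and the circular shift are simplifications for illustration, and a real implementation would interpolate fractional delays and zero-pad.

```python
import numpy as np

def delay_and_sum(mic_signals, delays_samples):
    """Minimal delay-and-sum beamformer: advance each microphone
    channel by its steering delay and average the channels, so that
    echoes arriving from the steered direction add coherently."""
    out = np.zeros(len(mic_signals[0]))
    for sig, d in zip(mic_signals, delays_samples):
        out += np.roll(sig, -d)  # circular shift for brevity
    return out / len(mic_signals)
```

Steering the array over a grid of directions and keeping the output with the strongest matched echo yields the per-direction feedback signals the steps above operate on.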
In order to enhance traffic safety, the present embodiment may further generate a vehicle driving environment simulation image based on the positional relationship and the motion relationship and display the vehicle driving environment simulation image after performing step S16.
The driving environment simulation image may be displayed on a screen of the first vehicle (for example, the central control display screen of an automobile). It may be rendered on top of the actual image information acquired from the surrounding environment, with the position and motion state of each object displayed at the corresponding location; for example, the label "second vehicle" is shown beside the second vehicle in the driving environment simulation image, together with annotations such as "speed: 59 km/h" and "distance: 51 m".
In the embodiment, the vehicle driving environment simulation image is displayed according to the related information, so that the driver can know the current surrounding environment more conveniently and clearly, and the driving safety is improved.
Considering that the traffic safety is greatly influenced by the timeliness of the information acquired by the driver in the driving process, the acoustic-based environment sensing method of the embodiment may further include a prompting step. Referring to fig. 4, fig. 4 is a schematic flow chart illustrating a prompting step according to an embodiment of the present disclosure. The prompting steps may specifically be as follows:
step S181: and judging whether at least one object in the surrounding environment causes an obstacle to the safe driving of the first vehicle or not based on the position relation and the motion relation.
In this embodiment, the obstacle determination may be based on at least one of the distance between the target object and the first vehicle, the relative movement speed of the target object, and the relative movement direction of the target object. For example, the first vehicle identifies and locates a target object in the surrounding environment from the acquired real-time panoramic image; it also locates the target object through the feature differences between the first vehicle acoustic signal and the first vehicle acoustic feedback signal. Correlating the two sets of positioning data in time and space further improves the positioning accuracy of the target object and determines its motion state relative to the first vehicle; the threat degree of the target object to the first vehicle is then evaluated according to that motion state.
As an alternative implementation, the flow of step S181 in this embodiment may be as shown in fig. 5.
Step S182: when a target object causing an obstacle to safe driving of the first vehicle exists in at least one object in the surrounding environment, determining an obstacle direction of the target object.
The obstacle direction may generally include, but is not limited to, left, right, oblique rear, front and rear.
Step S183: and sending prompt information for prompting the obstacle direction through a corresponding visual display module and/or a sound prompting module. Taking an automobile as an example, the visual display module may be a windshield display, a central control display screen, an interior rearview mirror display, an exterior rearview mirror display, etc., and the sound reminding module may be speakers distributed in various directions in the automobile.
In this embodiment, the prompt information may be an image or voice message that indicates the obstacle direction; for example, the sound reminding module directly issues the voice prompt "an obstacle exists behind". Furthermore, prompts for obstacles in different directions may be issued through visual display modules or sound reminding modules at different positions in the first vehicle: if the target object affecting the driving safety of the first vehicle approaches from the two sides or the oblique rear, the prompt information is displayed on the exterior rearview mirror displays; if it approaches from the rear, on the interior rearview mirror display; and if it approaches from the front, on the windshield display.
Optionally, in addition to sending prompt information to the driver, this embodiment may also perform different safety operations according to the current distance between the target object and the first vehicle. For example, a safe distance (beyond which the target object poses no threat to the first vehicle) and a threat distance (within which it does) may be set. When the distance between the first vehicle and the target object is greater than the safe distance, the first vehicle drives normally or simply continues to track the target object. When the distance is less than the safe distance but greater than the threat distance, the first vehicle emits a first vehicle motion state acoustic signal toward the target object. When the distance is less than the threat distance, the first vehicle generates a motion direction and speed command that moves it away from the target object, based on the target object's current distance, motion direction and motion speed relative to the first vehicle, and drives according to that command.
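The distance-based safety operations above map naturally onto a small decision function; the tier labels below are illustrative names of ours, not terms from the patent.

```python
def safety_action(distance, safe_distance, threat_distance):
    """Choose a response tier from the current distance to the target
    object, following the safe-distance / threat-distance scheme above."""
    if distance > safe_distance:
        return "normal_driving"            # continue, keep tracking
    if distance > threat_distance:
        return "emit_motion_state_signal"  # warn the target object
    return "evasive_maneuver"              # command motion away from target
```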
In the present embodiment, it is automatically determined whether or not there is an obstacle that impedes safe driving, and when there is an obstacle of this type, the driver is presented with an image or sound, thereby improving the vehicle driving safety.
In recent years, with the development of automobile intelligence, assisted driving and automatic driving have gradually entered real traffic. Accordingly, after step S16, this embodiment may further include the following step: generating a safe driving plan based on the position relationship and the motion relationship, and performing assisted driving or automatic driving according to the safe driving plan.
Through the steps, the vehicle is controlled to carry out auxiliary driving or automatic driving based on the determined relative position relation and motion relation, and the safety and the automation degree of the auxiliary driving or the automatic driving of the vehicle are improved.
To better implement the acoustic-based environment sensing method provided by this embodiment, this embodiment further provides an acoustic-based environment sensing system 20. Please refer to fig. 6, which is a structural block diagram of the acoustic-based environment sensing system provided by this embodiment.
The above-described acoustic-based environment sensing system 20 includes a speaker array module 21, a microphone array module 22, and a data processing module 23.
A speaker array module 21 for emitting a first vehicle environment-perceived acoustic signal to a surrounding environment;
the microphone array module 22 is configured to receive a first vehicle environment feedback acoustic signal fed back by at least one object in the surrounding environment, where the at least one object includes a second vehicle.
The microphone array module 22 is further configured to receive a second vehicle motion state acoustic signal emitted by the second vehicle, where the second vehicle motion state acoustic signal is used to characterize a motion state of the second vehicle.
Optionally, the speaker array module 21 and the microphone array module 22 are arranged as a speaker array and a microphone array at the front, the sides (left and right) and the rear of the vehicle; the specific arrangement may vary with the shape of the vehicle. The speaker array module 21 and the microphone array module 22 are each connected to the data processing module 23.
The speaker array module 21 is specifically configured to emit the first vehicle acoustic signal to the surrounding environment, and the microphone array module 22 is specifically configured to receive the first vehicle acoustic feedback signal reflected back by at least one object in the surrounding environment.
Further, the environmental awareness system 20 further includes: an ultrasonic sensor array module 24, configured to transmit the first vehicle ultrasonic signal to the surrounding environment, and further configured to receive the first vehicle ultrasonic feedback signal reflected by at least one object in the surrounding environment.
The microphone array module 22 is further configured to receive the second vehicle environment perception acoustic signal emitted by the second vehicle, as well as environmental noise.
The context awareness system 20 in this embodiment may further comprise an image acquisition module 25 for acquiring image information of the surrounding environment. Correspondingly, the data processing module 23 may be specifically configured to determine a position relationship and a motion relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal, the second vehicle motion state acoustic signal, and the image information.
The image capturing module 25 may be a camera array disposed at the front, the sides (left and right) and the rear of the vehicle; the specific arrangement varies with the shape of the vehicle, so as to obtain video image information around the vehicle in real time.
The environmental sensing system 20 in this embodiment may further include an information display module 26, the data processing module 23 is configured to generate a vehicle driving environment simulation image based on the position relationship and the motion relationship, and the information display module 26 is configured to display the vehicle driving environment simulation image.
The data processing module 23 in this embodiment is further configured to determine whether at least one object in the surrounding environment causes an obstacle to the safe driving of the first vehicle based on the position relationship and the motion relationship; when a target object causing an obstacle to safe driving of the first vehicle exists in at least one object in the surrounding environment, determining an obstacle direction of the target object. The information display module 26 is configured to send out prompt information for prompting the obstacle direction through a corresponding visual display module and/or an audio prompt module.
The information display module 26 in this embodiment is configured to display information transmitted from the data processing module 23, and the information display module 26 may include a windshield display, a central control display screen, an inside rearview mirror display, an outside rearview mirror display, a voice prompt, and the like.
The environment sensing system 20 may further include a monitoring microphone module 27 for collecting the first vehicle environment perception acoustic signal and the first vehicle motion state acoustic signal emitted by the speaker array module 21, so that the data processing module 23 can compare them with the expected signals to check whether the signals are emitted correctly.
The data processing module 23 in this embodiment may also serve as a computation module that comprehensively utilizes the system's environment perception information to perform threat target identification, risk assessment and driving strategy computation, operating on the basis of a control planning unit; it is specifically configured to generate a safe driving plan based on the position relationship and the motion relationship, and to perform assisted driving or automatic driving according to the safe driving plan.
The data processing module 23 is specifically configured to determine a position relationship and a motion relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal.
The data processing module 23 is specifically configured to determine a position relationship and a motion relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal, the second vehicle motion state acoustic signal, the second environment perception acoustic signal, and the environmental noise.
Further, the data processing module 23 may include: the difference value calculation unit is used for obtaining the difference values of the first vehicle environment perception acoustic signal and the first vehicle environment feedback acoustic signal in time, space, frequency and amplitude; a relative relationship determination unit, configured to determine a positional relationship and a motion relationship of the at least one object in a surrounding environment with respect to the first vehicle based on the difference value and the acoustic signal of the motion state of the second vehicle.
It should be understood that the data processing module 23 may further include an acoustic information detection unit, an image processing unit, an alert tone generation unit, an environmental awareness monitoring unit, a control planning unit, an ultrasound information processing unit, a data storage unit, and the like.
The acoustic information detection unit is a comprehensive acoustic information processing unit implemented as an independent signal/information processing board. It is used for processing acoustic signals such as the first vehicle environment feedback acoustic signal, the second vehicle motion state acoustic signal and environmental noise, and carries an independent signal processing chip, including but not limited to an ARM (Advanced RISC Machine), FPGA (Field-Programmable Gate Array), DSP (Digital Signal Processor), GPU (Graphics Processing Unit), TPU (Tensor Processing Unit) and the like.
The image processing unit is a comprehensive image information processing unit implemented as an independent signal/information processing board. It performs the corresponding processing of the image information and carries an independent signal processing chip, including but not limited to an ARM, FPGA, DSP, GPU, TPU and the like.
The warning sound generating unit is the first vehicle's encoding and generating unit for the first vehicle environment perception acoustic signal and the first vehicle motion state acoustic signal. It is an independent signal/information processing board carrying an independent signal processing chip, including but not limited to an ARM, FPGA, DSP, GPU, TPU and the like.
The environment cognition monitoring unit comprehensively utilizes the sensing signals to process information and realize target classification, identification and tracking. It is an independent information processing board used for determining the position relationship and the motion relationship of objects in the surrounding environment relative to the first vehicle, and carries an independent signal processing chip, including but not limited to an ARM, FPGA, DSP, GPU, TPU and the like.
The ultrasonic information processing unit is a comprehensive ultrasonic information processing unit implemented as an independent signal/information processing board. It is used for processing ultrasonic signals such as the first vehicle ultrasonic feedback signal, and carries an independent signal processing chip, including but not limited to an ARM, FPGA, DSP, GPU, TPU and the like.
And the control planning unit generates a safe driving plan based on the position relation and the motion relation obtained by the environment cognition monitoring unit, and performs auxiliary driving or automatic driving according to the safe driving plan.
The data storage unit is a module for transceiving, storing and managing system data, and comprises a corresponding data cache element, a storage element, a data transceiving element and a power supply element.
Further, the present embodiment also provides a vehicle, which includes the acoustic-based environmental awareness system 20, and the vehicle can perform assistant driving or automatic driving with the assistance of the acoustic-based environmental awareness system.
In summary, the embodiment of the present application provides an environmental sensing method and system based on acoustics, which are applied to a first vehicle, and the method includes: after a first vehicle environment perception acoustic signal is transmitted to the surrounding environment, a first vehicle environment feedback acoustic signal fed back by at least one object in the surrounding environment is received, wherein the at least one object comprises a second vehicle; receiving a second vehicle motion state acoustic signal transmitted by the second vehicle, wherein the second vehicle motion state acoustic signal is used for representing the motion state of the second vehicle; and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal.
In the above implementation, the position relationship and the motion relationship of objects in the surrounding environment relative to the first vehicle are determined by fusing the first vehicle environment feedback acoustic signal with the second vehicle motion state acoustic signal emitted by the second vehicle. Through the mutual verification and adjustment of the two signals during the fusion process, the accuracy and precision with which the first vehicle identifies the state of objects in the surrounding environment are improved, as is the reliability of position and motion state measurement.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The above description is only of specific embodiments of the present application, but the scope of the present application is not limited thereto; any changes or substitutions that a person skilled in the art could readily conceive of within the technical scope disclosed herein shall be covered by the scope of the present application.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.

Claims (10)

1. An acoustic-based environmental perception method applied to a first vehicle, the method comprising:
transmitting a first vehicle environment perception acoustic signal to the surrounding environment, and then receiving a first vehicle environment feedback acoustic signal fed back by at least one object in the surrounding environment, wherein the at least one object comprises a second vehicle;
receiving a second vehicle motion state acoustic signal transmitted by the second vehicle, wherein the second vehicle motion state acoustic signal is used for representing the motion state of the second vehicle;
and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal.
2. The environment sensing method of claim 1, wherein the first vehicle environment perception acoustic signal comprises a first vehicle acoustic signal and a first vehicle ultrasonic signal, the first vehicle environment feedback acoustic signal comprises a first vehicle acoustic feedback signal and a first vehicle ultrasonic feedback signal, and receiving the first vehicle environment feedback acoustic signal fed back by the at least one object in the surrounding environment after transmitting the first vehicle environment perception acoustic signal to the surrounding environment comprises:
transmitting the first vehicle acoustic signal and the first vehicle ultrasonic signal to a surrounding environment;
receiving the first vehicle acoustic feedback signal and the first vehicle ultrasonic feedback signal reflected back by the at least one object in the surrounding environment.
3. The environment sensing method of claim 1, wherein before the determining of the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal, the method further comprises:
receiving a second environment perception acoustic signal and environment noise transmitted by the second vehicle;
the determining the position relationship and the motion relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal comprises:
and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal, the second vehicle motion state acoustic signal, the second environment perception acoustic signal and the environmental noise.
4. The environment sensing method of claim 1, wherein before the determining of the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal, the method further comprises:
collecting image information of surrounding environment;
the determining the position relationship and the motion relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal comprises:
and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal, the second vehicle motion state acoustic signal and the image information.
5. The method of claim 1, wherein the determining the position relationship and the motion relationship of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal comprises:
obtaining a difference value of the first vehicle environment perception acoustic signal and the first vehicle environment feedback acoustic signal on a characteristic, wherein the characteristic comprises at least one of time, space, frequency and amplitude;
and determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the difference value and the acoustic signal of the motion state of the second vehicle.
6. The environment sensing method of any one of claims 1-5, wherein after the determining of the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal, the method further comprises:
and generating a vehicle running environment simulation image based on the position relation and the motion relation, and displaying the vehicle running environment simulation image.
7. The environment sensing method of any one of claims 1-5, wherein after the determining of the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal, the method further comprises:
judging, based on the position relation and the motion relation, whether any of the at least one object in the surrounding environment poses an obstacle to the safe driving of the first vehicle;
when a target object posing an obstacle to the safe driving of the first vehicle exists among the at least one object in the surrounding environment, determining the direction and the distance of the target object relative to the first vehicle;
and sending, through a corresponding visual display module and/or an audio prompting module, prompt information indicating the presence of the target object and the direction and distance of the target object.
8. The environment sensing method of any one of claims 1-5, wherein after the determining of the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal, the method further comprises:
and generating a safe driving plan based on the position relation and the motion relation, and performing assisted driving or automatic driving according to the safe driving plan.
9. An acoustic-based environmental perception system, the system comprising:
the loudspeaker array module is used for transmitting a first vehicle environment perception acoustic signal to the surrounding environment;
the microphone array module is used for receiving a first vehicle environment feedback acoustic signal fed back by at least one object in the surrounding environment, wherein the at least one object comprises a second vehicle;
the microphone array module is further configured to receive a second vehicle motion state acoustic signal emitted by the second vehicle, where the second vehicle motion state acoustic signal is used to characterize a motion state of the second vehicle;
and the data processing module is used for determining the position relation and the motion relation of the at least one object in the surrounding environment relative to the first vehicle based on the first vehicle environment feedback acoustic signal and the second vehicle motion state acoustic signal.
10. A vehicle comprising the acoustic-based environmental perception system of claim 9, the environmental perception system being configured for assisted driving or automatic driving.
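Claim 5 derives the position and motion of surrounding objects from feature differences (time, space, frequency, amplitude) between the emitted environment perception acoustic signal and the received feedback signal. The sketch below is a minimal, non-limiting illustration of two such computations, time-of-flight ranging and Doppler-based relative speed; the function names, the fixed speed of sound, and the 40 kHz example carrier are assumptions for illustration and are not taken from the patent.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 °C (illustrative constant)

def distance_from_echo(round_trip_s: float) -> float:
    """Range to a reflecting object from the time difference between
    emission and reception of its echo (time of flight, halved because
    the signal travels out and back)."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def relative_speed_from_doppler(f_emitted_hz: float, f_received_hz: float) -> float:
    """Closing speed of a reflecting object from the frequency difference
    between the emitted signal and its echo.  For a reflected signal the
    Doppler shift is doubled, so v ~= c * (f_rx - f_tx) / (2 * f_tx).
    A positive result means the object is approaching."""
    return SPEED_OF_SOUND * (f_received_hz - f_emitted_hz) / (2.0 * f_emitted_hz)

# A hypothetical 40 kHz ultrasonic pulse whose echo returns after
# 11.66 ms, shifted up to 40.2 kHz:
distance = distance_from_echo(0.01166)                 # about 2.0 m
speed = relative_speed_from_doppler(40000.0, 40200.0)  # about 0.86 m/s, approaching
```

A direction estimate (the spatial feature recited in claim 5) could analogously be obtained from inter-microphone arrival-time differences across a microphone array such as the one recited in claim 9.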
CN201910483907.0A 2018-07-10 2019-06-04 Acoustic-based environment sensing method and system Pending CN110706496A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2018107490207 2018-07-10
CN201810749020 2018-07-10

Publications (1)

Publication Number Publication Date
CN110706496A (en) 2020-01-17

Family

ID=69193045

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910483907.0A Pending CN110706496A (en) 2018-07-10 2019-06-04 Acoustic-based environment sensing method and system

Country Status (1)

Country Link
CN (1) CN110706496A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111413964A (en) * 2020-03-09 2020-07-14 上海理工大学 Method for detecting moving state of obstacle in real time by mobile robot in environment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1801660A (en) * 2004-10-19 2006-07-12 三洋电机株式会社 Communication device and distance calculation system
GB2463544A (en) * 2008-09-19 2010-03-24 Shih-Hsiung Li Vehicle reversing collision avoidance system
US20110196569A1 (en) * 2010-02-08 2011-08-11 Hon Hai Precision Industry Co., Ltd. Collision avoidance system and method
CN102737522A (en) * 2012-06-29 2012-10-17 惠州天缘电子有限公司 Active anti-collision method based on Internet of vehicles
CN103578115A (en) * 2012-07-31 2014-02-12 电装It研究所 Moving object recognition systems and moving object recognition methods
CN103703388A (en) * 2011-05-21 2014-04-02 大众汽车有限公司 Environment monitoring device in a motor vehicle and method for monitoring the environment using a correlation
CN106537175A (en) * 2014-07-09 2017-03-22 罗伯特·博世有限公司 Device and method for the acoustic examination of objects in the environment of a means of conveyance
CN207502722U (en) * 2017-12-14 2018-06-15 北京汽车集团有限公司 Vehicle and vehicle sensory perceptual system



Similar Documents

Publication Publication Date Title
US11231905B2 (en) Vehicle with external audio speaker and microphone
Thombre et al. Sensors and AI techniques for situational awareness in autonomous ships: A review
KR102382587B1 (en) Detection and response of sirens
CN107985225B (en) Method for providing sound tracking information, sound tracking apparatus and vehicle having the same
CN108226854B (en) Apparatus and method for providing visual information of rear vehicle
GB2560412A (en) Generating simulated sensor data for training and validation of detection models
US10970899B2 (en) Augmented reality display for a vehicle
US20100046326A1 (en) Method and apparatus for detection and classification of a swimming object
CN108725452B (en) Unmanned vehicle control system and control method based on full-audio perception
CN108352119B (en) Enhanced sound generation for quiet vehicles with vehicle-to-vehicle communication capability
US11477567B2 (en) Method and system for locating an acoustic source relative to a vehicle
Ortiz et al. Applications and services using vehicular exteroceptive sensors: A survey
US11812245B2 (en) Method, apparatus, and computer-readable storage medium for providing three-dimensional stereo sound
Vladyko et al. Method of early pedestrian warning in developing intelligent transportation system infrastructure
CN110706496A (en) Acoustic-based environment sensing method and system
US20230322212A1 (en) Processing data for driving automation system
Jin et al. Acoussist: An acoustic assisting tool for people with visual impairments to cross uncontrolled streets
Furletov et al. Auditory scene understanding for autonomous driving
CN111824159A (en) Vehicle control method, device, vehicle and computer readable storage medium
CN112519799A (en) Motor vehicle road auxiliary driving device and method
CN109061655B (en) Full-audio sensing system of intelligent driving vehicle and intelligent control method thereof
Choudhury et al. Review of Emergency Vehicle Detection Techniques by Acoustic Signals
US20220272448A1 (en) Enabling environmental sound recognition in intelligent vehicles
CN111213069B (en) Obstacle avoidance device and method based on coherent light
CN106772397B (en) Vehicle data processing method and vehicle radar system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200117
