CN112105953A - Obstacle detection method and device - Google Patents

Obstacle detection method and device

Info

Publication number
CN112105953A
Authority
CN
China
Prior art keywords
sensor
obstacle
echo
distance
echo signal
Prior art date
Legal status
Granted
Application number
CN201880093375.3A
Other languages
Chinese (zh)
Other versions
CN112105953B (en)
Inventor
郑佳
李维
吴祖光
周鹏
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Publication of CN112105953A
Application granted
Publication of CN112105953B
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00: Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/02: Systems using the reflection or reradiation of acoustic waves, using reflection of acoustic waves
    • G01S 15/06: Systems determining the position data of a target
    • G01S 15/08: Systems for measuring distance only
    • G01S 15/88: Sonar systems specially adapted for specific applications
    • G01S 15/93: Sonar systems specially adapted for specific applications, for anti-collision purposes
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/52: Details of systems according to group G01S 15/00

Abstract

A method and a device for detecting an obstacle relate to the technical field of ultrasonic detection and are used for improving the accuracy of obstacle boundary detection. The method comprises the following steps: receiving a first echo signal from the main sensor and a second echo signal from the at least one auxiliary sensor (S201), wherein the first echo signal is an echo signal received by the main sensor after an ultrasonic signal sent by the main sensor is reflected by an obstacle, and the second echo signal is an echo signal received by the at least one auxiliary sensor after the ultrasonic signal sent by the main sensor is reflected by the obstacle; determining a first distance between the primary sensor and the obstacle according to the first echo signal of the primary sensor, and determining a second distance between each secondary sensor and the obstacle according to the second echo signal of the secondary sensor (S202); the position of the obstacle is determined based on the first distance, the second distance, the position of the main sensor, and the detection angle of the main sensor (S203).

Description

Obstacle detection method and device
Technical Field
The application relates to the technical field of ultrasonic detection, in particular to a method and a device for detecting obstacles.
Background
With the continuous development of society, the number of automobiles is increasing day by day, and traffic accidents that occur while parking into a space are also increasing. People therefore place higher requirements on safety and ease of operation when parking, and hope for devices that can relieve the inconvenience that parking brings and eliminate unsafe factors in driving. As a result, many vehicle-mounted devices whose main function is parking assistance have come onto the market.
At present, most vehicle-mounted parking-assistance devices in the prior art use a single ultrasonic sensor that both transmits and receives (a self-send, self-receive mode) to detect a parking space, which leads to problems such as low detection angle resolution, large obstacle boundary detection error, and a high false-alarm rate.
Disclosure of Invention
The embodiment of the application provides an obstacle detection method and device, which are used for improving the accuracy of obstacle boundary detection.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
in a first aspect, an obstacle detection method is provided, the method comprising: receiving a first echo signal from a main sensor and a second echo signal from at least one auxiliary sensor, wherein the first echo signal is an echo signal received by the main sensor after an ultrasonic signal transmitted by the main sensor is reflected by an obstacle, the second echo signal is an echo signal received by the at least one auxiliary sensor after the ultrasonic signal transmitted by the main sensor is reflected by the obstacle, and a sensor array comprises the main sensor and the at least one auxiliary sensor; determining a first distance between the main sensor and the obstacle according to the first echo signal of the main sensor, and determining a second distance between each auxiliary sensor and the obstacle according to the second echo signal of that auxiliary sensor; and determining the position of the obstacle according to the first distance, the second distance, the position of the main sensor and the detection angle of the main sensor. In this technical solution, the distance between each sensor and the obstacle is determined in a transmit-once, receive-many manner (that is, the main sensor transmits an ultrasonic signal, and both the main sensor and the at least one auxiliary sensor receive the echo of that ultrasonic signal reflected by the obstacle), which avoids interference between adjacent sensors and reduces the detection error; at the same time, the position of the obstacle is determined by combining the distance between the main sensor and the obstacle, the position of the main sensor and the detection angle of the main sensor, which improves the angular resolution of detection and the accuracy of obstacle boundary detection.
In a possible implementation manner of the first aspect, the method further includes: determining the position of the parking space according to the positions of the obstacles on the two sides of the parking space. In this possible implementation, the accuracy of parking space detection can be improved regardless of whether the vehicle is travelling at a high or a low speed.
In one possible implementation manner of the first aspect, a spacing is provided between any two adjacent sensors in the sensor array. In this possible implementation, by setting the spacing between two adjacent sensors appropriately, ringing (residual vibration) interference of the ultrasonic signal can be effectively reduced or avoided, thereby reducing the detection error.
In one possible implementation manner of the first aspect, determining the position of the obstacle according to the first distance, the second distance, the position of the main sensor, and the detection angle of the main sensor includes: determining an echo boundary of the obstacle according to the first distance, the position of the main sensor and the detection angle of the main sensor; and correcting the echo boundary according to the second distance to obtain boundary truncation points at two ends of the echo boundary, wherein the echo boundary and the boundary truncation points are used for determining the position of the obstacle. In the possible implementation manner, the echo boundary of the obstacle determined by using the first distance is corrected by using the second distance, so that the detection angle resolution can be improved, and the accuracy of obstacle boundary detection can be further improved.
In one possible implementation manner of the first aspect, determining the echo boundary of the obstacle according to the first distance, the position of the main sensor, and the detection angle of the main sensor includes: and determining the formed echo arc line as the echo boundary of the obstacle by taking the position of the main sensor as the circle center, the detection angle of the main sensor as the sector radian and the first distance as the radius. In the above possible implementation, a simple and effective way of determining the echo boundary of the obstacle is provided.
In one possible implementation manner of the first aspect, before receiving the first echo signal from the primary sensor and the second echo signal from the at least one secondary sensor, the method further includes: sending control information to the sensor array, wherein the control information is used for instructing the main sensor to send the ultrasonic signals and instructing the main sensor and the at least one auxiliary sensor to receive echo signals of the ultrasonic signals reflected by the obstacles. In this possible implementation, the sensor array is controlled to operate in a transmit-once, receive-many mode, so that interference between adjacent sensors can be avoided and the detection error is reduced.
In a second aspect, an obstacle detection apparatus is provided, the apparatus comprising: a receiving unit, configured to receive a first echo signal from a main sensor and a second echo signal from at least one auxiliary sensor, wherein the first echo signal is an echo signal received by the main sensor after an ultrasonic signal transmitted by the main sensor is reflected by an obstacle, the second echo signal is an echo signal received by the at least one auxiliary sensor after the ultrasonic signal transmitted by the main sensor is reflected by the obstacle, and a sensor array comprises the main sensor and the at least one auxiliary sensor; a determining unit, configured to determine a first distance between the main sensor and the obstacle according to the first echo signal of the main sensor, and determine a second distance between each auxiliary sensor and the obstacle according to the second echo signal of that auxiliary sensor; the determining unit being further configured to determine the position of the obstacle according to the first distance, the second distance, the position of the main sensor and the detection angle of the main sensor.
In a possible implementation manner of the second aspect, the determining unit is further configured to: and determining the position of the parking space according to the positions of the obstacles on the two sides of the parking space.
In a possible implementation manner of the second aspect, the determining unit is further configured to: determining an echo boundary of the obstacle according to the first distance, the position of the main sensor and the detection angle of the main sensor; and correcting the echo boundary according to the second distance to obtain boundary truncation points at two ends of the echo boundary, wherein the echo boundary and the boundary truncation points are used for determining the position of the obstacle.
In a possible implementation manner of the second aspect, the determining unit is further configured to: and determining the formed echo arc line as the echo boundary of the obstacle by taking the position of the main sensor as the circle center, the detection angle of the main sensor as the sector radian and the first distance as the radius.
In a possible implementation manner of the second aspect, the apparatus further includes: and the transmitting unit is used for transmitting control information to the sensor array, wherein the control information is used for instructing the main sensor to transmit the ultrasonic signal and instructing the main sensor and the at least one auxiliary sensor to receive the echo signal of the ultrasonic signal reflected by the obstacle.
In a third aspect, an obstacle detection apparatus is provided, including a processor and a memory, where the memory stores code and data, and the processor executes the code in the memory to enable the apparatus to perform the obstacle detection method provided in the first aspect or any possible implementation manner of the first aspect.
In a fourth aspect, there is provided an in-vehicle apparatus including: a processor and a sensor array; wherein the sensor array comprises a main sensor and at least one auxiliary sensor, and the processor is the obstacle detection device provided by the second aspect or any possible implementation manner of the second aspect.
In a possible implementation manner of the fourth aspect, a space is provided between any two adjacent sensors in the sensor array.
In yet another aspect of the present application, a readable storage medium is provided, in which instructions are stored, which, when run on a device, cause the device to perform the obstacle detection method provided by the first aspect or any one of the possible implementations of the first aspect.
In yet another aspect of the present application, a computer program product is provided, which, when run on a computer, causes the computer to perform the obstacle detection method provided by the first aspect or any one of the possible implementations of the first aspect.
It is understood that any of the apparatuses, computer storage media, or computer program products provided above for the obstacle detection method is used to execute the corresponding method provided above; therefore, for the beneficial effects it can achieve, reference may be made to the beneficial effects of the corresponding method provided above, which are not repeated here.
Drawings
Fig. 1 is a schematic structural diagram of an on-board device according to an embodiment of the present disclosure;
fig. 2 is a first schematic flowchart of a method for detecting an obstacle according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a first distance and a second distance provided by an embodiment of the present application;
fig. 4 is a first schematic diagram of an echo boundary according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram of a boundary truncation point of an echo boundary according to an embodiment of the present disclosure;
fig. 6 is a second schematic diagram of an echo boundary according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a position of an obstacle according to an embodiment of the present disclosure;
fig. 8 is a second schematic flowchart of an obstacle detection method according to an embodiment of the present application;
fig. 9 is a schematic view of a parking space provided in the embodiment of the present application;
fig. 10 is a first schematic structural diagram of an obstacle detection apparatus according to an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of a second obstacle detection device according to an embodiment of the present application.
Detailed Description
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c or a-b-c, wherein a, b and c can be single or multiple. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. In addition, in the embodiments of the present application, the words "first", "second", and the like do not limit the number and the execution order.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to mean exemplary, illustrative, or descriptive. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
Fig. 1 is a schematic structural diagram of an on-board device provided in an embodiment of the present application. The on-board device is applicable to a vehicle and may include: a processor and at least one sensor array. The processor may be a central processing unit, a general-purpose processor, a digital signal processor, a microcontroller, a microprocessor, or the like. Further, the processor may also include other hardware circuits or accelerators, such as application-specific integrated circuits, field-programmable gate arrays or other programmable logic devices, transistor logic devices, hardware components, or any combination thereof, and may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination implementing computing functions, for example a combination of one or more microprocessors, or a combination of a digital signal processor and a microprocessor.
The at least one sensor array may include one or more sensor arrays (a plurality of sensor arrays may be connected in series over a local interconnect network (LIN) line), and the LIN line may be connected to the processor through a controller area network (CAN) bus, so that the processor can communicate with the at least one sensor array through the LIN line and the CAN bus. Each sensor array may include a plurality of sensors, one of which may be a primary sensor (for example, the sensor located in the middle of the sensor array may be the primary sensor), and the remaining sensors may be secondary sensors. The sensors may be Auto Park Assist (APA) sensors. Fig. 1 illustrates an example in which the at least one sensor array comprises two APA arrays, one on each side of the vehicle, each comprising 5 APA sensors.
Fig. 2 is a schematic flowchart of an obstacle detection method according to an embodiment of the present application, where the method is applicable to the vehicle-mounted device shown in fig. 1, and may be specifically executed by a processor in the vehicle-mounted device, and referring to fig. 2, the method includes the following steps.
S201: the processor receives a first echo signal from the primary sensor and a second echo signal from the at least one secondary sensor.
Wherein the sensor array may include a primary sensor and at least one secondary sensor, and the at least one secondary sensor may include one or more secondary sensors. The main sensor and the auxiliary sensor may have the same sensor parameters, for example, the detection angle and the detection distance of the main sensor and the auxiliary sensor are the same. The primary sensor may be located in the middle of the sensor array, such as the APA array shown in fig. 1 above, which includes 5 APA sensors, the sensor located in the middle may serve as the primary sensor, and the remaining 4 sensors may serve as the secondary sensors. Optionally, a distance is provided between any two adjacent sensors in the sensor array, and when the types, manufacturers, or sensor parameters of the sensors are different, the distance may also be different, for example, taking the APA array shown in fig. 1 as an example, the distance between two adjacent sensors may be 30 cm. In this application embodiment, the first echo signal is an echo signal received by the primary sensor after an ultrasonic signal sent by the primary sensor is reflected by an obstacle, the second echo signal is an echo signal received by the at least one secondary sensor after the ultrasonic signal sent by the primary sensor is reflected by the obstacle, and the sensor array includes the primary sensor and the at least one secondary sensor.
Further, as shown in fig. 2, before S201, the method may further include S200.
S200: The processor sends control information to the sensor array, where the control information is used to instruct the main sensor to send the ultrasonic signal and to instruct the main sensor and the at least one auxiliary sensor in the sensor array to receive the echo signal of the ultrasonic signal reflected by the obstacle.
Optionally, when the processor sends the control information to the sensor array, the control information may be implemented in several different ways, which are described in detail below.
In the first manner, the processor sends the control information corresponding to only one sensor at a time, and sends the control information corresponding to the different sensors in multiple transmissions. For example, the processor sends first control information to the main sensor, where the first control information is used to instruct the main sensor to send an ultrasonic signal and to receive the echo signal of the ultrasonic signal reflected by the obstacle; the processor sends second control information to each of the at least one auxiliary sensor, where the second control information corresponding to an auxiliary sensor is used to instruct that auxiliary sensor to receive the echo signal of the ultrasonic signal reflected by the obstacle.
In the second manner, the processor sends the control information corresponding to a plurality of sensors at one time. For example, the processor sends control information comprising a plurality of fields to the sensor array, where different fields may carry the control information corresponding to different sensors in the sensor array. For example, if the at least one auxiliary sensor includes 4 auxiliary sensors, the control information includes 5 fields: the 1st field instructs the main sensor to send an ultrasonic signal and receive the echo signal of the ultrasonic signal reflected by the obstacle, and the 2nd to 5th fields respectively instruct the 4 auxiliary sensors to receive the echo signals of the ultrasonic signal reflected by the obstacle.
In the third manner, the processor combines the first and second manners. For example, the processor sends first control information to the main sensor, where the first control information is used to instruct the main sensor to send an ultrasonic signal and to receive the echo signal of the ultrasonic signal after it is reflected by the obstacle, and the processor sends second control information comprising a plurality of fields to the at least one auxiliary sensor, where different fields may carry the control information corresponding to different auxiliary sensors.
Specifically, the processor sends the control information to the sensor array, and when the sensor array receives the control information, the main sensor can send an ultrasonic signal according to the control information corresponding to the main sensor in the control information and receive an echo signal of the ultrasonic signal reflected by the obstacle; each auxiliary sensor can receive the echo signal of the ultrasonic signal reflected by the obstacle according to the control information corresponding to each auxiliary sensor in the control information. The primary sensor may then transmit the received echo signals to the processor, and each secondary sensor may also transmit the respective received echo signals to the processor.
Optionally, the control information may also be used to indicate the frequency at which the main sensor transmits the ultrasonic signal; for example, the control information may instruct the main sensor to transmit the ultrasonic signal at a certain fixed frequency. Correspondingly, when sending the ultrasonic signal, the main sensor may send it at the frequency indicated by the control information; meanwhile, the main sensor and the auxiliary sensors may also receive the echo signals of the ultrasonic signal reflected by the obstacle at a certain frequency and send their respectively received echo signals to the processor.
Correspondingly, after the processor sends the control information to the sensor array, the processor may receive the first echo signal from the main sensor and the second echo signal from the at least one auxiliary sensor after a fixed delay. Optionally, when the main sensor and the auxiliary sensors receive and forward the echo signals of the ultrasonic signal reflected by the obstacle at a certain frequency, the processor may also receive the echo signals from the respective sensors at that frequency; that is, the processor may receive multiple frames of echo signals (i.e. multiple first echo signals) from the main sensor and multiple frames of echo signals (i.e. multiple second echo signals) from each auxiliary sensor within a period of time.
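The transmit-once, receive-many scheduling described above can be pictured with a short sketch. This is purely illustrative and not the patent's actual message format: the command codes, the ControlFrame layout and the choice of the middle sensor as the main sensor are assumptions made for the example.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical per-sensor command codes: the main sensor transmits and
# receives, the auxiliary sensors only receive the main sensor's echo.
CMD_TX_AND_RX = 0x03
CMD_RX_ONLY = 0x01

@dataclass
class ControlFrame:
    """One control message carrying one field per sensor in the array."""
    fields: List[int]

def build_control_frame(num_sensors: int, main_index: int) -> ControlFrame:
    """Field main_index instructs the main sensor to transmit the ultrasonic
    signal and receive its echo; every other field instructs an auxiliary
    sensor to receive the echo only."""
    fields = [CMD_RX_ONLY] * num_sensors
    fields[main_index] = CMD_TX_AND_RX
    return ControlFrame(fields)

# Example: a 5-sensor APA array with the middle sensor (index 2) as main sensor.
frame = build_control_frame(num_sensors=5, main_index=2)
print(frame.fields)  # [1, 1, 3, 1, 1]
```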
S202: the processor determines a first distance between the primary sensor and the obstacle from the first echo signal of the primary sensor and a second distance between each secondary sensor and the obstacle from the second echo signal of the secondary sensor.
When the at least one auxiliary sensor comprises a single auxiliary sensor, one second distance is determined according to the second echo signal of that auxiliary sensor; when the at least one auxiliary sensor comprises a plurality of auxiliary sensors, a plurality of second distances are determined according to the second echo signals of the plurality of auxiliary sensors, that is, each auxiliary sensor corresponds to one second distance. Since the first echo signal of the main sensor and the second echo signal of the at least one auxiliary sensor are both echoes of the ultrasonic signal transmitted by the main sensor and reflected by the obstacle, the time for the ultrasonic signal to reach the obstacle can be considered the same for the main sensor and the at least one auxiliary sensor (that is, the time for the ultrasonic signal to reach the echo point of each sensor on the obstacle is considered the same, where the echo point refers to the starting point of the echo signal), whereas the time for the echo signal to return from the obstacle to each sensor varies with the position of that sensor (that is, the distance between each sensor and the obstacle differs).
For example, take the 5 APA sensors shown in fig. 1, numbered 0-4, where 0 denotes the main sensor and 1-4 denote the 4 auxiliary sensors. The outbound and return travel times corresponding to each sensor may be as shown in Table 1 below, and the distance from each sensor to the obstacle may be determined by R = C × t, where C is the propagation rate of the ultrasonic signal and t is the return travel time corresponding to that sensor. In Table 1, T_0, T_1, T_2, T_3 and T_4 respectively denote the sum of the outbound time and the return time (i.e. the total round-trip time) for the sensors numbered 0-4. The first distance between the main sensor numbered 0 and the obstacle may thus be expressed as R_0 = C × (T_0/2), and the second distances between the auxiliary sensors numbered 1-4 and the obstacle may be expressed as R_1 = C × (T_1 - T_0/2), R_2 = C × (T_2 - T_0/2), R_3 = C × (T_3 - T_0/2) and R_4 = C × (T_4 - T_0/2), respectively. The corresponding first distance R_0 and the four second distances R_1, R_2, R_3 and R_4 are shown in fig. 3.
TABLE 1
[Table 1, listing the outbound, return and total round-trip times T_0 to T_4 for the sensors numbered 0-4, appears in the original publication only as image PCTCN2018117505-APPB-000001 and is not reproduced here.]
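A minimal sketch of the distance computation described above, assuming the relationships R_0 = C × (T_0/2) and R_j = C × (T_j - T_0/2) stated in the text; the speed of sound and the sample round-trip times are illustrative values, not data from the patent.

```python
SPEED_OF_SOUND = 343.0  # m/s, propagation rate C of the ultrasonic signal in air


def distances_from_round_trip_times(times):
    """times[0] is T0, the main sensor's total round-trip time; times[j]
    (j >= 1) is Tj, the time from the main sensor's transmission until
    auxiliary sensor j receives the echo. Returns [R0, R1, ..., RN]."""
    t0 = times[0]
    r0 = SPEED_OF_SOUND * t0 / 2.0                    # R0 = C * (T0 / 2)
    return [r0] + [SPEED_OF_SOUND * (tj - t0 / 2.0)   # Rj = C * (Tj - T0 / 2)
                   for tj in times[1:]]


# Hypothetical round-trip times (seconds) for the 5-sensor array of Fig. 1.
sample_times = [0.0060, 0.0062, 0.0065, 0.0063, 0.0066]
print(distances_from_round_trip_times(sample_times))
```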
Further, when the processor receives multiple frames of echo signals from the main sensor and multiple frames of echo signals from each auxiliary sensor within a period of time, the processor may determine, in the above manner, a plurality of first distances and a plurality of second distances corresponding to each auxiliary sensor within that period of time.
S203: the processor determines the position of the obstacle based on the first distance, the second distance, the position of the primary sensor, and the detection angle of the primary sensor.
Wherein the processor may determine an echo boundary of the obstacle according to the first distance, the position of the main sensor, and the detection angle of the main sensor; and correcting the echo boundary according to the second distance to obtain boundary truncation points at two ends of the echo boundary, wherein the echo boundary and the boundary truncation points are used for determining the position of the obstacle.
Alternatively, the processor may establish a grid coordinate system, whose origin may be the center point of the vehicle on which the in-vehicle device is mounted, and the position of the main sensor may include the coordinate position of the main sensor in the grid coordinate system and the elevation angle of the main sensor. Specifically, as shown in fig. 4, when determining the echo boundary of the obstacle, the processor may take the position of the main sensor as the center of a circle, the detection angle of the main sensor as the sector arc, and the first distance as the radius, with the elevation angle of the main sensor determining the orientation of the sector arc; the echo arc formed in this way may be determined as the echo boundary of the obstacle. In fig. 4, the coordinate position of the main sensor in the grid coordinate system is denoted (X_0, Y_0) by way of example.
Illustratively, the echo boundary may be represented by the points (x_i, y_i) given by formula (1), in which R_0 denotes the first distance, (X_0, Y_0) denotes the coordinate position of the main sensor, η denotes the resolution of the grid coordinate system, and the detection angle of the main sensor and a rounding-down (floor) operation also appear. [Formula (1) and the symbols for the detection angle and the rounding-down operation appear in the original publication only as images (PCTCN2018117505-APPB-000002 through PCTCN2018117505-APPB-000004) and are not reproduced here.]
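Because formula (1) is only available as an image, the following sketch is one plausible reading of the construction described above (an arc of radius R_0 centred on the main sensor position, spanning the detection angle about the sensor's elevation angle, with coordinates snapped down to a grid of resolution η); the function name, the sampling density and the orientation convention are assumptions, not the patent's exact formula.

```python
import math


def echo_boundary(x0, y0, r0, mount_angle, detection_angle, eta, samples=64):
    """Sample the echo arc: centre (x0, y0) is the main sensor position,
    radius r0 is the first distance, the sector spans detection_angle around
    the sensor's mounting (elevation) angle, and every point is snapped to a
    grid of resolution eta with a floor operation. Angles are in radians."""
    points = []
    start = mount_angle - detection_angle / 2.0
    for k in range(samples + 1):
        theta = start + detection_angle * k / samples
        xi = x0 + r0 * math.cos(theta)
        yi = y0 + r0 * math.sin(theta)
        points.append((math.floor(xi / eta) * eta, math.floor(yi / eta) * eta))
    return points


# Example: sensor at (0.5, 1.0) m facing +y, 60 degree detection angle,
# first distance 1.2 m, 5 cm grid resolution.
arc = echo_boundary(0.5, 1.0, 1.2, math.pi / 2, math.radians(60), 0.05)
```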
When the echo boundary is corrected according to the second distance, suppose that the position of each auxiliary sensor in the at least one auxiliary sensor is denoted (X_j, Y_j) and the second distance corresponding to each auxiliary sensor is denoted R_j, where j ranges from 1 to N and N is the number of auxiliary sensors. The echo boundary can then be corrected according to formulas (2) and (3), and the boundary truncation points at the two ends of the echo boundary can be denoted (x_L, y_L) and (x_R, y_R) respectively, where (x_L, y_L) denotes the left truncation point and (x_R, y_R) denotes the right truncation point. In formula (2), (x_jh, y_jh) denotes the point (x_i, y_i) at which formula (2) attains its minimum value, and the step size is the same as the step in formula (1) above. [Formulas (2) and (3) appear in the original publication only as images (PCTCN2018117505-APPB-000005 and PCTCN2018117505-APPB-000006) and are not reproduced here.]
For example, with reference to the schematic diagram of the echo boundary shown in fig. 4, suppose the boundary truncation points (x_L, y_L) and (x_R, y_R) at the two ends of the echo boundary obtained according to formulas (2) and (3) are as shown in fig. 5. The section of the echo boundary located between the truncation points (x_L, y_L) and (x_R, y_R) is used to determine the position of the obstacle, i.e. the arc denoted L-R in fig. 5 is used to determine the position of the obstacle.
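Formulas (2) and (3) are likewise only available as images, so the sketch below is a hedged interpretation rather than the patent's exact correction: for each auxiliary sensor, pick the arc point whose distance to that sensor best matches the measured second distance R_j, and take the leftmost and rightmost of those points as the truncation points (x_L, y_L) and (x_R, y_R).

```python
import math


def truncation_points(arc_points, aux_positions, aux_distances):
    """arc_points: (x, y) samples of the echo boundary from formula (1).
    aux_positions: [(Xj, Yj)] position of each auxiliary sensor.
    aux_distances: [Rj] second distance measured by each auxiliary sensor.
    Returns ((xL, yL), (xR, yR)) under the interpretation described above."""
    matched = []
    for (xj, yj), rj in zip(aux_positions, aux_distances):
        # Arc point most consistent with this auxiliary sensor's measurement.
        best = min(arc_points,
                   key=lambda p: abs(math.hypot(p[0] - xj, p[1] - yj) - rj))
        matched.append(best)
    left = min(matched, key=lambda p: p[0])   # left truncation point (xL, yL)
    right = max(matched, key=lambda p: p[0])  # right truncation point (xR, yR)
    return left, right
```

Under this reading, the section of the arc lying between the two returned points would then be kept as the obstacle boundary, as with the L-R arc in fig. 5.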
Further, the processor may determine the position of the obstacle in the above manner using the plurality of first distances determined over a period of time and the plurality of second distances corresponding to each auxiliary sensor. For example, the processor determines, in the manner described above, an echo boundary corresponding to each of the plurality of first distances; the resulting echo boundaries may be as shown in fig. 6. For each echo boundary, the boundary truncation points are determined as described above, and the boundary of the obstacle determined by the echo boundaries and their corresponding truncation points may be as shown in fig. 7.
Further, referring to fig. 8, after S203, the method may further include S204.
S204: the processor determines the position of the parking space according to the positions of the obstacles on the two sides of the parking space.
When the processor detects the parking space, the processor can determine the position of the parking space according to the positions of the obstacles on the two sides of the parking space. For example, if the positions of two obstacles determined by the processor are as shown in fig. 9, that is, the processor determines the positions of the obstacles (i.e., vehicles) on two adjacent sides of a parking space, and the distance between the two obstacles is greater than or equal to the width of the parking space, the processor may determine that the position between the two obstacles is the position of the parking space.
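A minimal sketch of the gap check described above, assuming the obstacle positions have already been reduced to extents along the driving direction; the 5.5 m required width is an illustrative value, not a figure from the patent.

```python
def parking_space_between(obstacle_a, obstacle_b, space_width):
    """obstacle_a, obstacle_b: (start, end) extents of the two obstacles along
    the driving direction, in metres. Returns the (start, end) of the candidate
    parking space if the gap is at least space_width, otherwise None."""
    left, right = sorted([obstacle_a, obstacle_b])
    gap_start, gap_end = left[1], right[0]
    if gap_end - gap_start >= space_width:
        return gap_start, gap_end
    return None


# Example: two parked vehicles leaving a 6 m gap; required width 5.5 m.
print(parking_space_between((0.0, 4.5), (10.5, 15.0), 5.5))  # (4.5, 10.5)
```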
In the embodiment of the application, the processor performs signal detection by controlling the sensor array in a one-to-many mode (that is, the main sensor sends an ultrasonic signal, and the main sensor and the at least one auxiliary sensor receive an echo signal of the ultrasonic signal reflected by an obstacle), and determines the distance between each sensor and the obstacle, so that interference between adjacent sensors is avoided, detection errors are reduced, and meanwhile, the position of the obstacle is obtained by correcting the echo boundary of the obstacle determined by the first distance by using the second distance, so that the detection angle resolution and the obstacle boundary detection accuracy are improved. In addition, the positions of the obstacles on the two sides of the parking space can be accurately and effectively determined, and the accuracy of parking space detection is improved.
The obstacle detection method provided by the embodiments of the present application has been described above mainly from the perspective of the vehicle-mounted device. It is understood that, in order to realize the above functions, the vehicle-mounted device includes corresponding hardware structures and/or software modules for performing the respective functions. Those skilled in the art will readily appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented by hardware or by a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the functional modules of the obstacle detection device may be divided according to the above method, for example, each functional module may be divided according to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation.
In the case of dividing each functional module according to each function, fig. 10 shows a schematic structural diagram of an obstacle detection apparatus according to the above embodiment, where the apparatus may be a chip or a processing device, and includes: a receiving unit 1001 and a determining unit 1002. Wherein, the receiving unit 1001 is configured to support the apparatus to execute S201 in the foregoing method embodiment; the determining unit 1002 is configured to enable the apparatus to perform S202, S203 in the above-described method embodiment, and/or other processes for the techniques described herein. Optionally, the determining unit 1002 is further configured to support the apparatus to execute S204 in the foregoing method embodiment. Further, the apparatus further comprises: a transmission unit 1003; wherein, the sending unit 1003 is configured to support the apparatus to execute S200 in the foregoing method embodiment. All relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
Based on the hardware implementation, the determining unit 1002 may be a processor, the transmitting unit 1003 may be a transmitter, the receiving unit 1001 may be a receiver, and the receiver and the transmitter may be integrated into a transceiver, which may also be referred to as a communication interface.
Fig. 11 is a schematic structural diagram of another obstacle detection apparatus according to an embodiment of the present application, where the apparatus may be a chip or a processing device, and the apparatus includes: a memory 1101 and a processor 1102. Wherein the memory 1101 is used for storing program codes and data of the apparatus, and the processor 1102 is used for controlling and managing the actions of the apparatus shown in fig. 11, for example, the processor 1102 is used for supporting the apparatus to execute the above-mentioned S202-S204 in the method embodiment, and/or other processes for the technology described herein. Optionally, the apparatus shown in fig. 11 may further include a communication interface 1103, where the communication interface 1103 is configured to support the apparatus to perform S200 and S201 in the foregoing method embodiment.
The processor 1102 may be a central processing unit, a general purpose processor, a digital signal processor, an application specific integrated circuit, a processing chip, a field programmable gate array or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. Which may implement or perform various ones of the logic blocks, modules, and circuits described in connection with the disclosure of the embodiments of the application. The processor 1102 may also be a combination of computing functions, e.g., comprising one or more microprocessors, a digital signal processor and a microprocessor, or the like. The communication interface 1103 may be a transceiver, a transceiving circuit, a transceiving interface, or the like. The memory 1101 may be a volatile memory or a nonvolatile memory, or the like.
For example, the communication interface 1103, the processor 1102, and the memory 1101 are connected to each other by a bus 1104; the bus 1104 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus 1104 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 11, but this is not intended to represent only one bus or type of bus. Optionally, the memory 1101 may be included in the processor 1102.
The embodiment of the present application further provides an on-board device, the structure of which can be seen from fig. 1, and the device includes: a processor and a sensor array; wherein the sensor array comprises a primary sensor and at least one secondary sensor, and the processor may be the obstacle detection device provided in any one of fig. 10 or fig. 11 above. Optionally, a space is provided between any two adjacent sensors in the sensor array.
In the embodiment of the application, the processor performs signal detection by controlling the sensor array in a one-to-many mode (that is, the main sensor sends an ultrasonic signal, and the main sensor and the at least one auxiliary sensor receive an echo signal of the ultrasonic signal reflected by an obstacle), and determines the distance between each sensor and the obstacle, so that interference between adjacent sensors is avoided, detection errors are reduced, and meanwhile, the position of the obstacle is obtained by correcting the echo boundary of the obstacle determined by the first distance by using the second distance, so that the detection angle resolution and the obstacle boundary detection accuracy are improved.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a terminal to execute all or part of the steps of the methods described in the embodiments of the present application, or all or part of the technical solutions. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
Finally, it should be noted that: the above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (16)

  1. An obstacle detection method, characterized in that the method comprises:
    receiving a first echo signal from a main sensor and a second echo signal from at least one auxiliary sensor, wherein the first echo signal is an echo signal received by the main sensor after an ultrasonic signal sent by the main sensor is reflected by an obstacle, the second echo signal is an echo signal received by the at least one auxiliary sensor after the ultrasonic signal sent by the main sensor is reflected by the obstacle, and a sensor array comprises the main sensor and the at least one auxiliary sensor;
    determining a first distance between the primary sensor and the obstacle from the first echo signal of the primary sensor, and determining a second distance between the secondary sensor and the obstacle from the second echo signal of each secondary sensor;
    and determining the position of the obstacle according to the first distance, the second distance, the position of the main sensor and the detection angle of the main sensor.
  2. The method of claim 1, further comprising:
    and determining the position of the parking space according to the positions of the obstacles on the two sides of the parking space.
  3. The method of claim 1, wherein a spacing is provided between any two adjacent sensors in the sensor array.
  4. The method according to any one of claims 1-3, wherein said determining the position of the obstacle based on the first distance, the second distance, the position of the primary sensor, and the detection angle of the primary sensor comprises:
    determining an echo boundary of the obstacle according to the first distance, the position of the main sensor and the detection angle of the main sensor;
    and correcting the echo boundary according to the second distance to obtain boundary truncation points at two ends of the echo boundary, wherein the echo boundary and the boundary truncation points are used for determining the position of the obstacle.
  5. The method of claim 4, wherein determining the echo boundary of the obstacle based on the first distance, the position of the primary sensor, and the detection angle of the primary sensor comprises:
    and determining the formed echo arc line as the echo boundary of the obstacle by taking the position of the main sensor as the circle center, the detection angle of the main sensor as the sector radian and the first distance as the radius.
  6. The method of any of claims 1-5, wherein prior to said receiving the first echo signal from the primary sensor and the second echo signal from the at least one secondary sensor, the method further comprises:
    and sending control information to the sensor array, wherein the control information is used for instructing the main sensor to send the ultrasonic signal and instructing the main sensor and the at least one auxiliary sensor to receive an echo signal of the ultrasonic signal reflected by the obstacle.
  7. An obstacle detection apparatus, characterized in that the apparatus comprises:
    a receiving unit, configured to receive a first echo signal from a primary sensor and a second echo signal from at least one secondary sensor, where the first echo signal is an echo signal received by the primary sensor after an ultrasonic signal sent by the primary sensor is reflected by an obstacle, the second echo signal is an echo signal received by the at least one secondary sensor after the ultrasonic signal sent by the primary sensor is reflected by the obstacle, and a sensor array includes the primary sensor and the at least one secondary sensor;
    a determining unit, configured to determine a first distance between the primary sensor and the obstacle according to the first echo signal of the primary sensor, and determine a second distance between the secondary sensor and the obstacle according to the second echo signal of each secondary sensor;
    the determining unit is further configured to determine the position of the obstacle according to the first distance, the second distance, the position of the main sensor, and the detection angle of the main sensor.
  8. The apparatus of claim 7, wherein the determining unit is further configured to:
    and determining the position of the parking space according to the positions of the obstacles on the two sides of the parking space.
  9. The apparatus according to claim 7 or 8, wherein the determining unit is further configured to:
    determining an echo boundary of the obstacle according to the first distance, the position of the main sensor and the detection angle of the main sensor;
    and correcting the echo boundary according to the second distance to obtain boundary truncation points at two ends of the echo boundary, wherein the echo boundary and the boundary truncation points are used for determining the position of the obstacle.
  10. The apparatus of claim 9, wherein the determining unit is further configured to:
    and determining the formed echo arc line as the echo boundary of the obstacle by taking the position of the main sensor as the circle center, the detection angle of the main sensor as the sector radian and the first distance as the radius.
  11. The apparatus according to any one of claims 7-10, wherein the apparatus further comprises:
    and the transmitting unit is used for transmitting control information to the sensor array, wherein the control information is used for indicating the main sensor to transmit an ultrasonic signal and indicating the main sensor and the at least one auxiliary sensor to receive an echo signal of the ultrasonic signal reflected by an obstacle.
  12. An obstacle detection apparatus, comprising a processor and a memory, the memory storing code and data, the processor executing the code in the memory to cause the apparatus to perform the obstacle detection method of any one of claims 1 to 6.
  13. An in-vehicle apparatus, characterized in that the apparatus comprises: a processor and a sensor array; wherein the sensor array comprises a primary sensor and at least one secondary sensor, and the processor is an obstacle detection device as claimed in any one of claims 7 to 11.
  14. The vehicle-mounted device according to claim 13, wherein a space is provided between any adjacent two sensors in the sensor array.
  15. A readable storage medium having stored therein instructions which, when run on a device, cause the device to perform the obstacle detection method of any one of claims 1-6.
  16. A computer program product, characterized in that it causes a computer to carry out the obstacle detection method according to any one of claims 1-6, when said computer program product is run on the computer.
CN201880093375.3A 2018-11-26 2018-11-26 Obstacle detection method and device Active CN112105953B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/117505 WO2020107176A1 (en) 2018-11-26 2018-11-26 Obstacle detection method and device

Publications (2)

Publication Number Publication Date
CN112105953A (en) 2020-12-18
CN112105953B CN112105953B (en) 2022-12-06

Family

ID=70853344

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880093375.3A Active CN112105953B (en) 2018-11-26 2018-11-26 Obstacle detection method and device

Country Status (2)

Country Link
CN (1) CN112105953B (en)
WO (1) WO2020107176A1 (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1892249A (en) * 2005-07-05 2007-01-10 李世雄 Backing radar device without control box
JP2009264872A (en) * 2008-04-24 2009-11-12 Nippon Soken Inc Object detecting device
EP2293102A1 (en) * 2009-08-28 2011-03-09 Robert Bosch GmbH Method and device for determining the position of an obstacle relative to a vehicle, in particular a motor vehicle, for use in a driver assistance system of the vehicle
CN101887125A (en) * 2010-06-24 2010-11-17 浙江海康集团有限公司 Reverse sensor range positioning method and system for assisting parking
CN102141620A (en) * 2011-01-06 2011-08-03 同致电子科技(厦门)有限公司 Method for controlling bus of hostless parking radar system and detecting obstruction
CN103969649A (en) * 2014-04-23 2014-08-06 奇瑞汽车股份有限公司 Backing up distance measurement method, device and system
CN104502916A (en) * 2014-12-25 2015-04-08 苏州智华汽车电子有限公司 Radar car backing system and method
CN105242276A (en) * 2015-09-15 2016-01-13 清华大学苏州汽车研究院(吴江) Ultrasonic sensor-based parking assisting system
CN107957583A (en) * 2017-11-29 2018-04-24 江苏若博机器人科技有限公司 A kind of round-the-clock quick unmanned vehicle detection obstacle avoidance system of Multi-sensor Fusion
CN108116406A (en) * 2017-12-29 2018-06-05 广州优保爱驾科技有限公司 A kind of general automated parking system and method for popping one's head in

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114710863A (en) * 2022-03-29 2022-07-05 惠州莫思特智照科技有限公司 Intelligent induction control method and device
CN114710863B (en) * 2022-03-29 2023-09-05 惠州莫思特智照科技有限公司 Intelligent induction control method and device
CN116400362A (en) * 2023-06-08 2023-07-07 广汽埃安新能源汽车股份有限公司 Driving boundary detection method, device, storage medium and equipment
CN116400362B (en) * 2023-06-08 2023-08-08 广汽埃安新能源汽车股份有限公司 Driving boundary detection method, device, storage medium and equipment

Also Published As

Publication number Publication date
CN112105953B (en) 2022-12-06
WO2020107176A1 (en) 2020-06-04

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant