CN110765970A - Method and device for determining nearest obstacle, storage medium and electronic equipment - Google Patents

Method and device for determining nearest obstacle, storage medium and electronic equipment

Info

Publication number
CN110765970A
Authority
CN
China
Prior art keywords
image
score
determining
frame
nearest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911050962.7A
Other languages
Chinese (zh)
Other versions
CN110765970B (en)
Inventor
周珅宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Horizon Robotics Technology Research and Development Co Ltd
Original Assignee
Beijing Horizon Robotics Technology Research and Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Horizon Robotics Technology Research and Development Co Ltd filed Critical Beijing Horizon Robotics Technology Research and Development Co Ltd
Priority to CN201911050962.7A priority Critical patent/CN110765970B/en
Publication of CN110765970A publication Critical patent/CN110765970A/en
Application granted granted Critical
Publication of CN110765970B publication Critical patent/CN110765970B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • G06F16/2458Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F16/2462Approximate or statistical queries
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06393Score-carding, benchmarking or key performance indicator [KPI] analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Data Mining & Analysis (AREA)
  • Fuzzy Systems (AREA)
  • Databases & Information Systems (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Game Theory and Decision Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiments of the disclosure disclose a method and a device for determining a nearest obstacle, a storage medium and an electronic device, wherein the method comprises the following steps: acquiring n consecutive frames of images comprising an obstacle in front of a vehicle as a first image frame sequence, and determining a first nearest obstacle in each frame of image in the first image frame sequence, wherein the first image frame sequence comprises a current frame image and the n-1 frame images before the current frame image; determining a statistical score of each first nearest obstacle in the first image frame sequence according to a score corresponding to each frame of image in the first image frame sequence; and determining a second nearest obstacle corresponding to the current frame image based on the statistical score of each first nearest obstacle. In this way the stability of the CIPV target output is ensured, disturbances in CIPV selection can be accommodated, and the accuracy of the final CIPV target output of the system is ensured.

Description

Method and device for determining nearest obstacle, storage medium and electronic equipment
Technical Field
The present disclosure relates to a driving assistance technology, and in particular, to a method and an apparatus for determining a nearest obstacle, a storage medium, and an electronic device.
Background
In an advanced driver assistance system based on visual perception, whether the target output of the closest in-path vehicle (CIPV), i.e. the nearest obstacle on the path of the own vehicle, is stable has a great influence on system performance. Especially in forward collision warning (FCW) applications, if the CIPV target is abnormally switched or lost because of jitter of the predicted trajectory of the vehicle, jumps of the lane lines or of the target sensing result, and the like, the FCW cannot give correct alarm information. In addition, frequent abnormal switching and loss of the CIPV target also degrade the user experience. The stability of the CIPV target output therefore directly determines the performance of the driving assistance system.
Disclosure of Invention
The present disclosure is proposed to solve the above technical problems. The embodiment of the disclosure provides a method and a device for determining a nearest obstacle, a storage medium and an electronic device.
According to an aspect of an embodiment of the present disclosure, there is provided a method of determining a nearest obstacle, including:
acquiring continuous n frames of images comprising a front obstacle in front of a vehicle as a first image frame sequence, and determining a first nearest obstacle in each frame of image in the first image frame sequence; wherein the first image frame sequence comprises a current frame image and n-1 frame images before the current frame image;
determining a statistical score of each first nearest obstacle in the first image frame sequence according to a score corresponding to each frame of image in the first image frame sequence;
and determining a second nearest obstacle corresponding to the current frame image based on the statistical score of each first nearest obstacle.
According to another aspect of the embodiments of the present disclosure, there is provided a nearest obstacle determination apparatus including:
the obstacle determining module is used for acquiring continuous n frames of images comprising a front obstacle in front of the vehicle as a first image frame sequence and determining a first nearest obstacle in each frame of image in the first image frame sequence; wherein the first image frame sequence comprises a current frame image and n-1 frame images before the current frame;
the score determining module is used for determining the statistical score of each first nearest obstacle in the first image frame sequence according to the score corresponding to each frame of image in the first image frame sequence;
and the updating determination module is used for determining a second nearest obstacle corresponding to the current frame image based on the statistical score of each first nearest obstacle determined by the score determination module.
According to still another aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium storing a computer program for executing the method for determining a nearest obstacle according to the above-described embodiments.
According to still another aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing the processor-executable instructions;
the processor is configured to read the executable instructions from the memory and execute the instructions to implement the method for determining the nearest obstacle according to the above embodiment.
With the method and the device for determining a nearest obstacle, the storage medium and the electronic device provided by the above embodiments of the present disclosure, n consecutive frames of images including an obstacle in front of the vehicle are acquired as a first image frame sequence, and the first nearest obstacle in each frame of image in the first image frame sequence is determined, wherein the first image frame sequence comprises a current frame image and the n-1 frame images before the current frame image; a statistical score of each first nearest obstacle in the first image frame sequence is determined according to the score corresponding to each frame of image in the first image frame sequence; and the second nearest obstacle corresponding to the current frame image is determined based on the statistical score of each first nearest obstacle. Since the nearest obstacle in the current frame image is determined according to the scores of different obstacles in the n consecutive frames of images, a history-information-based scoring scheme is realized: the stability of the CIPV target output is ensured, disturbances in CIPV selection can be accommodated, and the accuracy of the final CIPV target output of the system is ensured.
The technical solution of the present disclosure is further described in detail by the accompanying drawings and examples.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in more detail embodiments of the present disclosure with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the principles of the disclosure and not to limit the disclosure. In the drawings, like reference numbers generally represent like parts or steps.
Fig. 1 is a flowchart illustrating a method for determining a nearest obstacle according to an exemplary embodiment of the present disclosure.
Fig. 2 is a schematic view of the sliding window according to the present embodiment.
Fig. 3 is a graph illustrating the time-dependent variation of the credibility score provided in the present embodiment.
Fig. 4 is a flowchart illustrating a method for determining a nearest obstacle according to another exemplary embodiment of the present disclosure.
Fig. 5 is a flowchart illustrating a method for determining a nearest obstacle according to another exemplary embodiment of the present disclosure.
Fig. 6 is a schematic flow chart of step 402 in the embodiment shown in fig. 4 of the present disclosure.
Fig. 7 is a schematic flow chart of step 403 in the embodiment shown in fig. 4 of the present disclosure.
Fig. 8a is a flowchart illustrating a method for determining a nearest obstacle according to still another exemplary embodiment of the present disclosure.
Fig. 8b is a flowchart illustrating a method for determining a nearest obstacle according to still another exemplary embodiment of the present disclosure.
Fig. 9 is a schematic structural diagram of a device for determining a nearest obstacle according to an exemplary embodiment of the present disclosure.
Fig. 10 is a schematic structural diagram of a nearest obstacle determination device according to another exemplary embodiment of the present disclosure.
Fig. 11 is a block diagram of an electronic device provided in an exemplary embodiment of the present disclosure.
Detailed Description
Hereinafter, example embodiments according to the present disclosure will be described in detail with reference to the accompanying drawings. It is to be understood that the described embodiments are merely a subset of the embodiments of the present disclosure and not all embodiments of the present disclosure, with the understanding that the present disclosure is not limited to the example embodiments described herein.
It should be noted that: the relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
It will be understood by those of skill in the art that the terms "first," "second," and the like in the embodiments of the present disclosure are used merely to distinguish one element from another, and are not intended to imply any particular technical meaning, nor is the necessary logical order between them.
It is also understood that in embodiments of the present disclosure, "a plurality" may refer to two or more and "at least one" may refer to one, two or more.
It is also to be understood that any reference to any component, data, or structure in the embodiments of the disclosure, may be generally understood as one or more, unless explicitly defined otherwise or stated otherwise.
In addition, the term "and/or" in the present disclosure describes only an association relationship between associated objects and means that three kinds of relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" in the present disclosure generally indicates that the former and latter associated objects are in an "or" relationship.
It should also be understood that the description of the various embodiments of the present disclosure emphasizes the differences between the various embodiments, and the same or similar parts may be referred to each other, so that the descriptions thereof are omitted for brevity.
Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
The disclosed embodiments may be applied to electronic devices such as terminal devices, computer systems, servers, etc., which are operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known terminal devices, computing systems, environments, and/or configurations that may be suitable for use with electronic devices, such as terminal devices, computer systems, servers, and the like, include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, networked personal computers, minicomputer systems, mainframe computer systems, distributed cloud computing environments that include any of the above, and the like.
Electronic devices such as terminal devices, computer systems, servers, etc. may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc. that perform particular tasks or implement particular abstract data types. The computer system/server may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.
Summary of the application
In the course of implementing the present disclosure, the inventor found that the prior art selects a CIPV target from the perception data of each frame. Because of perception errors, including lane line jitter, predicted trajectory jitter, and speed and ranging errors, this approach has at least the following problem: selecting the CIPV target from the information of the current frame alone often leads to unstable CIPV selection, including missed and wrong selections lasting one or more frames, and even jumping among different obstacle targets.
Exemplary System
Fig. 1 is a flowchart illustrating a method for determining a nearest obstacle according to an exemplary embodiment of the present disclosure. The embodiment comprises the following steps:
Step 101, obtaining n consecutive frames of images including a front obstacle as a sliding window (corresponding to a first image frame sequence), and determining the nearest obstacle (CIPV) corresponding to each frame of image in the sliding window and a corresponding score. Without loss of generality, assume that in each frame the monocular-camera-based advanced driver assistance system selects an obstacle vehicle as the CIPV target of the current frame, and that the CIPV target is output as the ID of that target in the perception system (a positive number greater than 0). In this embodiment the ID is assigned by a multi-target tracking system in the perception stack (an existing prior-art system). The value of the ID depends on the specific setting; it is generally an integer starting from 0, the IDs of different obstacles are different, and the ID of an obstacle is not reused after the obstacle is deleted (reuse is of course possible and has no influence on this embodiment). As the number of obstacle targets increases, the ID value of a new obstacle becomes larger (in the reuse case, the ID values are limited to a certain range). If no CIPV exists, -1 is output. The sliding window method receives the output of each frame of the system and saves the CIPVs of the system's historical output.
Fig. 2 is a schematic view of the sliding window according to the present embodiment. As shown in fig. 2, the sliding window is composed of n windows, where n is an integer greater than 2; each window represents a historical frame or the current frame (the current frame may be regarded as historical frame 0), and the ID in a window is the ID of the CIPV target selected in that frame by the perception system, so the window corresponds to the CIPV selections of the n historical frames including the current frame. Each window has a score: the score of the window corresponding to the current frame is a, and λ ∈ (0,1) is the attenuation factor of the window score. The attenuation reflects that the probability that the CIPV of a historical frame is still the CIPV determined for the current frame decreases over time, so the attenuation factor represents the decay over time of the credibility score assigned to a historical CIPV selection. The score of the window corresponding to the i-th historical frame can be expressed by the following formula (1) (the current frame being the 0-th historical frame):
Score(i) = λ^i · a, 0 ≤ i ≤ n-1    formula (1)
It can be seen that the window score designed in this way decays with time according to a power law. Fig. 3 is a schematic diagram of the variation of the credibility score over time provided by this embodiment. As shown in fig. 3, the curve shows the decay of the credibility score corresponding to different frames as the number of frames increases, taking λ = 0.7, a = 1 and n = 20 as an example.
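Purely for illustration, the window score of formula (1) can be sketched as follows in Python; the function and constant names are not from the disclosure, and the parameter values simply reuse the Fig. 3 example (λ = 0.7, a = 1, n = 20).

    # Hedged sketch of formula (1); names and default values are illustrative only.
    LAMBDA = 0.7        # attenuation factor λ ∈ (0, 1)
    A = 1.0             # score a of the window holding the current frame
    N = 20              # sliding window size n

    def window_score(i: int, a: float = A, lam: float = LAMBDA) -> float:
        """Score(i) = λ^i * a for the i-th historical frame (i = 0 is the current frame)."""
        if not 0 <= i <= N - 1:
            raise ValueError("window index out of range")
        return (lam ** i) * a

    # Power-law decay of the credibility score, as plotted in Fig. 3:
    decayed_scores = [window_score(i) for i in range(N)]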
Step 102, determining a statistical score corresponding to each CIPV in the sliding window according to the score corresponding to each frame of image. If m different obstacle vehicle IDs exist in the sliding window of size n (comprising n windows) at the current frame, the scores of the m different obstacle vehicle IDs are each counted according to the score of every window: the sliding window records the obstacle IDs selected as the CIPV target in the n historical frames including the current frame, each window corresponds to one ID, and the scores of all windows occupied by the same ID are added up to obtain the score of the obstacle with that ID in the sliding window. The sliding window method provided in this embodiment treats -1, i.e. the case where there is no CIPV output, as a valid obstacle vehicle ID and handles it uniformly, without any additional judgment. Let the obstacle in the i-th window be O(i); then the score Score_j of the obstacle with ID j can be expressed as:
Score_j = Σ_{0 ≤ i ≤ n-1, O(i) = j} Score(i)    formula (2)
Finally, the highest of all the obstacle vehicle scores is taken as the score of the current sliding window CIPV selection: a score is calculated for each obstacle separately, and the maximum of all these scores is taken as the score of the current sliding window CIPV selection, which can be calculated by the following formula (3):
Score = max(Score_j), 1 ≤ j ≤ m    formula (3)
Step 103, determining whether to update the CIPV corresponding to the current frame based on the score of the current sliding window CIPV selection. If Score is greater than a set score threshold T, the CIPV of the current frame is set to the obstacle vehicle ID with the highest score; otherwise, the previous CIPV output is kept unchanged.
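Steps 102 and 103 can be sketched together as follows. This is a minimal illustration under the assumption that the per-frame CIPV selections in the window are available as a list ordered from the current frame backwards; the names are not from the disclosure, and the default parameter values simply reuse the example values quoted later in the text.

    from collections import defaultdict

    def select_cipv(window_ids, prev_cipv, a=0.3653, lam=0.7, threshold=0.6):
        """window_ids[i] is the CIPV ID chosen in the i-th historical frame (i = 0 is the
        current frame); -1 (no CIPV) is scored like any other ID.  Returns the managed
        CIPV output for the current frame."""
        totals = defaultdict(float)
        for i, obstacle_id in enumerate(window_ids):      # formula (2): per-ID sum of window scores
            totals[obstacle_id] += (lam ** i) * a
        best_id, best_score = max(totals.items(), key=lambda kv: kv[1])   # formula (3)
        if best_score > threshold:                        # step 103: only update above T
            return best_id
        return prev_cipv                                  # otherwise hold the previous output

    # Hypothetical example: the same ID occupies the three most recent windows.
    cipv = select_cipv([330, 330, 330, 324, -1], prev_cipv=324)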
Step 104, when the next frame image is generated, a sliding window comprising the n consecutive frames up to and including the next frame is obtained. The CIPV corresponding to the next frame is either kept as the CIPV determined by the system in steps 101 to 103, or a new CIPV is determined based on the score of the sliding window. When the next frame arrives, the CIPV target ID selected by the system for that frame is taken and written into window 0, covering the ID value stored there, and then:
O(i-1) = O(i), 2 ≤ i ≤ n    equation (4)
The above equation (4) describes the sliding process of the window, i.e. the whole window is equivalent to "sliding" along the positive direction of the time axis.
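The window sliding of equation (4) amounts to dropping the oldest entry and writing the new frame's selection into window 0; one possible realization with a deque is sketched below (the exact indexing convention may differ from the schematic of Fig. 2, and all names are illustrative).

    from collections import deque

    def slide_window(window_ids: deque, new_cipv_id: int, n: int = 20) -> None:
        """window_ids[0] holds the current frame's CIPV ID, window_ids[-1] the oldest one."""
        if len(window_ids) == n:
            window_ids.pop()                 # the oldest window falls out of the history
        window_ids.appendleft(new_cipv_id)   # the new frame's selection becomes window 0

    history = deque([330, 330, 324, 324])    # hypothetical recent CIPV IDs, newest first
    slide_window(history, 330)               # history is now deque([330, 330, 330, 324, 324])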
By configuring suitable parameters (λ, a, n, T), the sliding window method can also have the effect of continuous confirmation. The parameter selection and derivation process comprises the following steps: first the window size n and the attenuation factor λ are determined. The window size n only needs a moderate value; an oversized window wastes space unnecessarily, since most of the window values at the tail are small and hardly influence the final result yet still cost resources to store, while too small a window cannot fully exploit the influence of historical values, and the target management process then degenerates into the common continuous confirmation method. In general, the window size n may be 10 to 20, but the value of n is not limited in this embodiment. The attenuation factor λ reflects the decrease of confidence in the history over time; with a moderate window size it should preferably not be too small, e.g. λ ∈ (0.5, 1).
After the window size n and the attenuation factor λ are determined, the current window score a and the threshold T need to be selected, and the selection of the two values plays a key role in the performance of the sliding window in the CIPV target management. In order to make the sliding window method have the effect of continuous confirmation under the condition that the CIPV is normally selected, the following method is adopted to configure the parameters a and T:
If, during window sliding, the CIPV target is to be switched only after k consecutive frames of confirmation under normal conditions, then when the k most recent windows are occupied by the ID of the same obstacle vehicle, the score of that obstacle vehicle needs to exceed the threshold T, that is, the following formula (5) needs to be satisfied:
a + λ·a + λ^2·a + ... + λ^(k-1)·a > T    formula (5)
According to the geometric series summation formula, this gives:
a·(1 - λ^k)/(1 - λ) > T    formula (6)
Parameters a and T satisfying the condition shown in formula (6) realize the k-consecutive-frame confirmation process under normal conditions.
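The parameter condition of formulas (5) and (6) can be checked numerically; the sketch below is illustrative only, and whether strict or non-strict inequalities are used is a design choice not fixed by the text.

    def accumulated_score(a: float, lam: float, k: int) -> float:
        """Score accumulated by an ID occupying the k most recent windows:
        a * (1 - λ^k) / (1 - λ), i.e. the left-hand side of formula (6)."""
        return a * (1.0 - lam ** k) / (1.0 - lam)

    def acts_as_k_frame_confirmation(a: float, lam: float, threshold: float, k: int) -> bool:
        """True if k consecutive frames with the same ID exceed the threshold T while
        k - 1 frames do not, so the window behaves like a k-frame confirmation."""
        return (accumulated_score(a, lam, k) > threshold
                and accumulated_score(a, lam, k - 1) <= threshold)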
The sliding window method provided by this embodiment can ensure a target management process with continuous confirmation, can effectively handle abnormal switching of the CIPV target caused by perception jumps and predicted-trajectory jitter, and is more flexible and adaptable. The advantages of the sliding window method over the continuous confirmation method are illustrated below in different scenarios; in the examples the sliding window parameters are λ = 0.7, a = 0.3653, n = 20 and T = 0.6, and the number of continuous confirmation frames is 3.
TABLE 1 CIPV IDs selected by the different methods under normal conditions
As shown in table 1, under the condition that the CIPV is selected normally without disturbance, the sliding window method ensures the stability and reliability of the CIPV target output in the same way as continuous confirmation. Frames 15983 to 15985 show the switching of the CIPV target output, during which the CIPV target output by the system switches from obstacle vehicle ID 324 to obstacle vehicle ID 330.
TABLE 2 CIPV IDs corresponding to different methods under the condition of disturbance in CIPV switching process
As shown in table 2, during the switching of the CIPV target from 324 to 330, a one-frame false alarm occurs, i.e. the obstacle vehicle with ID 314 is mistakenly selected as the CIPV. For the continuous confirmation method, the confirmation counter for 330 is reset by this false alarm, so an additional 3-frame confirmation is needed before the final CIPV target is switched to obstacle vehicle 330; if the subsequent confirmation process is disturbed again, the switching of the CIPV is delayed even longer.
The sliding window method, in contrast, scores based on the historical information and can therefore absorb the one-frame false alarm: obstacle vehicle 330 is set as the CIPV target in time, ensuring the accuracy of the final CIPV target output of the system.
TABLE 3 CIPV IDs corresponding to different methods under the condition of disturbance in CIPV deletion process
As shown in table 3, during the deletion of the CIPV target the system may still output the original CIPV target ID for one or two frames because of perception errors or jitter of the own trajectory. With the continuous confirmation method, the CIPV target cannot be deleted in time when the deletion process is disturbed: the final CIPV target can only be deleted after an additional 3-frame confirmation, and if the disturbance occurs again during that confirmation, the deletion of the CIPV is delayed even longer. For the sliding window method, -1 is simply treated as one more obstacle vehicle ID; because scoring is based on historical information, the method adapts to this situation and deletes the CIPV target in time, ensuring the accuracy of the final CIPV target output of the system.
In summary, while ensuring the stability of the CIPV target output, the sliding window can accommodate disturbances in CIPV selection, and offers better management effect and performance than the conventional several-frame confirmation method.
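Purely for illustration, the pieces above can be combined into one small class; the class and method names are not from the disclosure, and the default parameters merely reuse the example values quoted above (λ = 0.7, a = 0.3653, n = 20, T = 0.6).

    from collections import defaultdict, deque

    class CIPVSlidingWindow:
        """Minimal sketch of the sliding-window CIPV target management described above."""

        NO_CIPV = -1   # per the text, "no CIPV" is handled as the ID -1

        def __init__(self, lam: float = 0.7, a: float = 0.3653, n: int = 20, threshold: float = 0.6):
            self.lam, self.a, self.n, self.threshold = lam, a, n, threshold
            self.window = deque(maxlen=n)   # window[0] is the per-frame CIPV of the current frame
            self.cipv = self.NO_CIPV        # last confirmed CIPV output of the method

        def update(self, frame_cipv_id: int) -> int:
            """Feed one frame's per-frame CIPV selection (step 101) and return the managed CIPV."""
            self.window.appendleft(frame_cipv_id)              # step 104: slide the window
            totals = defaultdict(float)
            for i, obstacle_id in enumerate(self.window):      # step 102, formulas (1) and (2)
                totals[obstacle_id] += (self.lam ** i) * self.a
            best_id, best_score = max(totals.items(), key=lambda kv: kv[1])   # formula (3)
            if best_score > self.threshold:                    # step 103
                self.cipv = best_id
            return self.cipv

    # Hypothetical usage: a one-frame false alarm (ID 314) during a 324 -> 330 switch, cf. Table 2.
    manager = CIPVSlidingWindow()
    outputs = [manager.update(per_frame) for per_frame in [324, 324, 324, 330, 314, 330, 330]]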
Exemplary method
Fig. 4 is a flowchart illustrating a method for determining a nearest obstacle according to another exemplary embodiment of the present disclosure. The embodiment can be applied to an electronic device, as shown in fig. 4, and includes the following steps:
step 401, acquiring continuous n frames of images including a front obstacle in front of the vehicle as a first image frame sequence, and determining a first nearest obstacle in each frame of image in the first image frame sequence.
Wherein the first image frame sequence comprises a current frame image and n-1 frame images before the current frame image.
Alternatively, the acquired first image frame sequence may correspond to the sliding window in step 101 in the embodiment provided in fig. 1.
Step 402, determining a statistical score of each first nearest obstacle in the first image frame sequence according to a score corresponding to each frame of image in the first image frame sequence.
In an embodiment, the first image frame sequence includes n frames of images, and each image may include at least one first nearest obstacle. In order to determine the nearest obstacle of the current frame, historical information from the other images in the first image frame sequence is obtained, and the nearest obstacle of the current frame is determined in combination with a statistical score; the manner of obtaining the statistical score of a first nearest obstacle may refer to step 102 in the embodiment provided in fig. 1.
Step 403, determining a second nearest obstacle corresponding to the current frame image based on the statistical score of each first nearest obstacle.
Optionally, each frame of image in the first image frame sequence corresponds to a first nearest obstacle. Based on the magnitude of the statistical score of each first nearest obstacle, this embodiment determines whether to take the first nearest obstacle corresponding to the current frame image as the second nearest obstacle, or to take the first nearest obstacle with the largest statistical score as the second nearest obstacle of the current frame image.
With the method for determining a nearest obstacle provided by the above embodiment of the present disclosure, n consecutive frames of images including an obstacle in front of the vehicle are acquired as a first image frame sequence, and the first nearest obstacle in each frame of image in the first image frame sequence is determined, wherein the first image frame sequence comprises a current frame image and the n-1 frame images before the current frame image; a statistical score of each first nearest obstacle in the first image frame sequence is determined according to the score corresponding to each frame of image in the first image frame sequence; and the second nearest obstacle corresponding to the current frame image is determined based on the statistical score of each first nearest obstacle. Since the nearest obstacle in the current frame image is determined according to the scores of different obstacles in the n consecutive frames of images, a history-information-based scoring scheme is realized: the stability of the CIPV target output is ensured, disturbances in CIPV selection can be accommodated, and the accuracy of the final CIPV target output of the system is ensured.
Combining historical information to manage selection of CIPV targets is also a filtering in the sense that this approach can eliminate CIPV selection instability due to perceptual errors.
Fig. 5 is a flowchart illustrating a method for determining a nearest obstacle according to another exemplary embodiment of the present disclosure. The embodiment can be applied to an electronic device, as shown in fig. 5, and includes the following steps:
step 401, acquiring continuous n frames of images including a front obstacle in front of the vehicle as a first image frame sequence, and determining a first nearest obstacle in each frame of image in the first image frame sequence.
Wherein the first image frame sequence comprises a current frame image and n-1 frame images before the current frame image.
Step 501, determining a score corresponding to each frame of image in a first image frame sequence based on a preset score corresponding to a current frame of image and a preset attenuation factor.
Optionally, the preset score is the score corresponding to the current frame image and may correspond to the score a of the current window in the embodiment shown in fig. 2; the preset attenuation factor is used to calculate the attenuated score of a historical frame before the current frame, and may correspond to the attenuation factor λ ∈ (0,1) of the window score.
Step 402, determining a statistical score of each first nearest obstacle in the first image frame sequence according to a score corresponding to each frame of image in the first image frame sequence.
Step 403, determining a second nearest obstacle corresponding to the current frame image based on the statistical score of each first nearest obstacle.
In this embodiment, after the score of the current frame image and the preset attenuation factor are known, the score corresponding to each frame image in the first image frame sequence can be determined by using formula (1) in the embodiment shown in fig. 2, which provides a basis for subsequently determining the statistical score of each first nearest obstacle.
As shown in fig. 6, based on the embodiment shown in fig. 4, step 402 may include the following steps:
step 4021, determining at least one score corresponding to each first nearest obstacle according to at least one image in the first image frame sequence corresponding to each first nearest obstacle.
Alternatively, the score of each first nearest obstacle may be determined based on the current frame corresponding score a and the attenuation factor λ, and specifically, the manner of determining each score may refer to formula (1) in the embodiment provided in fig. 2.
Step 4022, for each first nearest obstacle, obtaining a statistical score of the first nearest obstacle based on the at least one score corresponding to the first nearest obstacle.
Optionally, the statistical score of the first nearest obstacle is obtained by summing the at least one score corresponding to the first nearest obstacle; the summation may be calculated by referring to formula (2) in step 102 of the embodiment provided in fig. 1, where the obstacle in the i-th window is O(i) and the statistical score of the obstacle with ID j is Score_j.
In some alternative embodiments, step 403 includes:
and determining a second nearest obstacle corresponding to the current frame image based on the relation between the statistical score of each first nearest obstacle and a preset score threshold.
The preset score threshold value, the preset score and the preset attenuation factor meet a preset relation.
Alternatively, the preset relationship to be satisfied between the preset score threshold T and the preset score a and the preset attenuation factor may be a relationship as determined by equation (6) in the embodiment provided in fig. 1.
In this embodiment, the preset score threshold may be set according to specific situations, for example, as the score threshold T in step 103 in the embodiment provided in fig. 1, and the second closest obstacle is determined according to the relationship between the statistical score and the preset score threshold with reference to step 103.
As shown in fig. 7, based on the embodiment shown in fig. 4, step 403 may include the following steps:
step 4031, determine the obstacle of the path of the vehicle corresponding to the highest statistical score from the statistical scores of each first nearest obstacle.
Wherein, the highest statistical score is the statistical score with the highest numerical value.
Alternatively, the highest statistical score may be determined with reference to equation (3) in step 102 in the embodiment provided in FIG. 1.
Step 4032, a second nearest obstacle corresponding to the current frame image is determined based on the relationship between the highest statistical score and a preset score threshold.
In this embodiment, the manner of determining the second nearest obstacle may be as shown in step 103 of the embodiment provided in fig. 1. Optionally, step 4032 includes: in response to the highest statistical score being greater than the preset score threshold, determining the first nearest obstacle corresponding to the highest statistical score as the second nearest obstacle; and in response to the highest statistical score being less than or equal to the preset score threshold, determining the second nearest obstacle corresponding to the current frame image to be the second nearest obstacle corresponding to the previous frame image. By determining the second nearest obstacle from the relation between the statistical score and the preset score threshold, the stability of the CIPV target output is guaranteed while disturbances in CIPV selection can be accommodated, giving better management effect and performance than the traditional several-frame confirmation method.
Fig. 8a is a flowchart illustrating a method for determining a nearest obstacle according to still another exemplary embodiment of the present disclosure. The present embodiment can be applied to an electronic device, as shown in fig. 8a, and includes the following steps:
step 401, acquiring continuous n frames of images including a front obstacle in front of the vehicle as a first image frame sequence, and determining a first nearest obstacle in each frame of image in the first image frame sequence.
Wherein the first image frame sequence comprises a current frame image and n-1 frame images before the current frame image.
Step 402, determining a statistical score of each first nearest obstacle in the first image frame sequence according to a score corresponding to each frame of image in the first image frame sequence.
Step 403, determining a second nearest obstacle corresponding to the current frame image based on the statistical score of each first nearest obstacle.
Step 805, judging whether a next frame image exists, if so, executing step 806; otherwise, ending.
Step 806, obtaining a next frame image of the current frame image, and taking, as a second image frame sequence, the next frame image together with the frames of the first image frame sequence that are closest to it in time; step 402 is then performed with the second image frame sequence as the first image frame sequence and the next frame image as the current frame image.
In this embodiment, the sliding process may refer to equation (4) in step 104 of the embodiment provided in fig. 1: the next window is obtained by sliding the window, the next frame image is taken as the current frame image, and the second nearest obstacle corresponding to the next frame image is determined. Continuous processing of the frames of a video is thereby realized, which broadens the application range of this embodiment.
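A minimal sketch of the sequence update in step 806, under the assumption that an image frame sequence is held as a list ordered from newest to oldest; the names are illustrative only.

    def next_image_frame_sequence(first_sequence: list, next_frame) -> list:
        """Second image frame sequence: the next frame image plus the n-1 frames of the first
        sequence that are closest to it in time (the oldest frame is dropped)."""
        return [next_frame] + first_sequence[:-1]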
In some optional embodiments, when a next frame image exists after the current frame image in the embodiment shown in fig. 8a, the embodiment shown in fig. 8b may be obtained. Fig. 8b is a flowchart illustrating a method for determining a nearest obstacle according to still another exemplary embodiment of the present disclosure.
step 401, acquiring continuous n frames of images including a front obstacle in front of the vehicle as a first image frame sequence, and determining a first nearest obstacle in each frame of image in the first image frame sequence.
Wherein the first image frame sequence comprises a current frame image and n-1 frame images before the current frame image.
Step 402, determining a statistical score of each first nearest obstacle in the first image frame sequence according to a score corresponding to each frame of image in the first image frame sequence.
Step 403, determining a second nearest obstacle corresponding to the current frame image based on the statistical score of each first nearest obstacle.
Step 806, obtaining a next frame image of the current frame image, and taking, as the second image frame sequence, the next frame image together with the frames of the first image frame sequence that are closest to it in time.
Step 807, determining a statistical score of each third nearest obstacle in the second image frame sequence according to the score corresponding to each frame of image in the second image frame sequence.
The manner of determining the statistical score in this step may be understood with reference to step 402, the only difference being that this step determines the statistical score of a third nearest obstacle, while step 402 determines the statistical score of a first nearest obstacle.
Step 808, determining a fourth nearest obstacle corresponding to the next frame of image based on the statistical score of each third nearest obstacle.
The manner of determination in this step may be understood with reference to step 403, the only difference being that this step determines a fourth nearest obstacle, while step 403 determines a second nearest obstacle.
Step 809, judging whether a next frame image still exists, if so, executing step 806; otherwise, ending.
Similar to the embodiment provided in fig. 8a, in this embodiment, the calculation of the next window is realized through window sliding, the second image frame sequence is obtained based on the next frame image, the fourth nearest obstacle corresponding to the next frame image is determined, and then the continuous processing of multiple frame images in the video is realized, so that the application range of this embodiment is expanded.
Any of the nearest obstacle determination methods provided by the embodiments of the present disclosure may be performed by any suitable device having data processing capability, including but not limited to a terminal device, a server, and the like. Alternatively, any of the nearest obstacle determination methods provided by the embodiments of the present disclosure may be executed by a processor; for example, the processor may execute any of the nearest obstacle determination methods mentioned in the embodiments of the present disclosure by calling corresponding instructions stored in a memory. This will not be repeated below.
Exemplary devices
Fig. 9 is a schematic structural diagram of a device for determining a nearest obstacle according to an exemplary embodiment of the present disclosure. As shown in fig. 9, the present embodiment includes:
the obstacle determining module 91 is configured to acquire n consecutive images including a front obstacle in front of the vehicle as a first image frame sequence, and determine a first nearest obstacle in each image in the first image frame sequence.
Wherein the first image frame sequence comprises a current frame image and n-1 frame images before the current frame image.
And a score determining module 92, configured to determine a statistical score of each first nearest obstacle determined by the obstacle determining module 91 in the first image frame sequence according to a score corresponding to each frame of image in the first image frame sequence.
And an update determining module 93, configured to determine, based on the statistical score of each first nearest obstacle determined by the score determining module 92, a second nearest obstacle corresponding to the current frame image.
With the device for determining a nearest obstacle provided by the above embodiment of the present disclosure, n consecutive frames of images including an obstacle in front of the vehicle are acquired as a first image frame sequence, and the first nearest obstacle in each frame of image in the first image frame sequence is determined, wherein the first image frame sequence comprises a current frame image and the n-1 frame images before the current frame image; a statistical score of each first nearest obstacle in the first image frame sequence is determined according to the score corresponding to each frame of image in the first image frame sequence; and the second nearest obstacle corresponding to the current frame image is determined based on the statistical score of each first nearest obstacle. Since the nearest obstacle in the current frame image is determined according to the scores of different obstacles in the n consecutive frames of images, a history-information-based scoring scheme is realized: the stability of the CIPV target output is ensured, disturbances in CIPV selection can be accommodated, and the accuracy of the final CIPV target output of the system is ensured.
Fig. 10 is a schematic structural diagram of a nearest obstacle determination device according to another exemplary embodiment of the present disclosure. As shown in fig. 10, the present embodiment includes:
the score determining module 92 is further configured to determine a score corresponding to each frame of image in the first image frame sequence based on a preset score corresponding to the current frame of image and a preset attenuation factor.
A score determination module 92 comprising:
an obstacle score determining unit 921, configured to determine at least one score corresponding to each first nearest obstacle according to at least one image in the first image frame sequence corresponding to each first nearest obstacle;
the score statistic unit 922 is configured to obtain, for each first closest obstacle, a statistical score of the first closest obstacle based on at least one score corresponding to the first closest obstacle.
An update determining module 93, specifically configured to determine, based on a relationship between a statistical score of each first nearest obstacle and a preset score threshold, a second nearest obstacle corresponding to the current frame image; the preset score threshold value, the preset score and the preset attenuation factor meet a preset relation.
An update determination module 93, comprising:
a highest score determining unit 931 configured to determine a vehicle path obstacle corresponding to a highest statistical score from the statistical scores of each of the first closest obstacles; wherein, the highest statistical score is the statistical score with the highest numerical value.
An obstacle determining unit 932, configured to determine a second nearest obstacle corresponding to the current frame image based on a relationship between the highest statistical score and a preset score threshold.
An obstacle determining unit 932, configured to: in response to the highest statistical score being greater than the preset score threshold, determine the first nearest obstacle corresponding to the highest statistical score as the second nearest obstacle; and in response to the highest statistical score being less than or equal to the preset score threshold, determine the second nearest obstacle corresponding to the current frame image to be the second nearest obstacle corresponding to the previous frame image.
The apparatus provided in this embodiment further includes:
a next frame processing module 11, configured to obtain a next frame image of the current frame image, and take, as a second image frame sequence, the next frame image together with the frames of the first image frame sequence that are closest to it in time; determine a statistical score of each third nearest obstacle in the second image frame sequence according to the score corresponding to each frame of image in the second image frame sequence; and determine a fourth nearest obstacle corresponding to the next frame image based on the statistical score of each third nearest obstacle.
Exemplary electronic device
Next, an electronic apparatus according to an embodiment of the present disclosure is described with reference to fig. 11. The electronic device may be either or both of the first device 100 and the second device 200, or a stand-alone device separate from them that may communicate with the first device and the second device to receive the collected input signals therefrom.
FIG. 11 illustrates a block diagram of an electronic device in accordance with an embodiment of the disclosure.
As shown in fig. 11, electronic device 110 includes one or more processors 111 and memory 112.
Processor 111 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in electronic device 110 to perform desired functions.
Memory 112 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, Read Only Memory (ROM), hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium and executed by processor 111 to implement the method of determining a proximate obstacle and/or other desired functionality of the various embodiments of the present disclosure described above. Various contents such as an input signal, a signal component, a noise component, etc. may also be stored in the computer-readable storage medium.
In one example, the electronic device 110 may further include: an input device 113 and an output device 114, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
For example, when the electronic device is the first device 100 or the second device 200, the input device 113 may be a microphone or a microphone array as described above for capturing an input signal of a sound source. When the electronic device is a stand-alone device, the input means 113 may be a communication network connector for receiving the acquired input signals from the first device 100 and the second device 200.
The input device 113 may also include, for example, a keyboard, a mouse, and the like.
The output device 114 may output various information including the determined distance information, direction information, and the like to the outside. The output devices 114 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, among others.
Of course, for simplicity, only some of the components of the electronic device 110 relevant to the present disclosure are shown in fig. 11, omitting components such as buses, input/output interfaces, and the like. In addition, electronic device 110 may include any other suitable components, depending on the particular application.
Exemplary computer program product and computer-readable storage Medium
In addition to the above-described methods and apparatus, embodiments of the present disclosure may also be a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps in the method of determining a closest obstacle according to the various embodiments of the present disclosure described in the "exemplary methods" section of this specification above.
The computer program product may write program code for carrying out operations for embodiments of the present disclosure in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present disclosure may also be a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, cause the processor to perform the steps in the method of determining a closest obstacle according to various embodiments of the present disclosure described in the "exemplary methods" section above in this specification.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing describes the general principles of the present disclosure in conjunction with specific embodiments, however, it is noted that the advantages, effects, etc. mentioned in the present disclosure are merely examples and are not limiting, and they should not be considered essential to the various embodiments of the present disclosure. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the disclosure is not intended to be limited to the specific details so described.
In the present specification, the embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts in the embodiments are referred to each other. For the system embodiment, since it basically corresponds to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The block diagrams of devices, apparatuses and systems referred to in this disclosure are only given as illustrative examples and are not intended to require or imply that the connections, arrangements and configurations must be made in the manner shown in the block diagrams. These devices, apparatuses and systems may be connected, arranged and configured in any manner, as will be appreciated by those skilled in the art. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The word "or" as used herein means, and is used interchangeably with, the term "and/or," unless the context clearly dictates otherwise. The word "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to".
The methods and apparatus of the present disclosure may be implemented in a number of ways. For example, the methods and apparatus of the present disclosure may be implemented by software, hardware, firmware, or any combination of software, hardware, and firmware. The above-described order for the steps of the method is for illustration only, and the steps of the method of the present disclosure are not limited to the order specifically described above unless specifically stated otherwise. Further, in some embodiments, the present disclosure may also be embodied as programs recorded in a recording medium, the programs including machine-readable instructions for implementing the methods according to the present disclosure. Thus, the present disclosure also covers a recording medium storing a program for executing the method according to the present disclosure.
It is also noted that in the devices, apparatuses, and methods of the present disclosure, each component or step can be decomposed and/or recombined. These decompositions and/or recombinations are to be considered equivalents of the present disclosure.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the disclosure. Thus, the present disclosure is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit embodiments of the disclosure to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (10)

1. A method of determining a nearest obstacle, comprising:
acquiring n consecutive frames of images that include an obstacle in front of a vehicle as a first image frame sequence, and determining a first nearest obstacle in each frame of image in the first image frame sequence; wherein the first image frame sequence comprises a current frame image and n-1 frames of images preceding the current frame image;
determining a statistical score of each first nearest obstacle in the first image frame sequence according to a score corresponding to each frame of image in the first image frame sequence;
and determining a second nearest obstacle corresponding to the current frame image based on the statistical score of each first nearest obstacle.
2. The method of claim 1, further comprising, before the determining a statistical score of each first nearest obstacle in the first image frame sequence according to the score corresponding to each frame of image in the first image frame sequence:
determining the score corresponding to each frame of image in the first image frame sequence based on a preset score corresponding to the current frame image and a preset attenuation factor.
3. The method of claim 2, wherein the determining a statistical score of each first nearest obstacle in the first image frame sequence according to the score corresponding to each frame of image in the first image frame sequence comprises:
determining at least one score corresponding to each first nearest obstacle according to at least one image, in the first image frame sequence, that corresponds to that first nearest obstacle;
for each first nearest obstacle, obtaining a statistical score of the first nearest obstacle based on the at least one score corresponding to the first nearest obstacle.
4. The method according to claim 1, wherein the determining a second nearest obstacle corresponding to the current frame image based on the statistical score of each first nearest obstacle comprises:
determining a second nearest obstacle corresponding to the current frame image based on a relationship between the statistical score of each first nearest obstacle and a preset score threshold; wherein the preset score threshold, the preset score, and the preset attenuation factor satisfy a preset relationship.
5. The method according to claim 4, wherein the determining a second nearest obstacle corresponding to the current frame image based on the relationship between the statistical score of each first nearest obstacle and a preset score threshold comprises:
determining, from the statistical scores of the first nearest obstacles, the obstacle in the path of the vehicle that corresponds to the highest statistical score; wherein the highest statistical score is the statistical score with the highest value;
and determining a second nearest obstacle corresponding to the current frame image based on a relationship between the highest statistical score and the preset score threshold.
6. The method according to claim 5, wherein the determining a second nearest obstacle corresponding to the current frame image based on the relationship between the highest statistical score and a preset score threshold comprises:
in response to the highest statistical score being greater than the preset score threshold, determining the first nearest obstacle corresponding to the highest statistical score as the second nearest obstacle corresponding to the current frame image;
and in response to the highest statistical score being less than or equal to the preset score threshold, determining the second nearest obstacle corresponding to the previous frame image as the second nearest obstacle corresponding to the current frame image.
7. The method of any of claims 1-6, further comprising:
acquiring a next frame image following the current frame image, and taking, from the next frame image and the first image frame sequence, a plurality of frames of images that are closest in time to the next frame image as a second image frame sequence;
determining a statistical score of each third nearest obstacle in the second image frame sequence according to a score corresponding to each frame of image in the second image frame sequence;
and determining a fourth nearest obstacle corresponding to the next frame image based on the statistical score of each third nearest obstacle.
8. A nearest obstacle determination apparatus comprising:
an obstacle determination module, configured to acquire n consecutive frames of images that include an obstacle in front of a vehicle as a first image frame sequence, and to determine a first nearest obstacle in each frame of image in the first image frame sequence; wherein the first image frame sequence comprises a current frame image and n-1 frames of images preceding the current frame image;
a score determination module, configured to determine, according to a score corresponding to each frame of image in the first image frame sequence, a statistical score of each first nearest obstacle determined by the obstacle determination module in the first image frame sequence;
and an update determination module, configured to determine a second nearest obstacle corresponding to the current frame image based on the statistical score of each first nearest obstacle determined by the score determination module.
9. A computer-readable storage medium storing a computer program for executing the method for determining a nearest obstacle according to any one of claims 1 to 7.
10. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
the processor is configured to read the executable instructions from the memory and execute the instructions to implement the method of determining a nearest obstacle of any of claims 1-7.
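For illustration only, the following is a minimal sketch in Python of the scoring scheme described in claims 1-6: per-frame scores attenuated by age, accumulated into a statistical score per candidate obstacle, and compared against a threshold to decide whether to update the result or keep the previous frame's result. The class name, parameter names, and default values (n, base_score, decay, score_threshold) are hypothetical and are not specified by the claims; the claims only require a preset score, a preset attenuation factor, and a preset score threshold that satisfy a preset relationship.

```python
from collections import defaultdict, deque


class NearestObstacleSmoother:
    """Hypothetical sketch of the temporal smoothing in claims 1-6."""

    def __init__(self, n=5, base_score=1.0, decay=0.8, score_threshold=2.0):
        self.n = n                              # length of the first image frame sequence
        self.base_score = base_score            # preset score for the current frame image
        self.decay = decay                      # preset attenuation factor per frame of age
        self.score_threshold = score_threshold  # preset score threshold
        self.window = deque(maxlen=n)           # first nearest obstacle per frame (sliding window)
        self.result = None                      # second nearest obstacle of the previous frame

    def update(self, first_nearest_obstacle_id):
        """Feed the first nearest obstacle detected in the current frame image
        and return the smoothed (second) nearest obstacle for that frame."""
        self.window.append(first_nearest_obstacle_id)

        # Score per frame: the current frame gets the preset score; each older
        # frame is attenuated once more by the preset attenuation factor. The
        # statistical score of an obstacle is the sum over the frames in which
        # it was the first nearest obstacle.
        stats = defaultdict(float)
        newest = len(self.window) - 1
        for i, obstacle_id in enumerate(self.window):
            age = newest - i
            stats[obstacle_id] += self.base_score * (self.decay ** age)

        # The highest statistical score decides the update (cf. claims 5 and 6).
        best_id, best_score = max(stats.items(), key=lambda kv: kv[1])
        if best_score > self.score_threshold:
            self.result = best_id  # adopt the obstacle with the highest statistical score
        # otherwise keep the second nearest obstacle from the previous frame
        return self.result
```

As a usage example, feeding per-frame detections such as smoother.update("car_3") over successive frames would, with the hypothetical defaults above, return "car_3" only after its accumulated score exceeds the threshold (about three consecutive frames), which suppresses single-frame flicker between candidate obstacles.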
CN201911050962.7A 2019-10-31 2019-10-31 Method and device for determining nearest obstacle, storage medium and electronic equipment Active CN110765970B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911050962.7A CN110765970B (en) 2019-10-31 2019-10-31 Method and device for determining nearest obstacle, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911050962.7A CN110765970B (en) 2019-10-31 2019-10-31 Method and device for determining nearest obstacle, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN110765970A (en) 2020-02-07
CN110765970B CN110765970B (en) 2022-08-09

Family

ID=69335238

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911050962.7A Active CN110765970B (en) 2019-10-31 2019-10-31 Method and device for determining nearest obstacle, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN110765970B (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101419667A (en) * 2008-12-15 2009-04-29 东软集团股份有限公司 Method and apparatus for identifying obstacle in image
CN105074600A (en) * 2013-02-27 2015-11-18 夏普株式会社 Surrounding environment recognition device, autonomous mobile system using same, and surrounding environment recognition method
US20150073664A1 (en) * 2013-09-12 2015-03-12 Ford Global Technologies, Llc Collision warning for a vehicle
US20150120137A1 (en) * 2013-10-28 2015-04-30 GM Global Technology Operations LLC Path planning for evasive steering maneuver in presence of target vehicle and surrounding objects
CN106599832A (en) * 2016-12-09 2017-04-26 重庆邮电大学 Method for detecting and recognizing various types of obstacles based on convolution neural network
CN109214348A (en) * 2018-09-19 2019-01-15 北京极智嘉科技有限公司 A kind of obstacle detection method, device, equipment and storage medium
CN109269478A (en) * 2018-10-24 2019-01-25 南京大学 A kind of container terminal based on binocular vision bridge obstacle detection method
CN109785366A (en) * 2019-01-21 2019-05-21 中国科学技术大学 It is a kind of for the correlation filtering method for tracking target blocked
CN109910877A (en) * 2019-02-01 2019-06-21 中科安达(北京)科技有限公司 The method of AEBS intelligent recognition barrier
CN110132275A (en) * 2019-04-29 2019-08-16 北京云迹科技有限公司 Laser barrier-avoiding method and device
CN110348290A (en) * 2019-05-27 2019-10-18 天津中科智能识别产业技术研究院有限公司 Coke tank truck safe early warning visible detection method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HYUN SOO PARK et al.: "Object detection in adaptive cruise control using multi-class support vector machine", IEEE *
SE-KYUNG PARK et al.: "Lane estimation using lateral histogram in radar based ACC system", IEEE *
ZHANG FANGHUI: "Object Detection Based on Deep Learning in Driving Videos", China Master's Theses Full-text Database, Information Science and Technology *

Also Published As

Publication number Publication date
CN110765970B (en) 2022-08-09

Similar Documents

Publication Publication Date Title
JP6802039B2 (en) Adaptive update method and device of registration database for user authentication
US20200334638A1 (en) Method and apparatus for processing loss assessment data for car insurance and processing device
EP3407200A1 (en) Method and device for updating online self-learning event detection model
CN111814746A (en) Method, device, equipment and storage medium for identifying lane line
CN109684944B (en) Obstacle detection method, obstacle detection device, computer device, and storage medium
CN110363748B (en) Method, device, medium and electronic equipment for processing dithering of key points
CN110381310B (en) Method and device for detecting health state of visual system
CN114821066A (en) Model training method and device, electronic equipment and computer readable storage medium
CN113435409A (en) Training method and device of image recognition model, storage medium and electronic equipment
CN112800812A (en) Target object lane change identification method and device, readable storage medium and electronic equipment
WO2017072854A1 (en) Monitoring device, monitoring system and monitoring method
CN110392207B (en) Method and device for triggering focusing of camera equipment
CN110765970B (en) Method and device for determining nearest obstacle, storage medium and electronic equipment
CN115294328A (en) Target detection frame generation method and device, storage medium and electronic equipment
CN112770057A (en) Camera parameter adjusting method and device, electronic equipment and storage medium
CN112991418A (en) Image depth prediction and neural network training method and device, medium and equipment
CN112950687B (en) Method and device for determining tracking state, storage medium and electronic equipment
CN114740975A (en) Target content acquisition method and related equipment
US10061388B2 (en) Method and apparatus for processing user input
CN111212239B (en) Exposure time length adjusting method and device, electronic equipment and storage medium
US20200258550A1 (en) Moving image reproduction apparatus, moving image reproduction method, moving image reproduction system, and storage medium
CN113111692A (en) Target detection method and device, computer readable storage medium and electronic equipment
CN113158706A (en) Face snapshot method, device, medium and electronic equipment
CN115361517B (en) Video recording method and device before police, electronic equipment and readable storage medium
CN112863096A (en) Monitoring method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant