CN116118739A - Vehicle periphery monitoring method and system - Google Patents

Vehicle periphery monitoring method and system

Info

Publication number
CN116118739A
CN116118739A CN202310010068.7A
Authority
CN
China
Prior art keywords
scene
data
vehicle
determining
attribute data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310010068.7A
Other languages
Chinese (zh)
Inventor
王宏 (Wang Hong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Binneng Electric Technology Co., Ltd.
Original Assignee
Shenzhen Binneng Electric Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Binneng Electric Technology Co., Ltd.
Priority to CN202310010068.7A
Publication of CN116118739A
Legal status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a vehicle periphery monitoring method and system. The method comprises the following steps: acquiring perception data of the area surrounding a vehicle, and determining motion data of each moving object according to the perception data; acquiring scene data of the area where the vehicle is located, and determining scene attribute data according to the scene data; and calculating the matching relation between the motion data and the scene attribute data, and deciding the monitoring display of each moving object according to the matching relation. By analyzing the correspondence between the motion of each moving object and the scene, the invention screens out the moving objects that actually need to be displayed to the driver inside the vehicle. This effectively reduces the probability that surrounding objects are displayed to no purpose: the driver still obtains sufficient information about the surrounding situation, while the pull on the driver's attention is kept as small as possible, improving driving safety.

Description

Vehicle periphery monitoring method and system
Technical Field
The present invention relates to the technical field of driving assistance, and in particular, to a vehicle periphery monitoring method, a system, an electronic device, and a computer storage medium.
Background
The display of the movement status of objects around a vehicle is an important development direction of automobile driving-assistance technology. In existing display methods, vehicle-mounted detection sensors, such as millimeter-wave radar, cameras and laser radar, detect and perceive the objects around the vehicle body, and the relative motion state of those objects and the host vehicle is displayed on a vehicle-mounted screen. However, the existing methods lack a reasonable screening strategy for surrounding objects, and displaying all of them diverts too much of the driver's attention, which is not conducive to safe and efficient driving.
Disclosure of Invention
In order to at least solve the technical problems in the background art, the invention provides a vehicle periphery monitoring method, a system, an electronic device and a computer storage medium.
A first aspect of the invention provides a vehicle surroundings monitoring method including the steps of:
acquiring perception data of a surrounding area of a vehicle, and determining motion data of each moving object according to the perception data;
acquiring scene data of an area where a vehicle is located, and determining scene attribute data according to the scene data;
and calculating the matching relation between the motion data and the scene attribute data, and deciding the monitoring display of each moving object according to the matching relation.
Further, the obtaining the scene data of the area where the vehicle is located, and determining the scene attribute data according to the scene data includes:
carrying out matching calculation between the vehicle positioning data and a high-precision map to determine the scene data of the area where the vehicle is located, and retrieving the scene attribute data corresponding to the scene data from the high-precision map;
and/or,
and drawing a scene map of the vehicle and the moving objects according to the motion data, carrying out matching calculation between the scene map and a plurality of template scene maps, and determining the scene attribute data according to the matching calculation result.
Further, the calculating the matching relation between the motion data and the scene attribute data includes:
determining a preset condition according to the scene attribute data, and calculating the matching degree between the motion data and the preset condition;
if the matching degree is higher than a decision threshold, determining that the motion data matches the scene attribute data; otherwise, determining that the motion data does not match the scene attribute data.
Further, the determining the preset condition according to the scene attribute data includes:
if the scene attribute data is a smooth driving scene, determining the preset condition as a first preset condition; the first preset condition is related to real-time motion data of the moving object;
if the scene attribute data is a delayed driving scene, determining that the preset condition is a second preset condition; the second preset condition is related to potential motion data of the moving object.
Further, when the scene attribute data is a smooth driving scene, the calculating the matching degree between the motion data and the preset condition includes:
calculating a first abrupt-change evaluation value for each moving object based on the motion data; if the first abrupt-change evaluation value exceeds a first safety range of the first preset condition, setting the matching degree lower than the decision threshold; otherwise, setting the matching degree higher than the decision threshold.
Further, the first safety range is determined by:
determining a first initial safety range associated with the smooth driving scene;
and determining the number of moving objects and the vehicle speed according to the motion data, determining a first correction coefficient according to the number and the vehicle speed, and correcting the first initial safety range with the first correction coefficient to obtain the first safety range.
Further, when the scene attribute data is a delayed driving scene, the calculating the matching degree between the motion data and the preset condition includes:
calculating a second abrupt-change evaluation value for each moving object based on the motion data; if the second abrupt-change evaluation value exceeds a second safety range of the second preset condition, setting the matching degree lower than the decision threshold; otherwise, setting the matching degree higher than the decision threshold.
A second aspect of the present invention provides a vehicle surroundings monitoring system, including an acquisition module, a processing module, a storage module; the processing module is connected with the acquisition module and the storage module;
the memory module is used for storing executable computer program codes;
the acquisition module is used for acquiring the perception data of the surrounding area of the vehicle and transmitting the perception data to the processing module;
the processing module is configured to perform the method of any of the preceding claims by invoking the executable computer program code in the storage module.
A third aspect of the present invention provides an electronic device comprising: a memory storing executable program code; a processor coupled to the memory; the processor invokes the executable program code stored in the memory to perform the method of any one of the preceding claims.
A fourth aspect of the invention provides a computer storage medium having stored thereon a computer program which, when executed by a processor, performs a method as claimed in any one of the preceding claims.
The invention has the beneficial effects that:
compared with the prior art, the invention effectively reduces the probability that surrounding objects are displayed to no purpose: the driver obtains sufficient information about the surrounding situation, while the pull on the driver's attention is reduced as much as possible, thereby improving driving safety.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of a vehicle periphery monitoring method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a vehicle surroundings monitoring system according to an embodiment of the present invention.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Referring to fig. 1, fig. 1 is a flowchart of a vehicle periphery monitoring method according to an embodiment of the invention. Referring to fig. 1, an embodiment of the present invention provides a vehicle periphery monitoring method, including the steps of:
acquiring perception data of a surrounding area of a vehicle, and determining motion data of each moving object according to the perception data;
acquiring scene data of an area where a vehicle is located, and determining scene attribute data according to the scene data;
and calculating the matching relation between the motion data and the scene attribute data, and deciding the monitoring display of each moving object according to the matching relation.
In this embodiment, the vehicle detects a certain surrounding area through vehicle-mounted sensors, such as millimeter-wave radar, cameras and laser radar, so as to determine the real-time motion of each moving object. Meanwhile, the attribute of the scene where the vehicle is located is obtained, and whether a moving object needs to be displayed is determined according to the correspondence between the scene attribute and the real-time motion of the moving object. By analyzing this correspondence, the invention screens out the moving objects that need to be displayed to the driver inside the vehicle. Compared with the prior art, this effectively reduces the probability that surrounding objects are displayed to no purpose: the driver still obtains sufficient information about the surrounding situation, while the pull on the driver's attention is reduced as much as possible, improving driving safety.
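The three steps above reduce, in the end, to a filtering decision: only objects whose motion fails to match the scene are surfaced to the driver. A minimal sketch in Python, with a purely hypothetical matching rule (the 20 m/s bound and the object fields are illustrative assumptions, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class MovingObject:
    obj_id: int
    speed: float  # m/s, from on-board sensor fusion

def decide_display(objects, matches_scene):
    """Return the IDs of moving objects whose motion does NOT match the
    scene; per the method, only these are shown to the driver."""
    return [o.obj_id for o in objects if not matches_scene(o)]

# Hypothetical matching rule: an object "matches" the scene while its
# speed stays at or below an assumed safe bound of 20 m/s.
objects = [MovingObject(1, 12.0), MovingObject(2, 35.0)]
shown = decide_display(objects, lambda o: o.speed <= 20.0)
```

Here only object 2, whose speed breaks the assumed bound, would reach the display.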
Further, the obtaining the scene data of the area where the vehicle is located, and determining the scene attribute data according to the scene data includes:
carrying out matching calculation between the vehicle positioning data and a high-precision map to determine the scene data of the area where the vehicle is located, and retrieving the scene attribute data corresponding to the scene data from the high-precision map;
and/or,
and drawing a scene map of the vehicle and the moving objects according to the motion data, carrying out matching calculation between the scene map and a plurality of template scene maps, and determining the scene attribute data according to the matching calculation result.
In this embodiment, the scene attribute data of the scene where the host vehicle is currently located may be determined in at least one of the two ways. In the former, the vehicle positioning information obtained from the on-board navigation and positioning module is projected into a high-precision map, and the scene where the vehicle is located and its attribute data, such as an intersection or a viaduct, can be extracted directly from the high-precision map. In the latter, the motion data (position, heading, speed and the like) of the surrounding vehicles acquired through perception are projected one by one into the host vehicle's coordinate system, yielding a scene map that represents the relative motion state of the host vehicle and each surrounding object; this scene map is then matched against preset template scene maps to find the best-matching template and thereby determine the scene attribute data of the area.
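The template-matching branch can be sketched as a nearest-template search over scene descriptors. The descriptor layout, the cosine similarity measure and the template values below are all assumptions for illustration; the patent does not specify the matching calculation:

```python
import math

def scene_similarity(scene, template):
    """Cosine similarity between two flattened scene descriptors (e.g. the
    normalised speeds of objects at fixed relative positions)."""
    dot = sum(a * b for a, b in zip(scene, template))
    na = math.sqrt(sum(a * a for a in scene))
    nb = math.sqrt(sum(b * b for b in template))
    return dot / (na * nb) if na and nb else 0.0

def classify_scene(scene, templates):
    """Return the attribute label of the best-matching template scene map."""
    return max(templates, key=lambda label: scene_similarity(scene, templates[label]))

# Assumed 4-slot descriptors: near-uniform speeds for a smooth scene,
# stop-and-go speeds for a delayed scene.
templates = {
    "smooth":  [1.0, 0.9, 1.0, 0.95],
    "delayed": [0.2, 0.0, 0.4, 0.1],
}
label = classify_scene([0.15, 0.05, 0.3, 0.1], templates)
```

A low, uneven speed profile lands closer to the stop-and-go template, so the area is classified as a delayed driving scene.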
Further, the calculating the matching relation between the motion data and the scene attribute data includes:
determining a preset condition according to the scene attribute data, and calculating the matching degree between the motion data and the preset condition;
if the matching degree is higher than a decision threshold, determining that the motion data matches the scene attribute data; otherwise, determining that the motion data does not match the scene attribute data.
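The threshold decision above can be sketched directly; the 0.5 threshold and the per-object matching degrees are illustrative assumptions:

```python
def monitor_decision(matching_degrees, threshold=0.5):
    """Map each object ID to a display flag: True when the matching degree
    is at or below the decision threshold, i.e. the motion does not match
    the scene and the object should be monitored/displayed."""
    return {obj_id: degree <= threshold for obj_id, degree in matching_degrees.items()}

# Assumed matching degrees for two surrounding objects.
decisions = monitor_decision({1: 0.9, 2: 0.3}, threshold=0.5)
```

Object 1 matches the scene (degree above threshold) and is suppressed; object 2 does not, and is displayed.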
Further, the determining the preset condition according to the scene attribute data includes:
if the scene attribute data is a smooth driving scene, determining the preset condition as a first preset condition; the first preset condition is related to real-time motion data of the moving object;
if the scene attribute data is a delayed driving scene, determining that the preset condition is a second preset condition; the second preset condition is related to potential motion data of the moving object.
In this embodiment, the scene attributes are divided into smooth driving scenes and delayed driving scenes, and different preset conditions are set for the different scene types. A smooth driving scene is a road section where vehicles can pass smoothly at a reasonable speed, while a delayed driving scene is one in which vehicles frequently stop and go, such as an intersection or a congested non-intersection road section.
In scenes of different attributes, surrounding vehicles pose different kinds of threat to the host vehicle; setting specific preset conditions for each scene type therefore improves the accuracy of identifying dangerous moving objects, as detailed in the following embodiments.
Based on the foregoing scheme, the recognition of the smooth driving scene and the delayed driving scene may be realized by real-time road condition data in the high-precision map, or may be obtained by analyzing the individual motion states of each moving object and the relative motion states of all moving objects including the vehicle in the scene map drawn in real time.
Further, when the scene attribute data is a smooth driving scene, the calculating the matching degree between the motion data and the preset condition includes:
calculating a first abrupt-change evaluation value for each moving object based on the motion data; if the first abrupt-change evaluation value exceeds a first safety range of the first preset condition, setting the matching degree lower than the decision threshold; otherwise, setting the matching degree higher than the decision threshold.
In this embodiment, what deserves attention in a smooth driving scenario is a "real-time abrupt change" in a vehicle's motion as reflected in the motion data. The "real-time abrupt change" of this embodiment covers two cases: 1) the speed of a vehicle clearly differs from the other vehicles in the scene, for example it is too fast (significantly exceeding the speed of the surrounding vehicles, or significantly exceeding the road-segment speed limit); 2) the motion trend of a vehicle differs from the other vehicles in the scene, for example it keeps accelerating (having already accelerated for some time) while the other vehicles hold relatively steady speeds. Vehicles exhibiting a "real-time abrupt change" are identified from the real-time motion data of each moving object, and when a vehicle exceeds the first safety range of the first preset condition, it is judged not to match the current scene.
It should be noted that, although the above is illustrated with the vehicle speed, the parameters relevant to an "abrupt change" may also include a sudden lane change (while other vehicles keep their lanes), sudden line-pressing (while other vehicles do not press the lane markings), sudden weaving (while other vehicles do not weave), and the like, which will not be described in detail.
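One plausible reading of the speed-based "real-time abrupt change" check can be sketched as follows; the evaluation formula, the example speeds and the safety bound are illustrative assumptions, since the patent does not fix a formula:

```python
def first_abrupt_eval(obj_speed, other_speeds, speed_limit):
    """One plausible first abrupt-change evaluation value (m/s): the larger
    of the deviation from the mean speed of the other vehicles and the
    excess over the road-segment speed limit."""
    mean_other = sum(other_speeds) / len(other_speeds)
    return max(abs(obj_speed - mean_other), obj_speed - speed_limit)

def matches_smooth_scene(eval_value, first_safety_range):
    """Motion matches the smooth scene only while the evaluation value
    stays inside the first safety range (modelled here as an upper bound)."""
    return eval_value <= first_safety_range

# A vehicle at 40 m/s among ~23 m/s traffic on a 33.3 m/s road.
e = first_abrupt_eval(40.0, [22.0, 24.0, 23.0], speed_limit=33.3)
matched = matches_smooth_scene(e, first_safety_range=5.0)
```

The 17 m/s deviation from surrounding traffic far exceeds the assumed 5 m/s safety bound, so the vehicle is judged not to match the smooth scene and becomes a display candidate.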
Further, the first safety range is determined by:
determining a first initial safety range associated with the smooth driving scene;
and determining the number and the vehicle speed of the moving objects according to the motion data, determining a first correction coefficient according to the number and the vehicle speed, and correcting the first initial safety range according to the first correction coefficient to obtain the first safety range.
In this embodiment, a first initial safety range of the smooth driving scene is pre-established, and the first initial safety range can comprehensively consider specific road segment types of the smooth driving scene, for example, the speed limit value of the corresponding road segment. Then, a first correction coefficient is determined according to the number of moving objects in the scene and the vehicle speed, and the first initial safety range is corrected by using the first correction coefficient, so that a more targeted first safety range is obtained.
The first correction coefficient is inversely related to the number and the vehicle speed: the more vehicles in the scene and the higher their speed, the smaller the first correction coefficient is set, which narrows the first safety range and thus makes the "abrupt change" judgment more sensitive.
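The inverse relation between the correction coefficient and the (vehicle count, speed) pair admits many functional forms; the hyperbolic form and the constant k below are assumptions chosen only to exhibit the described behaviour:

```python
def first_correction_coefficient(num_objects, mean_speed, k=100.0):
    """Hypothetical inverse relation: the coefficient shrinks as the vehicle
    count and speed grow, capped at 1.0 so sparse traffic keeps the full
    initial safety range."""
    return min(1.0, k / (1.0 + num_objects * mean_speed))

def corrected_first_safety_range(initial_range, num_objects, mean_speed):
    """Scale the first initial safety range by the correction coefficient."""
    return initial_range * first_correction_coefficient(num_objects, mean_speed)

r_sparse = corrected_first_safety_range(10.0, num_objects=2, mean_speed=10.0)
r_dense = corrected_first_safety_range(10.0, num_objects=20, mean_speed=25.0)
```

Sparse, slow traffic keeps the full 10 m/s range; dense, fast traffic shrinks it, tightening the abrupt-change check.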
Further, when the scene attribute data is a delayed driving scene, the calculating the matching degree between the motion data and the preset condition includes:
calculating a second abrupt-change evaluation value for each moving object based on the motion data; if the second abrupt-change evaluation value exceeds a second safety range of the second preset condition, setting the matching degree lower than the decision threshold; otherwise, setting the matching degree higher than the decision threshold.
In this embodiment, what matters more in a delayed driving scenario is a "potential abrupt change" in vehicle motion implied by the motion data. The "potential abrupt change" of this embodiment mainly refers to sudden lane-change behaviour, for example a vehicle suddenly changing lanes or cutting in at a green light at an intersection, and may also include hard braking as a green light is about to end; these are predicted from real-time motion data. The second abrupt-change evaluation value represents the probability that a surrounding vehicle will perform such dangerous behaviour in the delayed driving scene; when this probability exceeds the second safety range, the vehicle's motion data is judged not to match the scene.
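The probability-style second abrupt-change evaluation can be sketched as a logistic combination of simple risk cues; the cues, weights and bias below are illustrative assumptions, not the patent's formula:

```python
import math

def second_abrupt_eval(gap_ratio, speed_diff, time_pressure):
    """Pseudo-probability in (0, 1) that a neighbouring vehicle will make a
    potential abrupt manoeuvre (sudden lane change, cut-in, hard braking).
    Inputs are normalised cues in [0, 1]; weights and bias are illustrative."""
    z = 2.0 * (1.0 - gap_ratio) + 1.5 * speed_diff + 1.0 * time_pressure - 2.0
    return 1.0 / (1.0 + math.exp(-z))

def matches_delayed_scene(prob, second_safety_range):
    """Motion matches the delayed scene while the predicted probability
    stays at or below the second safety range (a scalar probability bound)."""
    return prob <= second_safety_range

# Small gap, large speed difference, green light about to end.
p = second_abrupt_eval(gap_ratio=0.2, speed_diff=0.8, time_pressure=0.9)
matched = matches_delayed_scene(p, second_safety_range=0.6)
```

With all three risk cues high, the predicted probability exceeds the assumed 0.6 bound, so the vehicle is flagged for display.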
As a further aspect of this embodiment, the second safety range is determined by:
determining a second initial safety range associated with the delayed driving scene;
and determining the type of the moving object and its lane data according to the motion data, determining a second correction coefficient according to the type and the lane data, and correcting the second initial safety range with the second correction coefficient to obtain the second safety range.
A second initial safety range for the delayed driving scene is pre-established; it can be related to the specific intersection, such as the lane types and the number of lanes at the intersection. Then, a second correction coefficient is determined according to the type of moving object in the scene and the lane data, and the second initial safety range is corrected with the second correction coefficient to obtain a more targeted second safety range.
The second correction coefficient is related to the type of the moving object and the data of the lane where it is located. For example, when the moving object is a conventional automobile the second correction coefficient is set to a first value, and when it is a motorcycle, an electric bicycle, a bicycle, an elderly mobility scooter or the like it is set to a second value; the first value is larger than the second value, so the second value narrows the second initial safety range more than the first value does. In this way, the second correction coefficient can be adjusted according to the probability that different types of moving objects undergo a potential abrupt change (such as a sudden lane change or cut-in), which in turn adjusts the sensitivity of the potential-abrupt-change judgment.
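A hedged sketch of the type-dependent second correction coefficient; the coefficient table and the 0.8 initial range are invented for illustration, and lane data is omitted for brevity:

```python
# Illustrative coefficient table: object types prone to sudden manoeuvres
# get a smaller coefficient, narrowing the second safety range and making
# the potential-abrupt-change check more sensitive.
SECOND_COEFFICIENT = {
    "car": 1.0,
    "motorcycle": 0.6,
    "electric_bicycle": 0.6,
    "bicycle": 0.6,
    "mobility_scooter": 0.5,
}

def corrected_second_safety_range(initial_range, obj_type):
    """Scale the second initial safety range (here a probability bound) by
    the type-dependent second correction coefficient; unknown types get the
    most conservative coefficient."""
    return initial_range * SECOND_COEFFICIENT.get(obj_type, 0.5)

car_range = corrected_second_safety_range(0.8, "car")
bike_range = corrected_second_safety_range(0.8, "bicycle")
```

A bicycle gets a lower probability bound than a car, so the same predicted risk trips the mismatch judgment sooner for the bicycle.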
It should be noted that the first safety range may be a numerical range, for example, a vehicle speed range; while the second safety range is preferably a specific value, such as a probability value.
Referring to fig. 2, fig. 2 is a schematic diagram of a vehicle periphery monitoring system according to an embodiment of the invention. As shown in fig. 2, a vehicle periphery monitoring system according to an embodiment of the present invention includes an acquisition module (101), a processing module (102), and a storage module (103); the processing module (102) is connected with the acquisition module (101) and the storage module (103);
-said storage module (103) for storing executable computer program code;
the acquisition module (101) is used for acquiring the perception data of the surrounding area of the vehicle and transmitting the perception data to the processing module (102);
-said processing module (102) for executing the method according to any of the preceding claims by invoking said executable computer program code in said storage module (103).
For the specific functions of the vehicle periphery monitoring system of this embodiment, refer to the foregoing embodiments. Since the system adopts all the technical solutions of the foregoing embodiments, it provides at least all the beneficial effects brought by those solutions, which will not be repeated here.
Referring to fig. 3, fig. 3 is an electronic device according to an embodiment of the present invention, including: a memory storing executable program code; a processor coupled to the memory; the processor invokes the executable program code stored in the memory to perform the method as described in the previous embodiment.
The embodiment of the invention also discloses a computer storage medium, and a computer program is stored on the storage medium, and when the computer program is run by a processor, the computer program executes the method according to the previous embodiment.
The processor in the electronic device of the present invention may perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) or a computer program loaded from a memory into a Random Access Memory (RAM). In RAM, various programs and data required for operation can also be stored. The processor, ROM and RAM are connected to each other by a bus. An input/output (I/O) interface is also connected to the bus.
A plurality of components in an electronic device are connected to an I/O interface, comprising: an input unit such as a keyboard, a mouse, etc.; an output unit such as various types of displays, speakers, and the like; a storage unit such as a magnetic disk, an optical disk, or the like; and communication units such as network cards, modems, wireless communication transceivers, and the like. The communication unit allows the device to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor may be any of various general-purpose and/or special-purpose processing components with processing and computing capabilities. Some examples of processors include, but are not limited to, central processing units (CPUs), graphics processing units (GPUs), various specialized artificial intelligence (AI) computing chips, various computing units running machine-learning model algorithms, digital signal processors (DSPs), and any suitable processors, controllers, microcontrollers, and the like. The processor performs the various methods and processes described above, such as the vehicle periphery monitoring method. For example, in some embodiments, the vehicle periphery monitoring method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the memory. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device via the ROM and/or the communication unit. When the computer program is loaded into RAM and executed by the processor, one or more steps of the vehicle periphery monitoring method described above may be performed. Alternatively, in other embodiments, the processor may be configured to perform the vehicle periphery monitoring method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include being implemented in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include local area networks (LANs), wide area networks (WANs), and the Internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that the various forms of flow shown above may be used, with steps reordered, added, or deleted. For example, the steps recited in the present disclosure may be performed in parallel, sequentially, or in a different order, so long as the desired results of the disclosed solutions are achieved; no limitation is imposed herein.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (10)

1. A vehicle periphery monitoring method characterized by comprising the steps of:
acquiring perception data of a surrounding area of a vehicle, and determining motion data of each moving object according to the perception data;
acquiring scene data of an area where a vehicle is located, and determining scene attribute data according to the scene data;
and calculating a matching relation between the motion data and the scene attribute data, and deciding the monitoring display of each moving object according to the matching relation.
2. The vehicle periphery monitoring method according to claim 1, characterized in that the acquiring scene data of the area where the vehicle is located and determining scene attribute data according to the scene data comprises:
performing matching calculation between vehicle positioning data and a high-precision map to determine the scene data of the area where the vehicle is located, and retrieving the scene attribute data corresponding to the scene data from the high-precision map;
and/or,
and drawing a scene map of the vehicle and the moving objects according to the motion data, performing matching calculation between the scene map and a plurality of template scene maps, and determining the scene attribute data according to the matching calculation result.
3. The vehicle periphery monitoring method according to claim 1, characterized in that the calculating the matching relation between the motion data and the scene attribute data comprises:
determining a preset condition according to the scene attribute data, and calculating a matching degree between the motion data and the preset condition;
if the matching degree is higher than a judgment threshold, determining that the motion data matches the scene attribute data; otherwise, determining that the motion data does not match the scene attribute data.
4. The vehicle periphery monitoring method according to claim 3, characterized in that the determining the preset condition according to the scene attribute data comprises:
if the scene attribute data indicates a smooth driving scene, determining the preset condition to be a first preset condition, the first preset condition being related to real-time motion data of the moving object;
if the scene attribute data indicates a delayed driving scene, determining the preset condition to be a second preset condition, the second preset condition being related to potential motion data of the moving object.
5. The vehicle periphery monitoring method according to claim 4, characterized in that, when the scene attribute data is a smooth driving scene, the calculating the matching degree between the motion data and the preset condition comprises:
calculating a first mutation evaluation value of each moving object based on the motion data; if the first mutation evaluation value exceeds a first safety range of the first preset condition, setting the matching degree lower than the judgment threshold; otherwise, setting the matching degree higher than the judgment threshold.
6. The vehicle periphery monitoring method according to claim 5, characterized in that the first safety range is determined by:
determining a first initial safety range associated with the smooth driving scene;
and determining the number of moving objects and the vehicle speed according to the motion data, determining a first correction coefficient according to the number and the vehicle speed, and correcting the first initial safety range with the first correction coefficient to obtain the first safety range.
7. The vehicle periphery monitoring method according to claim 4, characterized in that, when the scene attribute data is a delayed driving scene, the calculating the matching degree between the motion data and the preset condition comprises:
calculating a second mutation evaluation value of each moving object based on the motion data; if the second mutation evaluation value exceeds a second safety range of the second preset condition, setting the matching degree lower than the judgment threshold; otherwise, setting the matching degree higher than the judgment threshold.
8. A vehicle periphery monitoring system, comprising an acquisition module, a processing module, and a storage module, the processing module being connected with the acquisition module and the storage module;
the storage module is used for storing executable computer program code;
the acquisition module is used for acquiring the perception data of the surrounding area of the vehicle and transmitting the perception data to the processing module;
the method is characterized in that: the processing module for performing the method of any of claims 1-7 by invoking the executable computer program code in the storage module.
9. An electronic device, comprising: a memory storing executable program code; and a processor coupled to the memory; characterized in that: the processor invokes the executable program code stored in the memory to perform the method of any one of claims 1 to 7.
10. A computer storage medium having a computer program stored thereon, characterized in that: the computer program, when executed by a processor, performs the method of any one of claims 1 to 7.
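The decision loop of claims 1–7 can be sketched as follows. This is a hypothetical illustration only: the claims do not disclose concrete formulas for the mutation evaluation value, the safety ranges, or the first correction coefficient, so the use of |acceleration| as the evaluation value, the numeric constants, and all function and field names below are assumptions.

```python
from dataclasses import dataclass

@dataclass
class MovingObject:
    object_id: int
    speed: float   # m/s
    accel: float   # m/s^2; |accel| stands in for the "mutation evaluation value"

def mutation_value(obj: MovingObject) -> float:
    # Assumed evaluation: magnitude of acceleration as a proxy for abrupt motion change.
    return abs(obj.accel)

def first_safety_range(num_objects: int, ego_speed: float, base: float = 3.0) -> float:
    # Claim 6: start from an initial safety range and correct it by a coefficient
    # derived from the number of moving objects and the vehicle speed (formula assumed:
    # denser, faster traffic tightens the range).
    correction = 1.0 / (1.0 + 0.05 * num_objects + 0.01 * ego_speed)
    return base * correction

def decide_display(objects, scene: str, ego_speed: float) -> dict:
    # Claims 3-7: the scene-dependent safety range acts as the preset condition.
    # An object whose evaluation value exceeds the range fails the match and is
    # highlighted in the monitoring display; matching objects are shown normally.
    if scene == "smooth":
        limit = first_safety_range(len(objects), ego_speed)
    else:
        # Delayed driving scene; the second safety range is assumed tighter here.
        limit = 0.5 * first_safety_range(len(objects), ego_speed)
    return {
        obj.object_id: ("highlight" if mutation_value(obj) > limit else "normal")
        for obj in objects
    }
```

For example, with two moving objects in a smooth driving scene at an ego speed of 15 m/s, the corrected range is 3.0 / (1 + 0.05·2 + 0.01·15) = 2.4, so an object braking at 5 m/s² would be flagged while one at 0.2 m/s² would not.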
CN202310010068.7A 2023-01-04 2023-01-04 Vehicle periphery monitoring method and system Pending CN116118739A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310010068.7A CN116118739A (en) 2023-01-04 2023-01-04 Vehicle periphery monitoring method and system


Publications (1)

Publication Number Publication Date
CN116118739A true CN116118739A (en) 2023-05-16

Family

ID=86311206




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination