WO2022017385A1 - Vehicle monitoring method and apparatus, vehicle, and storage medium - Google Patents

Vehicle monitoring method and apparatus, vehicle, and storage medium

Info

Publication number
WO2022017385A1
Authority
WO
WIPO (PCT)
Prior art keywords
current vehicle
panoramic
vehicle
target
monitoring
Prior art date
Application number
PCT/CN2021/107382
Other languages
English (en)
Chinese (zh)
Inventor
连桂有
孙连明
李兵
王丽丽
赵秀栋
闫力博
Original Assignee
中国第一汽车股份有限公司
Priority date
Filing date
Publication date
Application filed by 中国第一汽车股份有限公司
Publication of WO2022017385A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 15/93 Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S 15/931 Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S 7/52004 Means for monitoring or calibrating

Definitions

  • the embodiments of the present application relate to the field of vehicle monitoring, for example, to a vehicle monitoring method, device, vehicle, and storage medium.
  • the present application provides a vehicle monitoring method, device, vehicle, and storage medium. By viewing the stored panoramic data, dangerous image data can be obtained, and the party that damaged the vehicle can be identified in time.
  • an embodiment of the present application provides a vehicle monitoring method, including:
  • in response to the current vehicle being in a target monitoring state, performing target monitoring on a preset danger range of the current vehicle based on a radar system of the current vehicle; in response to detecting that a target exists within the preset danger range, generating a panoramic system request message; triggering the panoramic system to perform panoramic acquisition of the current vehicle based on the panoramic system request message; and backing up and storing the panoramic data collected by the panoramic system.
  • an embodiment of the present application further provides a vehicle monitoring device, the device comprising:
  • a target monitoring module configured to perform target monitoring on the preset dangerous range of the current vehicle based on the radar system of the current vehicle in response to the current vehicle being in a target monitoring state;
  • a request message module configured to generate a panorama system request message in response to monitoring that a target exists within the preset danger range;
  • a panorama acquisition module configured to trigger the panorama system to perform panorama acquisition on the current vehicle based on the panorama system request message;
  • the data storage module is configured to backup and store the panoramic data collected by the panoramic system.
  • an embodiment of the present application further provides a vehicle, the vehicle comprising a radar system and a panoramic system; the vehicle also includes:
  • at least one processor; and storage means arranged to store at least one program;
  • when the at least one program is executed by the at least one processor, the at least one processor implements the vehicle monitoring method provided by any embodiment of the present application.
  • an embodiment of the present application further provides a storage medium containing vehicle-executable instructions, on which a computer program is stored, where the computer program, when executed by a processor, implements the vehicle monitoring method provided by any embodiment of the present application.
  • FIG. 1 is a flowchart of a vehicle monitoring method provided in Embodiment 1 of the present application.
  • FIG. 2 is a flowchart of a vehicle monitoring method provided in Embodiment 2 of the present application.
  • FIG. 3 is a schematic structural diagram of a vehicle monitoring device provided in Embodiment 3 of the present application.
  • FIG. 4 is a schematic structural diagram of a vehicle according to Embodiment 4 of the present application.
  • FIG. 1 is a flowchart of a vehicle monitoring method in Embodiment 1 of the present application. This embodiment can be applied to the situation of monitoring the environment in a preset range of the vehicle after the vehicle sleeps.
  • the method can be implemented by the vehicle monitoring device provided by the embodiment of the present application.
  • the vehicle monitoring device may be implemented in software and/or hardware.
  • the vehicle monitoring device may be jointly implemented by an ultrasonic radar system and a panoramic system.
  • the method includes S110 to S140.
  • the current vehicle includes a working state (i.e., working mode) and a sleep monitoring state (i.e., sleep mode).
  • the working state is the state when the current vehicle is in the normal driving mode;
  • the sleep monitoring state is the state when the current vehicle is in the parking mode and the driver is not in the current vehicle.
  • the current state of the vehicle is monitored; when the vehicle sleep monitoring conditions are met, the vehicle enters the target monitoring state, and target monitoring is performed on the preset danger range of the current vehicle based on the radar system of the current vehicle.
  • the radar system in the current vehicle may be composed of a radar controller and several ultrasonic radars.
  • the ultrasonic radars are usually installed on the front and rear bumper assemblies and on the sides of the vehicle body, parallel to the horizontal plane.
  • the composition and installation position of the above-mentioned radar system are only optional embodiments, and can also be set according to actual needs, and the composition and installation position of the radar system are not limited in this embodiment.
  • the target monitoring is carried out within the preset dangerous range of the current vehicle.
  • the ultrasonic radar starts timing when it emits an ultrasonic wave; as the wave propagates through the air and encounters an obstacle, it is reflected back, and when the probe receives the echo it records the feedback (echo) time.
  • from the echo times reported by the probes at different positions and the propagation speed of the ultrasonic wave in air, the radar controller calculates the obstacle information, determines the distance between the target object and the current vehicle, and determines whether the target is within the preset danger range of the current vehicle.
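  • As a minimal illustration of the distance computation described above (the speed of sound is a standard physical value and the 60 cm threshold is the example value from this description; neither is a prescribed implementation), the following sketch converts an echo time into a target distance and checks it against the preset danger range:

```python
# Minimal sketch of the echo-time-to-distance computation described above.
SPEED_OF_SOUND_M_S = 343.0   # approximate speed of sound in air at 20 degrees C
DANGER_RANGE_M = 0.60        # example preset danger range of 60 cm

def echo_time_to_distance(echo_time_s: float) -> float:
    """The wave travels to the obstacle and back, so halve the round-trip path."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0

def target_in_danger_range(echo_time_s: float) -> bool:
    """True if the echo indicates a target inside the preset danger range."""
    return echo_time_to_distance(echo_time_s) < DANGER_RANGE_M

# An echo received 3 ms after emission corresponds to roughly 0.51 m, inside 60 cm.
print(echo_time_to_distance(0.003), target_in_danger_range(0.003))
```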
  • the preset danger range may be, for example, within 60 cm around the circumference of the current vehicle.
  • the target object may include, but is not limited to, people, other vehicles, rocks, and similar objects.
  • the above-mentioned preset danger range can also be set as required, and this embodiment does not limit the setting of the preset danger range.
  • when the radar system monitors the current vehicle according to the preset danger range and detects that a target object has entered the preset danger range of the current vehicle, a panoramic system request message is generated.
  • the panoramic system can be composed of a controller and several wide-angle cameras.
  • the cameras are generally arranged in the front grille, left and right exterior mirrors, rear doors, etc.
  • the real-time images captured by the wide-angle cameras are distortion-corrected and stitched by the controller.
  • the resulting panoramic top view reflects the real environment around the vehicle as a 360-degree bird's-eye view of the vehicle's surroundings, enabling panoramic monitoring of the current vehicle's preset danger range.
  • the request message may be a panorama request instruction triggered by the target object.
  • when the radar system detects that a target object has entered the preset danger range of the current vehicle, it sends a request message for activating the panoramic system; the message indicates that a target object has entered the preset danger range of the current vehicle and requests the panoramic system to start collecting panoramic image data, and the message is then transmitted to the panoramic system.
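  • A hypothetical sketch of such a request message is shown below; the field names and structure are illustrative assumptions rather than the message format defined by the application:

```python
from dataclasses import dataclass
import time

@dataclass
class PanoramaRequestMessage:
    """Illustrative payload sent by the radar controller to activate the panoramic system."""
    target_detected: bool   # a target object has entered the preset danger range
    distance_m: float       # measured distance between the target and the current vehicle
    radar_position: str     # which radar reported the target, e.g. "rear-left"
    timestamp: float        # time at which the detection occurred

msg = PanoramaRequestMessage(target_detected=True, distance_m=0.45,
                             radar_position="rear-left", timestamp=time.time())
print(msg)
```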
  • the panoramic system performs panoramic collection of the current vehicle according to the received request message for activating the panoramic system (i.e., the panoramic system request message in step S130).
  • a pre-placed wide-angle camera can be used for panoramic capture of the current vehicle.
  • the duration of panoramic capture can be preset: when the preset capture time is reached, the panoramic system is automatically turned off to stop capture; alternatively, when it is detected that the distance between the target object and the current vehicle has become greater than the preset danger range, the panoramic system is automatically turned off to stop the acquisition.
  • for example, when the radar system detects that the distance between another vehicle and the current vehicle is less than the preset danger range, it sends a request message to activate the panoramic system, and the panoramic system starts panoramic collection accordingly.
  • when the preset collection time is reached, or when the distance between the other vehicle and the current vehicle becomes greater than the preset danger range, the panoramic system is automatically turned off to stop the collection.
  • the above-mentioned conditions for stopping collection are only optional embodiments, and may also be set according to actual conditions, and the conditions for stopping collection are not limited in this embodiment.
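  • The two stop conditions described above can be combined as in the following sketch, which assumes a polling loop and placeholder callables for reading the target distance and capturing a frame (all of which are assumptions made for illustration):

```python
import time

def collect_panorama(read_target_distance, capture_frame,
                     danger_range_m=0.60, max_duration_s=60.0, poll_s=0.5):
    """Collect panoramic frames until the preset capture time elapses or the
    target moves back outside the preset danger range, whichever happens first."""
    frames = []
    start = time.monotonic()
    while time.monotonic() - start < max_duration_s:      # stop condition 1: preset capture time
        if read_target_distance() > danger_range_m:       # stop condition 2: target left the range
            break
        frames.append(capture_frame())
        time.sleep(poll_s)
    return frames
```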
  • the obtained original panoramic data (original images that have not undergone data processing such as distortion correction, and which therefore retain legal evidentiary value) are backed up and stored.
  • the collected panoramic data can be backed up and stored in the panoramic system, or uploaded to a cloud server for backup storage through the panoramic system.
  • according to the vehicle monitoring method of this embodiment, when the current vehicle is in the target monitoring state, target monitoring is performed on the preset danger range of the current vehicle based on the radar system of the current vehicle; when a target is detected within the preset danger range, a panoramic system request message is generated; based on the panoramic system request message, the panoramic system is triggered to perform panoramic acquisition of the current vehicle; and the panoramic data collected by the panoramic system is backed up and stored.
  • in this way, the panoramic system is activated to collect and store a panoramic view of the vehicle, which makes it convenient to view the panoramic information recorded while the vehicle is in the sleep state; by viewing the stored panoramic data, the information of the party that damaged the vehicle can be discovered and traced in time.
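  • The overall flow of S110 to S140 can be summarized in the following sketch, written against simple duck-typed interfaces for the vehicle, the radar system, the panoramic system, and the backup storage; the interface names are illustrative assumptions, not the claimed implementation:

```python
def vehicle_monitoring_cycle(vehicle, radar_system, panorama_system, cloud_backup):
    """One pass through S110-S140, written against assumed duck-typed interfaces."""
    # S110: only monitor while the current vehicle is in the target (sleep) monitoring state,
    # and perform target monitoring of the preset danger range via the radar system.
    if not vehicle.in_target_monitoring_state():
        return None
    target = radar_system.detect_target_in_danger_range()
    if target is None:
        return None
    # S120: a target exists within the preset danger range, so generate the request message.
    request = radar_system.build_panorama_request(target)
    # S130: trigger the panoramic system to perform panoramic acquisition of the current vehicle.
    panorama_data = panorama_system.collect(request)
    # S140: back up and store the collected panoramic data (e.g. upload it to a cloud server).
    cloud_backup.store(panorama_data)
    return panorama_data
```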
  • FIG. 2 is a flowchart of a vehicle monitoring method in Embodiment 2 of the present application, which is refined on the basis of the foregoing embodiment. As shown in FIG. 2 , the method includes S210 to S260.
  • the current vehicle includes a radar system and a panoramic acquisition system.
  • the radar system consists of a radar controller and several ultrasonic radars.
  • the setting positions of the multiple ultrasonic radars are determined according to the detection parameters of the ultrasonic radars, and the detection angle calibration and detection accuracy calibration of the multiple ultrasonic radars in the radar system are performed according to the detection parameters.
  • two types of ultrasonic radar may be used: the first type is installed on the front and rear bumpers of the vehicle and is configured to measure obstacles in front of and behind the vehicle; the second type is installed on the sides of the vehicle and is configured to measure obstacles to the side.
  • the detection range and detection area of the two types of ultrasonic radar are different: the detection distance of the ultrasonic radar that measures obstacles in front of and behind the vehicle is generally between 15 and 250 cm, while the detection distance of the ultrasonic radar that measures obstacles to the side is generally between 30 and 500 cm.
  • since the number and positions of the ultrasonic radars that can be installed on the vehicle are limited, the ultrasonic radars are arranged according to performance parameters such as their detection distances and detection angles, so that the radar system can monitor the current vehicle over a full 360 degrees.
  • for example, a left front ultrasonic radar, a left rear ultrasonic radar, a right front ultrasonic radar, a right rear ultrasonic radar, a front ultrasonic radar, and a rear ultrasonic radar may be set on the current vehicle.
  • the front ultrasonic radar and the rear ultrasonic radar are of the first type; the left front, left rear, right front, and right rear ultrasonic radars are of the second type.
  • the setting of the above ultrasonic radars is only an optional embodiment; it can also be set according to actual needs, and the setting of the ultrasonic radars is not limited in this embodiment.
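  • One way to represent such a layout is a simple configuration table, as in the sketch below; the positions and ranges are the example values from this description, while the data structure itself is an illustrative assumption:

```python
# Example layout from this description: two front/rear radars (15-250 cm) and four
# side radars (30-500 cm). The dictionary structure itself is purely illustrative.
ULTRASONIC_RADAR_LAYOUT = {
    "front":       {"type": 1, "min_range_cm": 15, "max_range_cm": 250},
    "rear":        {"type": 1, "min_range_cm": 15, "max_range_cm": 250},
    "front-left":  {"type": 2, "min_range_cm": 30, "max_range_cm": 500},
    "rear-left":   {"type": 2, "min_range_cm": 30, "max_range_cm": 500},
    "front-right": {"type": 2, "min_range_cm": 30, "max_range_cm": 500},
    "rear-right":  {"type": 2, "min_range_cm": 30, "max_range_cm": 500},
}

# e.g. look up which radars cover the sides of the vehicle
side_radars = [name for name, cfg in ULTRASONIC_RADAR_LAYOUT.items() if cfg["type"] == 2]
print(side_radars)
```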
  • the detection angle calibration and detection accuracy calibration are carried out for the ultrasonic radars that have been set.
  • for the detection angle calibration, each ultrasonic radar performs multiple detections of the same detection target while moving with the current vehicle and records the detected position data; from the recorded position data, the deviation between the actual installation angle and the preset installation angle is calculated, and the detection angle calibration is completed by correcting for this deviation.
  • the detection accuracy is calibrated according to the difference between the actual time at which the echo is received by the probes in the radar system and the time calculated by the radar controller.
  • the above-mentioned calibration method for the detection angle and detection accuracy of the ultrasonic radar is only an optional embodiment. In fact, other settings may be performed as required, and the calibration method is not limited in the embodiment of the present application.
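  • A hedged sketch of the detection angle calibration idea follows: it averages the deviation between repeatedly measured bearings of the same target and the bearings expected for the preset installation angle. The averaging estimator and the availability of expected bearings are assumptions; the description does not prescribe a specific algorithm.

```python
from statistics import mean

def estimate_installation_angle_offset(measured_bearings_deg, expected_bearings_deg):
    """Estimate the deviation between the actual and the preset installation angle.

    measured_bearings_deg: bearings of the same target reported by the radar at
        several points while the vehicle moves.
    expected_bearings_deg: bearings a radar mounted exactly at the preset installation
        angle would report at those points (assumed to be computed from the known
        vehicle motion and target position; how they are obtained is outside this sketch).
    """
    return mean([m - e for m, e in zip(measured_bearings_deg, expected_bearings_deg)])

def corrected_installation_angle(preset_angle_deg, measured_bearings_deg, expected_bearings_deg):
    """Estimated actual installation angle, usable to correct subsequent bearings."""
    return preset_angle_deg + estimate_installation_angle_offset(
        measured_bearings_deg, expected_bearings_deg)

# Measurements consistently about 1.5 degrees above the expected bearings imply the
# radar is mounted roughly 1.5 degrees away from its preset installation angle.
print(corrected_installation_angle(90.0, [31.6, 44.4, 58.7], [30.0, 43.0, 57.0]))
```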
  • S240 Trigger the panoramic system to collect panoramic views of the current vehicle based on the panoramic system request message.
  • the panoramic raw data collected by the panoramic system is uploaded to a cloud server for backup storage; the panoramic raw data is the panoramic data.
  • the cloud server can perform danger identification on the stored panoramic original data, clip the panoramic data based on the danger identification result to obtain dangerous image data, and transmit the dangerous image data to the current vehicle or to the associated terminal of the current vehicle, so as to prompt the owner of the current vehicle.
  • the danger identification for the panoramic data may consist of identifying, from the panoramic data, the image data or video data in which the current vehicle is damaged.
  • the cloud server can call a danger identification model that has the function of identifying danger, input the obtained panoramic data into the danger identification model, and obtain as output the dangerous image data together with its danger level.
  • the risk identification model can be trained based on historical risk image data of different levels.
  • the panoramic data is clipped according to the obtained dangerous image data, and the dangerous image data obtained after clipping is then transmitted to the current vehicle or the associated terminal of the current vehicle.
  • for example, if the dangerous image data for the current vehicle is the image data collected by the tailgate camera, the tailgate image data is clipped out and transmitted as the dangerous image data.
  • the dangerous image data is image data collected when a target object collides with the current vehicle.
  • the hazard level may be determined according to the damage area and deformation degree of the current vehicle in the panoramic data, and the damage of the current vehicle may be graded based on the damage area and deformation degree corresponding to multiple hazard levels.
  • the identified hazard levels may be classified as minor, moderate, and severe.
  • a minor danger is a slight collision between the target and the vehicle, in which the appearance of the current vehicle is slightly damaged and driving is not affected; a moderate danger is a collision between the target and the current vehicle in which the appearance of the current vehicle is more seriously damaged and driving is affected; a severe danger is a heavy collision between the target object and the current vehicle in which the appearance of the current vehicle is severely damaged and the current vehicle can no longer be driven. According to the output of the danger recognition model, reminders of different levels can be sent to the owner.
  • alternatively, the panoramic data in which the distance between the target object and the vehicle is less than a danger threshold may be clipped as dangerous image data.
  • the danger threshold may be, for example, 10 cm.
  • the time when the target object enters the range in which its distance from the vehicle is less than the danger threshold is used as the starting point of the dangerous image data, and the time when the target object leaves that range is used as the end point of the dangerous image data; the panoramic data is clipped between the starting point and the end point to obtain the dangerous image data.
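  • A minimal sketch of this clipping rule, assuming the panoramic data is available as a list of frames with a parallel list of per-frame target distances (both assumptions made for illustration):

```python
def clip_danger_segment(frames, distances_m, danger_threshold_m=0.10):
    """Return the frames between the moment the target first comes closer than the
    danger threshold (start point) and the moment it moves away again (end point).

    frames: list of panoramic frames (any objects), one per sample.
    distances_m: distance between the target and the vehicle at each sample.
    """
    inside = [d < danger_threshold_m for d in distances_m]
    if not any(inside):
        return []                       # the target never crossed the danger threshold
    start = inside.index(True)          # first sample inside the threshold
    end = len(inside) - 1 - inside[::-1].index(True)  # last sample inside the threshold
    return frames[start:end + 1]

# Example: only the middle samples, where the distance drops below 10 cm, are kept.
print(clip_danger_segment(["f0", "f1", "f2", "f3"], [0.30, 0.08, 0.05, 0.25]))
```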
  • the above-mentioned danger identification method and cutting method are only optional embodiments.
  • the identification method and the cutting method can be set according to actual needs, and this embodiment does not limit the identification method and the cutting method.
  • when a minor danger is identified, the dangerous image data can be transmitted to the current vehicle or the associated terminal of the current vehicle; when a moderate danger is identified, the dangerous image data can be automatically transmitted to the current vehicle or the associated terminal of the current vehicle; when a severe danger is identified, the dangerous image data can be automatically transmitted to the current vehicle or the associated terminal of the current vehicle, and the display device in the current vehicle or in the associated terminal of the current vehicle issues a mandatory page pop-up reminder. Of course, the above classification and reminder methods are only optional embodiments; they can be set according to actual needs, which is not limited in this embodiment.
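  • The tiered reminder logic can be sketched as follows; the level names come from this description, while the notification callables are placeholders rather than an actual vehicle or terminal API:

```python
def remind_owner(danger_level, danger_clip, transmit, force_popup):
    """Dispatch the dangerous image data according to the identified danger level.

    transmit(clip, auto) and force_popup(clip) are placeholder callables standing in
    for the vehicle's or associated terminal's actual notification interfaces.
    """
    if danger_level == "minor":
        transmit(danger_clip, auto=False)   # the clip can be transmitted to the vehicle/terminal
    elif danger_level == "moderate":
        transmit(danger_clip, auto=True)    # the clip is transmitted automatically
    elif danger_level == "severe":
        transmit(danger_clip, auto=True)    # transmitted automatically ...
        force_popup(danger_clip)            # ... plus a mandatory page pop-up reminder

remind_owner("severe", "clip_001",
             transmit=lambda clip, auto: print("transmit", clip, "auto" if auto else "manual"),
             force_popup=lambda clip: print("POPUP", clip))
```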
  • in the embodiment of the present application, a radar system is provided, and target monitoring is performed on the preset danger range of the current vehicle based on the radar system of the current vehicle; when a target is detected in the preset danger range, a panoramic system request message is generated; the panoramic system is triggered to collect panoramic views of the current vehicle based on the panoramic system request message; the panoramic data collected by the panoramic system is uploaded to the cloud server for storage; danger identification is performed on the collected panoramic data, the data is clipped based on the danger identification result to obtain dangerous image data, and the dangerous image data is transmitted to the current vehicle or an associated terminal of the current vehicle, so as to prompt the owner of the current vehicle.
  • in this way, the vehicle owner can obtain the dangerous image data conveniently and quickly, and discover and trace the damaging party in time, thereby avoiding property loss for the owner.
  • FIG. 3 is a schematic structural diagram of a vehicle monitoring device in Embodiment 3 of the present application. As shown in FIG. 3, the device includes:
  • the target monitoring module 310 is configured to perform target monitoring on the preset dangerous range of the current vehicle based on the radar system of the current vehicle when the current vehicle is in a target monitoring state;
  • the request message module 320 is configured to generate a panoramic system request message if a target is detected within the preset danger range;
  • the panorama acquisition module 330 is configured to trigger the panorama system to perform panorama acquisition on the current vehicle based on the panorama system request message;
  • the data storage module 340 is configured to backup and store the panoramic data collected by the panoramic system.
  • the target monitoring module 310 includes:
  • a distance calculation unit configured to calculate the distance between the target and the current vehicle according to the feedback (echo) time of the target received by the radar system;
  • a distance judging unit configured to judge whether the distance between the target and the current vehicle is within the preset danger range of the current vehicle.
  • the panoramic collecting module 330 includes a first panoramic collecting unit or a second panoramic collecting unit;
  • the first panorama acquisition unit is configured to control the panorama system to perform panorama acquisition in a preset time period
  • the second panorama acquisition unit is configured to control the panorama system to perform panorama acquisition until the target within the preset danger range leaves the preset danger range.
  • the data storage module 340 includes:
  • the data transmission unit is configured to transmit the panoramic data collected by the panoramic system to a cloud server, wherein the cloud server is configured to perform backup processing on the panoramic data.
  • the cloud server is further configured to perform danger identification on the panoramic data, clip the panoramic data based on the danger identification result to obtain dangerous image data, and transmit the dangerous image data to the current vehicle or the associated terminal of the current vehicle.
  • the device further includes:
  • a working mode monitoring module configured to monitor the working mode of the current vehicle, wherein the working mode of the current vehicle includes a normal mode and a sleep mode;
  • the current vehicle circumferential direction is provided with a plurality of ultrasonic radars, and the plurality of ultrasonic radars form the radar system, wherein the setting position of each ultrasonic radar is determined according to the detection parameter of each ultrasonic radar;
  • the device also includes:
  • the calibration module is configured to perform detection angle calibration and detection accuracy calibration for each ultrasonic radar in the radar system.
  • with the vehicle monitoring device of this embodiment, when the current vehicle is in the target monitoring state, target monitoring is performed on the preset danger range of the current vehicle based on the radar system of the current vehicle; when a target is detected within the preset danger range, a panoramic system request message is generated; based on the panoramic system request message, the panoramic system is triggered to perform panoramic acquisition of the current vehicle; and the panoramic data collected by the panoramic system is backed up and stored.
  • in this way, the panoramic system is activated to collect and store a panoramic view of the vehicle, which makes it convenient to view the panoramic information recorded in the vehicle sleep monitoring state and to discover and trace the information of the party that damaged the vehicle in time.
  • FIG. 4 is a schematic structural diagram of a vehicle in Embodiment 4 of the present application.
  • FIG. 4 shows a block diagram of an exemplary vehicle 412 suitable for use in implementing embodiments of the present application.
  • the vehicle 412 shown in FIG. 4 is only an example, and should not impose any limitations on the functions and scope of use of the embodiments of the present application.
  • vehicle 412 takes the form of a general purpose computing device.
  • Components of vehicle 412 may include, but are not limited to, a radar system 410, a panoramic system 411, at least one processor or processing unit 416, a system memory 428, and a bus 418 connecting the various system components (including the system memory 428 and the processing unit 416).
  • the bus 418 represents at least one of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a graphics acceleration port, a processor, or a local bus using any of a variety of bus structures.
  • these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
  • Vehicle 412 typically includes a variety of computer system readable media. These media can be any available media that can be accessed by the vehicle 412, including volatile and non-volatile media, removable and non-removable media.
  • System memory 428 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 430 and/or cache memory 432 .
  • Vehicle 412 may include other removable/non-removable, volatile/non-volatile computer system storage media.
  • storage system 434 may be used to read from and write to non-removable, non-volatile magnetic media (not shown in FIG. 4, commonly referred to as a "hard drive").
  • a magnetic disk drive for reading from and writing to removable non-volatile magnetic disks (e.g., "floppy disks") and removable non-volatile optical disks (e.g., Compact Disc Read-Only Memory (CD-ROM)) may also be provided.
  • Memory 428 may include at least one program product having a set (eg, at least one) of program modules configured to perform the functions of various embodiments of the present application.
  • a program/utility 440 having a set (at least one) of program modules 442 may be stored, for example, in memory 428; such program modules 442 include, but are not limited to, an operating system, at least one application program, other program modules, and program data, and each of these examples, or some combination thereof, may include an implementation of a network environment.
  • Program modules 442 generally perform the functions and/or methods of the embodiments described herein.
  • the vehicle 412 may also communicate with at least one external device 414 (e.g., a keyboard, a pointing device, a display 424, etc.), with at least one device that enables a user to interact with the vehicle 412, and/or with any device (e.g., a network card, a modem, etc.) that enables the vehicle 412 to communicate with at least one other computing device. Such communication may take place through an Input/Output (I/O) interface 422. Also, the vehicle 412 may communicate with at least one network (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via a network adapter 420. As shown, the network adapter 420 communicates with the other modules of the vehicle 412 via the bus 418.
  • other hardware and/or software modules may be used in conjunction with vehicle 412, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Array of Independent Disks (RAID) systems, tape drives, and data backup storage systems.
  • the processing unit 416 executes various functional applications and data processing by running the programs stored in the system memory 428, for example, to implement the vehicle monitoring method provided by the embodiments of the present application, and the method includes:
  • in response to the current vehicle being in a target monitoring state, performing target monitoring on the preset danger range of the current vehicle based on the radar system of the current vehicle; in response to detecting that a target exists within the preset danger range, generating a panoramic system request message; triggering the panoramic system to perform panoramic acquisition of the current vehicle based on the panoramic system request message; and backing up and storing the panoramic data collected by the panoramic system.
  • the fifth embodiment of the present application further provides a computer-readable storage medium containing vehicle-executable instructions, on which a computer program is stored.
  • when the program is executed by a processor, the vehicle monitoring method provided by the embodiments of the present application is implemented.
  • the method includes: in response to the current vehicle being in a target monitoring state, performing target monitoring on the preset danger range of the current vehicle based on the radar system of the current vehicle; in response to detecting that a target exists within the preset danger range, generating a panoramic system request message; triggering the panoramic system to perform panoramic acquisition of the current vehicle based on the panoramic system request message; and backing up and storing the panoramic data collected by the panoramic system.
  • the computer storage medium containing the vehicle-executable instructions of the embodiments of the present application may adopt any combination of at least one computer-readable medium.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • the computer-readable storage medium can be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a combination of any of the above.
  • a computer-readable storage medium can be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a propagated data signal in baseband or as part of a carrier wave, with computer-readable program code embodied thereon. Such propagated data signals may take a variety of forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a computer-readable signal medium can also be any computer-readable medium other than a computer-readable storage medium that can transmit, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any suitable medium, including but not limited to wireless, wire, optical fiber cable, radio frequency (RF), etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out the operations of the present application may be written in at least one programming language, or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or vehicle.
  • the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (e.g., through the Internet using an Internet service provider).

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to a vehicle monitoring method and apparatus, a vehicle, and a storage medium. The method comprises the following steps: in response to the current vehicle (412) being in a target monitoring state, performing target detection within a preset danger range of the current vehicle (412) on the basis of a radar system (410) of the current vehicle (412) (S110); in response to detecting that a target exists within the preset danger range, generating a panoramic system request message (S120); on the basis of the panoramic system request message, triggering a panoramic system (411) to perform panoramic collection on the current vehicle (412) (S130); and backing up and storing panoramic data collected by the panoramic system (411) (S140).
PCT/CN2021/107382 2020-07-20 2021-07-20 Procédé et appareil de surveillance de véhicule, véhicule et support de stockage WO2022017385A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010697851.1 2020-07-20
CN202010697851.1A CN111856475A (zh) 2020-07-20 2020-07-20 一种车辆监控方法、装置、车辆及存储介质

Publications (1)

Publication Number Publication Date
WO2022017385A1 true WO2022017385A1 (fr) 2022-01-27

Family

ID=73002109

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/107382 WO2022017385A1 (fr) 2020-07-20 2021-07-20 Procédé et appareil de surveillance de véhicule, véhicule et support de stockage

Country Status (2)

Country Link
CN (1) CN111856475A (fr)
WO (1) WO2022017385A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114866701A (zh) * 2022-06-08 2022-08-05 江铃汽车股份有限公司 一种540°全景影像下线配置系统及方法

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111856475A (zh) * 2020-07-20 2020-10-30 中国第一汽车股份有限公司 一种车辆监控方法、装置、车辆及存储介质
CN112966543B (zh) * 2020-12-24 2022-07-08 浙江吉利控股集团有限公司 一种车辆刮蹭记录方法及装置
CN114697511A (zh) * 2020-12-25 2022-07-01 广州汽车集团股份有限公司 一种车辆控制方法及系统
CN112937445B (zh) * 2021-03-25 2022-08-12 深圳安智物联科技有限公司 360°车辆安全辅助方法及车载系统
CN114120561A (zh) * 2021-11-09 2022-03-01 中国第一汽车股份有限公司 一种提醒方法、装置、设备及存储介质
CN115333938B (zh) * 2022-07-19 2024-03-26 岚图汽车科技有限公司 一种车辆安全防护控制方法及相关设备
CN115158153A (zh) * 2022-07-21 2022-10-11 重庆长安汽车股份有限公司 车辆的自主泊车车外交互方法及装置

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104149692A (zh) * 2014-07-11 2014-11-19 广州广汽长和汽车科技有限公司 一种带雷达辅助的汽车全景监视系统
WO2015013311A1 (fr) * 2013-07-22 2015-01-29 Johnson Controls Technology Company Système d'imagerie de véhicule
CN107150658A (zh) * 2017-04-05 2017-09-12 吉利汽车研究院(宁波)有限公司 一种用于车辆防盗报警的监控方法和监控系统
CN107976989A (zh) * 2017-10-25 2018-05-01 中国第汽车股份有限公司 全方位车载智能安全监控系统及监控方法
CN208164888U (zh) * 2018-01-12 2018-11-30 何辉 车辆智能监控系统
US20190080180A1 (en) * 2015-09-25 2019-03-14 Apple Inc. Automated capture of image data for points of interest
DE102018009563A1 (de) * 2018-12-05 2019-07-04 Daimler Ag Verfahren zum Anzeigen einer Umgebung eines Kraftfahrzeugs, Computerprogrammprodukt sowie Kraftfahrzeug
CN111064921A (zh) * 2018-10-17 2020-04-24 上海博泰悦臻网络技术服务有限公司 车辆监控方法、系统及监控终端
CN111856475A (zh) * 2020-07-20 2020-10-30 中国第一汽车股份有限公司 一种车辆监控方法、装置、车辆及存储介质

Also Published As

Publication number Publication date
CN111856475A (zh) 2020-10-30

Similar Documents

Publication Publication Date Title
WO2022017385A1 (fr) Procédé et appareil de surveillance de véhicule, véhicule et support de stockage
CN106663379B (zh) 用于具有拖车的车辆的增强的盲点检测
CN108528443B (zh) 车辆及其防剐蹭方法、系统、处理器
CN107856671B (zh) 一种通过行车记录仪进行路况识别的方法和系统
CN110203196B (zh) 斜坡自动泊车方法、电子设备及汽车
WO2020140410A1 (fr) Procédé et dispositif de détection de place de stationnement
US20120286974A1 (en) Hit and Run Prevention and Documentation System for Vehicles
CN110126820A (zh) 自动泊车系统、泊车方法及车辆
CN110588639B (zh) 自动泊车控制方法、电子设备及汽车
CN111114537B (zh) 自动泊车倒车入库控制方法、电子设备及汽车
US20140168435A1 (en) Standby virtual bumper for parked vehicle protection
CN110304051B (zh) 自动泊车推荐方法、电子设备及汽车
CN110949257A (zh) 机动车辅助泊车装置及方法
CN110228466A (zh) 汽车受控泊车方法、电子设备、汽车及移动终端
WO2018177702A1 (fr) Système et procédé d'aide au stationnement et véhicule équipé du système
CN113071418A (zh) 车辆的后视镜控制方法、装置、电子设备及存储介质
CN110588637B (zh) 自动泊车控制方法、电子设备及汽车
CN210348799U (zh) 一种汽车测距测速的同步拍照系统
CN113240940A (zh) 汽车提醒监控方法、电子设备及存储介质
CN108664695B (zh) 模拟车辆事故系统及其应用
CN114715031A (zh) 车辆倒车的控制方法、装置、系统及介质
CN206107108U (zh) 一种彩色字符倒车雷达后视系统
CN113997930B (zh) 一种自动泊车方法、电子设备及汽车
WO2024108360A1 (fr) Procédé d'indication d'informations, procédé de traitement d'informations, appareil associé et système
CN219302678U (zh) 车辆避障装置及车辆避障设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21845414

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21845414

Country of ref document: EP

Kind code of ref document: A1