CN116142181A - Identification system, method and computer readable storage medium for vehicle blind area obstacle - Google Patents


Info

Publication number
CN116142181A
CN116142181A (application CN202310444797.3A)
Authority
CN
China
Prior art keywords
vehicle
distance
module
gate
blind area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310444797.3A
Other languages
Chinese (zh)
Other versions
CN116142181B (en)
Inventor
陈峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Klyde Electronics Co ltd
Original Assignee
Shenzhen Klyde Electronics Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Klyde Electronics Co ltd filed Critical Shenzhen Klyde Electronics Co ltd
Priority to CN202310444797.3A priority Critical patent/CN116142181B/en
Publication of CN116142181A publication Critical patent/CN116142181A/en
Application granted granted Critical
Publication of CN116142181B publication Critical patent/CN116142181B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08: Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095: Predicting travel path or likelihood of collision
    • B60W30/0956: Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001: Details of the control system
    • B60W2050/0002: Automatic control, details of type of controller or control system architecture
    • B60W2050/0004: In digital systems, e.g. discrete-time systems involving sampling
    • B60W2050/0005: Processor details or data handling, e.g. memory registers or chip architecture
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00: Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40: Control techniques providing energy savings, e.g. smart controller or presence detection
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00: Reducing energy consumption in communication networks
    • Y02D30/70: Reducing energy consumption in communication networks in wireless communication networks

Abstract

The present application relates to the field of vehicle technologies, and in particular to a system, a method, and a computer-readable storage medium for identifying blind-area obstacles of a vehicle. The system includes a control module, an environment sensing module, a communication module and a ranging module. The control module is connected with the environment sensing module and receives the current ambient brightness and the orientation range of the user vehicle sent by the environment sensing module. The control module is connected with the communication module; after the detection distance threshold corresponding to the current ambient brightness has been sent through the communication module to a target external device within the orientation range, the control module receives a first distance sent by that device. The control module is also connected with the ranging module; upon receiving the first distance, it receives from the ranging module a second distance between the user vehicle and the target external device, and superimposes the first distance on the second distance so as to identify the blind-area obstacle. The application aims to improve the accuracy with which intelligently driven vehicles identify blind-area obstacles.

Description

Identification system, method and computer readable storage medium for vehicle blind area obstacle
Technical Field
The present application relates to the field of vehicle technologies, and in particular, to a system and a method for identifying a blind area obstacle of a vehicle, and a computer readable storage medium.
Background
With the continuous development of communication technology and the automotive industry, intelligent driving vehicles have become a development focus for major vehicle manufacturers, and users accordingly place higher demands on the safety of intelligent driving vehicles.
Most existing vehicle obstacle recognition systems use a camera to detect dangerous situations around the vehicle. A camera, however, is constrained by its own range and by the environment (for example, bad weather or shadowed areas) and can only identify obstacles within a certain distance of the vehicle; obstacles beyond the camera's field of view cannot be accurately identified, so the driver cannot be reminded promptly and accurately, cannot take evasive action in a short time, and traffic accidents easily occur.
That is, how to improve the accuracy with which intelligently driven vehicles identify blind-area obstacles is a technical problem that currently needs to be solved.
Disclosure of Invention
The main object of the present application is to provide a system, a method and a computer-readable storage medium for identifying blind-area obstacles of a vehicle, aiming to improve the accuracy with which intelligently driven vehicles identify blind-area obstacles.
To achieve the above object, the present application provides an identification system of a vehicle blind area obstacle, the identification system of a vehicle blind area obstacle including: the system comprises a control module, an environment sensing module, a communication module and a ranging module;
the control module is connected with the environment sensing module and is used for receiving the current environment brightness and the orientation range of the user vehicle sent by the environment sensing module;
the control module is connected with the communication module and is used for receiving a first distance sent by the target external device after the detection distance threshold corresponding to the current ambient brightness has been sent through the communication module to the target external device within the orientation range, wherein the first distance refers to the distance from the target external device to a blind-area obstacle, acquired by the target external device within the detection distance threshold, and the target external device refers to any one of the other vehicles, or items of infrastructure, that establish a communication connection with the user vehicle through the communication module;
the control module is connected with the ranging module and is used for receiving, when the first distance is received, a second distance between the user vehicle and the target external device sent by the ranging module, and superimposing the first distance on the second distance so as to identify the blind-area obstacle.
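The superposition described above can be sketched in code. This is an illustrative sketch only, not the patent's implementation: the class and function names are assumptions, and both distances are treated as planar displacement vectors in the user vehicle's coordinate frame.

```python
# Hedged sketch: superimpose the first distance (external device -> obstacle)
# on the second distance (user vehicle -> external device) to place the
# blind-area obstacle relative to the user vehicle.
from dataclasses import dataclass


@dataclass
class Vector2D:
    x: float  # lateral offset in metres
    y: float  # longitudinal offset in metres

    def __add__(self, other: "Vector2D") -> "Vector2D":
        return Vector2D(self.x + other.x, self.y + other.y)


def locate_blind_area_obstacle(second: Vector2D, first: Vector2D) -> Vector2D:
    """Superimpose the first distance onto the second distance to obtain
    the obstacle's coordinates relative to the user vehicle."""
    return second + first


# Example: external device 30 m ahead, obstacle 10 m to its left.
obstacle = locate_blind_area_obstacle(Vector2D(0.0, 30.0), Vector2D(-10.0, 0.0))
```

With the obstacle's coordinates known, the system can widen the vehicle's effective visible range beyond the camera's field of view.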
Optionally, the environment sensing module includes a motion detection unit; the control module is connected with the motion detection unit and is used for receiving the orientation range sent by the motion detection unit, wherein the motion detection unit includes an accelerometer and a gyroscope.
Optionally, the current ambient brightness includes: a low-level signal output by a brightness acquisition unit in the environment sensing module;
the brightness acquisition unit includes: a first photosensitive sensor subunit, a second photosensitive sensor subunit and an OR gate, wherein the first photosensitive sensor subunit is arranged on the right rear-view mirror cover of the user vehicle, and the second photosensitive sensor subunit is arranged on the left rear-view mirror cover of the user vehicle;
the first photosensitive sensor subunit is connected with a first input end of the OR gate, the second photosensitive sensor subunit is connected with a second input end of the OR gate, and an output end of the OR gate is connected with a logic control unit in the control module.
Optionally, the first photosensitive sensor subunit includes: the first photosensitive resistor, the first resistor, the second resistor and the first operational amplifier;
the first end of the first resistor is connected with a power supply terminal, the second end of the first resistor is connected with the first end of the first photoresistor, and the second end of the first photoresistor is connected with a ground terminal; the junction between the second end of the first resistor and the first end of the first photoresistor is connected with the non-inverting input of the first operational amplifier, and the inverting input of the first operational amplifier is connected with the output of the first operational amplifier through the second resistor; the output of the first operational amplifier is connected with the first input end of the OR gate, or the output of the first operational amplifier is connected with the second input end of the OR gate;
The second photosensitive sensor subunit includes: the second photoresistor, the third resistor, the fourth resistor and the second operational amplifier;
the first end of the third resistor is connected with a power supply terminal, the second end of the third resistor is connected with the first end of the second photoresistor, and the second end of the second photoresistor is connected with a ground terminal; the junction between the second end of the third resistor and the first end of the second photoresistor is connected with the non-inverting input of the second operational amplifier, and the inverting input of the second operational amplifier is connected with the output of the second operational amplifier through the fourth resistor; the output of the second operational amplifier is connected with the first input end of the OR gate, or the output of the second operational amplifier is connected with the second input end of the OR gate.
Optionally, the logic control unit is connected with the communication module, where the logic control unit includes: a selector, a NOT gate, an AND gate and a buffer;
the output end of the OR gate is connected with the input end of the NOT gate, the output end of the NOT gate is connected with the first input end of the AND gate through the selector, the motion detection unit is connected with the second input end of the AND gate, and the output end of the AND gate is connected with the communication module through the buffer.
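The gate chain above can be simulated with simple boolean logic. The sketch below is a hedged illustration (function name and boolean conventions are assumptions, and the selector's level-selection role is omitted for brevity): the OR gate's output is inverted by the NOT gate, gated by the motion-detection signal in the AND gate, and buffered unchanged to the communication module.

```python
# Hedged sketch of the logic-control chain: NOT gate -> AND gate -> buffer.
def logic_control(or_gate_out: bool, motion_detected: bool) -> bool:
    not_out = not or_gate_out              # NOT gate inverts the brightness signal
    and_out = not_out and motion_detected  # AND gate also requires vehicle motion
    return and_out                         # buffer passes the level on unchanged


# A low brightness-unit output (False) combined with detected motion
# produces a high level toward the communication module.
```

This models why the communication module is only driven when the brightness unit reports a low level and the motion detection unit reports activity at the same time.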
Optionally, the selector further comprises: a common terminal, a control terminal, a low level terminal and a high level terminal;
the control terminal is connected with the output end of the NOT gate, the common terminal is connected with a micro-control processor in the control module, and the first input end of the AND gate is connected with the low-level terminal and the high-level terminal respectively.
In addition, to achieve the above object, the present application further provides a method for identifying a blind area obstacle of a vehicle, where the method for identifying a blind area obstacle of a vehicle is applied to the system for identifying a blind area obstacle of a vehicle of any one of the above, and the method for identifying a blind area obstacle of a vehicle includes the steps of:
receiving the current environment brightness and the orientation range of a user vehicle sent by an environment sensing module, and acquiring a detection distance threshold corresponding to the current environment brightness;
after it is determined that the detection distance threshold has been sent through a communication module to a target external device within the orientation range, receiving a first distance sent by the target external device, wherein the first distance refers to the distance from the target external device to a blind-area obstacle, acquired by the target external device within the detection distance threshold, and the target external device refers to any one of the other vehicles, or items of infrastructure, that establish a communication connection with the user vehicle through the communication module;
and when the first distance is received, receiving a second distance between the user vehicle and the target external device sent by the ranging module, and superimposing the first distance on the second distance so as to identify the blind-area obstacle.
Optionally, the current ambient brightness further includes: image information collected by an image sensor in the environment sensing module;
the step of obtaining the detection distance threshold corresponding to the current environment brightness comprises the following steps:
acquiring an environment category characteristic value corresponding to the image information, and searching an RGB color proportion pointed by the image information in a preset RGB color proportion table;
and calculating the environment category characteristic value and the RGB color proportion according to a preset image characteristic extraction algorithm to obtain an environment attribute of the image information, and determining a detection distance threshold corresponding to the environment attribute.
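The patent does not disclose the concrete image feature extraction algorithm, the RGB color proportion table, or the numeric thresholds, so the sketch below is a made-up stand-in that only shows the shape of the final lookup: an environment attribute derived from the image maps to a detection distance threshold.

```python
# Illustrative only: attribute names and threshold values are assumptions,
# not values from the patent.
DETECTION_THRESHOLDS = {
    "bright_day": 150.0,  # metres the external device should search
    "overcast": 100.0,
    "night": 60.0,
}


def detection_threshold(environment_attribute: str) -> float:
    """Return the detection distance threshold for an environment attribute."""
    return DETECTION_THRESHOLDS[environment_attribute]
```

In the full system the attribute itself would come from combining the environment category characteristic value with the RGB color proportion, as described above.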
Optionally, the step of acquiring the environmental category characteristic value corresponding to the image information further includes:
determining gray values corresponding to a plurality of pixel points in the image information;
mapping each gray value into a gray threshold range corresponding to each gray value to obtain the total number of pixel points corresponding to each gray threshold range;
and taking the gray threshold range with the largest total number of pixel points as the environment category characteristic value of the image information.
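The three steps above amount to a gray-level histogram vote, which can be sketched as follows. The bin width is an assumption; the patent does not specify how the gray threshold ranges are partitioned.

```python
# Minimal sketch: bin each pixel's gray value into a gray-threshold range,
# count pixels per bin, and return the fullest bin as the environment
# category characteristic value.
from collections import Counter

BIN_WIDTH = 64  # assumed: four ranges 0-63, 64-127, 128-191, 192-255


def environment_category(gray_values: list[int]) -> range:
    counts = Counter(v // BIN_WIDTH for v in gray_values)
    fullest = max(counts, key=counts.get)  # bin holding the most pixels
    return range(fullest * BIN_WIDTH, (fullest + 1) * BIN_WIDTH)


# Mostly dark pixels -> the 0-63 range is the characteristic value.
cat = environment_category([10, 20, 30, 200, 40])
```

A real implementation would feed in all pixels of the captured frame rather than a short list, but the voting logic is the same.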
In addition, in order to achieve the above object, the present application further provides a computer readable storage medium, on which an identification program of a blind area obstacle of a vehicle is stored, which when executed by a processor, implements the steps of the above-described identification method of a blind area obstacle of a vehicle.
In this application, the identification system of the vehicle blind-area obstacle includes: a control module, an environment sensing module, a communication module and a ranging module. The control module is connected with the environment sensing module and receives the current ambient brightness and the orientation range of the user vehicle sent by the environment sensing module. The control module is connected with the communication module and receives a first distance sent by the target external device after the detection distance threshold corresponding to the current ambient brightness has been sent through the communication module to the target external device within the orientation range, wherein the first distance refers to the distance from the target external device to the blind-area obstacle, acquired by the target external device within the detection distance threshold, and the target external device refers to any one of the other vehicles, or items of infrastructure, that establish a communication connection with the user vehicle through the communication module. The control module is connected with the ranging module and receives, when the first distance is received, a second distance between the user vehicle and the target external device sent by the ranging module, and superimposes the first distance on the second distance so as to identify the blind-area obstacle.
Unlike the traditional vehicle obstacle recognition approach, after receiving the current ambient brightness and the orientation range of the user vehicle, the control module of this application sends, via its connection to the communication module, the detection distance threshold corresponding to the current ambient brightness to the target external device within the orientation range. The target external device thus serves as the user vehicle's eyes for quickly identifying blind-area obstacles: within the detection distance threshold it quickly and accurately determines its own distance to the blind-area obstacle (the first distance). When the first distance is received through the communication module, the control module, via its connection to the ranging module, receives the second distance between the user vehicle and the target external device sent by the ranging module, and superimposes the first distance on the second distance so as to accurately widen the user vehicle's visible range. This avoids the technical problem in the prior art that blind-area obstacles are poorly identified because of the limits of the camera's field of view; obstacles can then be identified accurately and effectively, and the safety of intelligent driving vehicles is effectively improved.
Drawings
FIG. 1 is a block diagram of a first embodiment of an identification system for a blind spot obstacle of a vehicle of the present application;
FIG. 2 is a schematic circuit diagram of a brightness acquisition unit according to one embodiment of the identification system for a blind area obstacle of a vehicle of the present application;
FIG. 3 is a schematic circuit diagram of a control module involved in one embodiment of an identification system for a blind area obstacle of a vehicle of the present application;
FIG. 4 is a block diagram of a selector structure involved in one embodiment of an identification system for a blind area obstacle of a vehicle of the present application;
FIG. 5 is a flowchart of a second embodiment of the method for identifying a vehicle blind-area obstacle, applied to the system for identifying a vehicle blind-area obstacle of the present application;
fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present application.
Reference numerals illustrate:
10: Control module; 20: Environment sensing module; 30: Communication module; 40: Ranging module; 50: Target external device; R1: First resistor; R2: Second resistor; R3: Third resistor; R4: Fourth resistor; LDR1: First photoresistor; LDR2: Second photoresistor; U1: First operational amplifier; U2: Second operational amplifier.
The realization, functional characteristics and advantages of the present application will be further described with reference to the embodiments, referring to the attached drawings.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that where directional indications (such as up, down, left, right, front and rear) are referred to in the embodiments of the present application, the directional indications are merely used to explain the relative positional relationships, movement conditions and the like between the components in a specific posture (as shown in the drawings); if the specific posture changes, the directional indications change correspondingly.
In addition, if there is a description of "first", "second", etc. in the embodiments of the present application, the description of "first", "second", etc. is for descriptive purposes only and is not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In addition, the technical solutions of the embodiments may be combined with each other, but it is necessary to base that the technical solutions can be realized by those skilled in the art, and when the technical solutions are contradictory or cannot be realized, the combination of the technical solutions should be regarded as not exist and not within the protection scope of the present application.
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application.
An embodiment of the present application provides a vehicle blind area obstacle identification system, and referring to fig. 1, fig. 1 is a block diagram of a first embodiment of a vehicle blind area obstacle identification system according to the present application. The identification system of the blind area obstacle of the vehicle comprises: the system comprises a control module 10, an environment sensing module 20, a communication module 30 and a ranging module 40;
the control module 10 is connected with the environment sensing module 20, and is configured to receive the current environment brightness and the direction range of the user vehicle sent by the environment sensing module 20;
the control module 10 is connected to the communication module 30 and is configured to receive a first distance sent by the target external device 50 after the detection distance threshold corresponding to the current ambient brightness has been sent through the communication module 30 to the target external device 50 within the orientation range, where the first distance refers to the distance from the target external device 50 to a blind-area obstacle, acquired by the target external device 50 within the detection distance threshold, and the target external device 50 refers to any one of the other vehicles, or items of infrastructure, that establish a communication connection with the user vehicle through the communication module 30;
The control module 10 is connected to the ranging module 40 and is configured to receive, when the first distance is received, a second distance between the user vehicle and the target external device 50 sent by the ranging module 40, and to superimpose the first distance on the second distance so as to identify the blind-area obstacle.
In a specific embodiment, through its connection with the environment sensing module 20, the control module 10 receives the current ambient brightness and the orientation range of the user vehicle. It then sends the detection distance threshold corresponding to the current ambient brightness, through the communication module 30, to the target external device 50 within the orientation range. The target external device 50 quickly searches for a blind-area obstacle within the detection distance threshold and sends its distance to that obstacle, as the first distance, back to the control module 10 through the communication module 30. Upon receiving the first distance, the control module 10 receives from the ranging module 40 the second distance between the user vehicle and the target external device 50 and superimposes the first distance on the second distance. The user vehicle's visible range is thereby widened accurately (that is, the coordinate information of the blind-area obstacle is obtained), the blind-area obstacle is identified from that coordinate information, and the driving safety of the intelligent vehicle is greatly improved.
It should be noted that the ranging module 40 includes, but is not limited to, standard radar, lidar, and the like.
The current ambient brightness includes: a low-level signal output by a brightness acquisition unit in the environment sensing module 20, and image information acquired by an image sensor in the environment sensing module 20 (for example, the ambient light level or the weather conditions around the vehicle). The image sensor includes, but is not limited to, cameras, of which there is at least one.
The orientation range may be understood as the area of the path toward which the user vehicle's current direction of motion points.
The detection distance threshold can change dynamically: it is related to the current ambient brightness; in other words, each ambient brightness corresponds to its own detection distance threshold.
There is at least one blind-area obstacle, and hence at least one first distance. In other words, superimposing the first distances on the second distance yields coordinate information for one or more blind-area obstacles, and the blind-area obstacles are then identified according to the coordinate information corresponding to each of them.
Further, in some possible embodiments, the environment sensing module 20 includes a motion detection unit; the control module 10 is connected with the motion detection unit and is used for receiving the orientation range sent by the motion detection unit, wherein the motion detection unit is an MPU-6050 sensor;
In the present embodiment, the control module 10 acquires the current pose data of the user vehicle from the MPU-6050 sensor, and then obtains the orientation range corresponding to that pose data.
It should be noted that the MPU-6050 is a six-axis motion sensor combining a three-axis accelerometer and a three-axis gyroscope, and can be used to measure the motion and posture of the user vehicle.
Further, in other possible embodiments, the current ambient brightness includes: a low level signal output by a brightness acquisition unit in the environment sensing module 20;
the brightness acquisition unit includes: a first photosensitive sensor subunit, a second photosensitive sensor subunit and an OR gate, wherein the first photosensitive sensor subunit is arranged on the right rear-view mirror cover of the user vehicle, and the second photosensitive sensor subunit is arranged on the left rear-view mirror cover of the user vehicle;
the first photosensitive sensor subunit is connected with a first input end of the or gate, the second photosensitive sensor subunit is connected with a second input end of the or gate, and an output end of the or gate is connected with a logic control unit in the control module 10.
In this embodiment, referring to fig. 2, fig. 2 is a schematic circuit diagram of a brightness acquisition unit according to an embodiment of the identification system for a blind-area obstacle of a vehicle of the present application. Based on the characteristics of the photoresistor, the control module 10 obtains a first voltage value from the first photosensitive sensor subunit and a second voltage value from the second photosensitive sensor subunit, and then detects whether each exceeds a preset voltage threshold. When neither the first voltage value nor the second voltage value exceeds the preset voltage threshold (that is, when both the first and the second input end of the OR gate receive a low-level signal), a low-level signal is output through the output end of the OR gate.
It should be noted that the characteristic of the photoresistor is that its resistance is inversely related to the current ambient brightness, that is, the higher the light intensity corresponding to the current ambient brightness, the lower the resistance of the photoresistor.
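A worked example of the divider this characteristic implies: in the circuit described below, the fixed resistor sits on the supply side and the photoresistor (LDR) on the ground side, so brighter light (lower LDR resistance) pulls the op-amp's non-inverting input toward ground. The component values here are assumptions for illustration, not values from the patent.

```python
# Voltage at the junction of a fixed resistor (high side) and an LDR (low side).
def divider_voltage(vcc: float, r_fixed: float, r_ldr: float) -> float:
    """Standard voltage-divider formula: V = Vcc * R_ldr / (R_fixed + R_ldr)."""
    return vcc * r_ldr / (r_fixed + r_ldr)


# Assumed values: 5 V supply, 10 kΩ fixed resistor.
bright = divider_voltage(5.0, 10_000.0, 1_000.0)    # bright: LDR ~1 kΩ -> low voltage
dark = divider_voltage(5.0, 10_000.0, 100_000.0)    # dark: LDR ~100 kΩ -> high voltage
```

This is why comparing the junction voltage against a preset threshold lets the subunit emit a low level in bright conditions and a high level in dark ones.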
In this embodiment, the first photosensitive sensor subunit and the second photosensitive sensor subunit respectively acquire the ambient brightness at the front right side and the front left side of the user vehicle, so that the current ambient brightness of the user vehicle can be obtained comprehensively. The level signals output by the two subunits are then fed to the two input ends of the OR gate, which avoids misjudging the current ambient brightness of the user vehicle (for example, misjudging a high-visibility area as a low-visibility area) when either rear-view mirror housing of the vehicle is briefly covered by a shadow.
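The comparator-plus-OR-gate behaviour described above can be sketched in Python. This is an illustrative model only; the 2.5 V preset voltage threshold and the example voltages are assumed values, not figures given by the patent:

```python
def sensor_output(voltage: float, threshold: float) -> int:
    """Comparator-style subunit output: low (0) while the divider voltage
    stays at or below the preset threshold, high (1) otherwise."""
    return 0 if voltage <= threshold else 1

def or_gate(a: int, b: int) -> int:
    """Two-input OR gate combining the left and right subunit outputs."""
    return a | b

PRESET_THRESHOLD = 2.5  # volts -- assumed value for illustration

# Both mirror-mounted sensors see similar brightness -> both inputs low
# -> the OR gate outputs the low-level signal described in the text.
left = sensor_output(1.2, PRESET_THRESHOLD)
right = sensor_output(1.4, PRESET_THRESHOLD)
assert or_gate(left, right) == 0

# If one voltage exceeds the threshold, the OR output goes high instead.
assert or_gate(sensor_output(3.0, PRESET_THRESHOLD), right) == 1
```

The OR combination means the logic control unit reacts to the pair of subunit signals rather than to either mirror-mounted sensor alone.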
Further, in another embodiment, the first photosensitive sensor subunit includes: a first photoresistor LDR1, a first resistor R1, a second resistor R2 and a first operational amplifier U1;
the first end of the first resistor R1 is connected with a power supply end, the second end of the first resistor R1 is connected with the first end of the first photoresistor LDR1, the second end of the first photoresistor LDR1 is connected with a ground end, the junction between the second end of the first resistor R1 and the first end of the first photoresistor LDR1 is connected with the non-inverting input end of the first operational amplifier U1, and the inverting input end of the first operational amplifier U1 is connected with the output end of the first operational amplifier U1 through the second resistor R2; the output end of the first operational amplifier U1 is connected with the first input end of the OR gate, or the output end of the first operational amplifier U1 is connected with the second input end of the OR gate;
the second photosensitive sensor subunit includes: a second photoresistor LDR2, a third resistor R3, a fourth resistor R4 and a second operational amplifier U2;
the first end of the third resistor R3 is connected with a power supply end, the second end of the third resistor R3 is connected with the first end of the second photoresistor LDR2, the second end of the second photoresistor LDR2 is connected with a ground end, the junction between the second end of the third resistor R3 and the first end of the second photoresistor LDR2 is connected with the non-inverting input end of the second operational amplifier U2, and the inverting input end of the second operational amplifier U2 is connected with the output end of the second operational amplifier U2 through the fourth resistor R4; the output end of the second operational amplifier U2 is connected with the first input end of the OR gate, or the output end of the second operational amplifier U2 is connected with the second input end of the OR gate.
In this embodiment, the model of the first light-dependent resistor LDR1 is the same as the model of the second light-dependent resistor LDR2, the model of the first resistor R1 is the same as the model of the third resistor R3, the model of the second resistor R2 is the same as the model of the fourth resistor R4, and the model of the first operational amplifier U1 is the same as the model of the second operational amplifier U2.
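A minimal sketch of the voltage-divider stage feeding each op-amp follows. The op-amp is treated as a unity-gain buffer, and the supply voltage and resistance values are illustrative assumptions, not values disclosed by the patent:

```python
def divider_voltage(v_supply: float, r_fixed: float, r_ldr: float) -> float:
    """Voltage at the junction between the fixed resistor (to supply) and
    the photoresistor (to ground): V = Vcc * R_ldr / (R_fixed + R_ldr).
    A unity-gain buffer passes this voltage on unchanged."""
    return v_supply * r_ldr / (r_fixed + r_ldr)

# Illustrative values: brighter light -> lower LDR resistance -> lower tap voltage.
bright = divider_voltage(5.0, 10_000, 1_000)     # roughly 0.45 V
dark = divider_voltage(5.0, 10_000, 100_000)     # roughly 4.55 V
assert bright < dark
```

This matches the photoresistor characteristic noted above: as ambient brightness rises, the LDR resistance falls and the voltage presented to the comparator stage drops.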
Further, in other possible embodiments, referring to fig. 3, fig. 3 is a schematic circuit diagram of the control module 10 according to an embodiment of the identification system of a vehicle blind area obstacle of the present application. The logic control unit includes: a selector, a NOT gate, an AND gate and a buffer;
the output end of the OR gate is connected with the input end of the NOT gate, the output end of the NOT gate is connected with the first input end of the AND gate through the selector, the motion detection unit is connected with the second input end of the AND gate, and the output end of the AND gate is connected with the communication module 30 through the buffer.
In this embodiment, after the low-level signal output by the OR gate is converted into a high-level signal by the NOT gate, the common end in the selector is conducted with the high-level end, so that a high-level signal is connected to the first input end of the AND gate. The second input end of the AND gate is connected with the motion detection unit and receives the high-level signal output by the motion detection unit. The output end of the AND gate then outputs a high-level signal to enable the communication module 30, so that the user vehicle can quickly establish a communication connection with the target external device 50 via the communication module 30.
Further, in some possible embodiments, referring to fig. 4, fig. 4 is a block diagram of a selector structure involved in an embodiment of an identification system for a blind area obstacle of a vehicle of the present application. The selector further includes: a common terminal, a control terminal, a low level terminal and a high level terminal;
the control end is connected with the output end of the NOT gate, the common end is connected with a micro control processor in the control module 10, and the first input end of the AND gate is connected with the low level end and the high level end respectively.
In this embodiment, when the output end of the not gate outputs a high level, the common end in the selector is connected to the high level end, and at this time, the first input end of the and gate is connected to a high level signal; when the output end of the NOT gate outputs a low level, the common end in the selector is conducted with the low level end, and at the moment, the first input end of the AND gate is connected with a low level signal.
In this embodiment, the selector is designed to respond promptly to the level signal output by the NOT gate and determine the level fed to the first input end of the AND gate, which reduces the load on the micro control processor in the control module and saves its operation time.
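The NOT-gate, selector and AND-gate chain described in this embodiment can be modelled as simple combinational logic. This is a sketch of the stated behaviour only; the function names are our own:

```python
def not_gate(x: int) -> int:
    """Invert a 1-bit level signal."""
    return 1 - x

def and_gate(a: int, b: int) -> int:
    return a & b

def selector(control: int) -> int:
    """Routes the high-level end to the common end when the control signal
    is high, otherwise the low-level end (per the described selector)."""
    return 1 if control else 0

def communication_enable(or_output: int, motion_high: int) -> int:
    """Level presented to the communication module: high only when the OR
    gate outputs low (NOT makes it high) AND motion is detected."""
    return and_gate(selector(not_gate(or_output)), motion_high)

# OR gate low + motion detected -> AND outputs high -> module enabled.
assert communication_enable(0, 1) == 1
assert communication_enable(1, 1) == 0
assert communication_enable(0, 0) == 0
```

The truth table makes explicit why both conditions (the brightness condition via the OR/NOT path and the motion condition) must hold before the communication module 30 is enabled.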
In summary, the identification system of a vehicle blind area obstacle of the present application comprises: a control module 10, an environment sensing module 20, a communication module 30 and a ranging module 40. The control module 10 is connected with the environment sensing module 20 and is used for receiving the current ambient brightness and the orientation range of the user vehicle sent by the environment sensing module 20. The control module 10 is connected with the communication module 30 and is used for receiving a first distance sent by the target external device 50 after determining that a detection distance threshold corresponding to the current ambient brightness has been sent, through the communication module 30, to the target external device 50 within the orientation range, wherein the first distance refers to the distance between the target external device 50 and the blind area obstacle, acquired by the target external device 50 within the detection distance threshold, and the target external device 50 refers to any one of the other vehicles that establish a communication connection with the user vehicle through the communication module 30, or an infrastructure that establishes a communication connection with the user vehicle through the communication module 30. The control module 10 is connected with the ranging module 40 and is used for receiving, when the first distance is received, a second distance between the user vehicle and the target external device 50 sent by the ranging module 40, and superimposing the first distance on the second distance so as to identify the blind area obstacle.
Different from the traditional vehicle obstacle recognition mode, after receiving the current ambient brightness and the orientation range of the user vehicle, the control module 10 of the present application sends the detection distance threshold corresponding to the current ambient brightness, through the communication module 30, to the target external device 50 within the orientation range. That is, the target external device 50 serves as the eyes of the user vehicle for quickly recognizing the blind area obstacle, and the distance between the target external device 50 and the blind area obstacle (namely the first distance) is quickly and accurately determined according to the detection distance threshold of the target external device 50. Through the connection between the control module 10 and the ranging module 40, the second distance between the user vehicle and the target external device 50 is received, and the first distance is superimposed on the second distance. This avoids the technical problem in the prior art that the identification accuracy of blind area obstacles is poor because the field of view of a camera is restricted, so that blind area obstacles can be accurately and effectively identified, thereby effectively improving the driving safety of an intelligent vehicle.
Further, based on the first embodiment of the identification system of a vehicle blind area obstacle of the present application, a second embodiment, an identification method of a vehicle blind area obstacle, is proposed. Referring to fig. 5, fig. 5 is a schematic flow chart of the identification method of a vehicle blind area obstacle as applied to the identification system of a vehicle blind area obstacle of the present application.
The identification method of a vehicle blind area obstacle is applied to the identification system of a vehicle blind area obstacle described above and to a terminal device for identifying vehicle blind area obstacles, and may include the following steps:
step S10: receiving the current environment brightness and the direction range of the user vehicle sent by the environment sensing module 20, and acquiring a detection distance threshold corresponding to the current environment brightness;
in this embodiment, when a blind area obstacle identification instruction is received through a man-machine interaction interface of a user vehicle, according to the connection between the control module 10 and the environment sensing module 20, the control module 10 receives the current environment brightness and the orientation range of the user vehicle, and obtains a detection distance threshold corresponding to the current environment brightness.
It should be noted that the man-machine interface includes, but is not limited to, a touchable display screen, a car microphone, and a user terminal (such as a mobile phone, a tablet, etc.) connected to the user vehicle through a bluetooth module.
Step S20: after determining that the detection distance threshold is sent to the target external device 50 in the direction range through the communication module 30, receiving a first distance sent by the target external device 50, where the first distance refers to a distance between the target external device 50 and an obstacle, which is acquired by the target external device 50 in the detection distance threshold, and the target external device 50 refers to any one of other vehicles that establish a communication connection with the user vehicle through the communication module 30, or an infrastructure that establishes a communication connection with the user vehicle through the communication module 30;
in this embodiment, after receiving the current environmental brightness and the direction range of the user vehicle, according to the connection between the control module 10 and the communication module 30, the detection distance threshold corresponding to the current environmental brightness is sent to the target external device 50 in the direction range through the communication module 30, at this time, the target external device 50 quickly searches for a blind area obstacle in the detection distance threshold, and sends the distance between the target external device 50 and the blind area obstacle as a first distance to the control module 10 through the communication module 30, so that the control module 10 receives the first distance sent by the target external device 50.
Step S30: when the first distance is received, a second distance between the user vehicle and the target external device 50, which is transmitted by the ranging module 40, is received, and the first distance is superimposed on the second distance to identify the blind area obstacle.
In this embodiment, the control module 10 receives the second distance between the user vehicle and the target external device 50 sent by the ranging module 40 while receiving the first distance, and superimposes the first distance on the second distance, so that the visible distance of the user vehicle can be accurately widened (that is, the coordinate information of the blind area obstacle can be obtained). The blind area obstacle is then identified according to its coordinate information, which greatly improves the driving safety of the intelligent vehicle.
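One way to picture the superposition of the first distance on the second distance is as a simple addition along the bearing from the user vehicle to the target external device 50. The collinearity of vehicle, device and obstacle is a simplifying assumption of this sketch, not a statement of the patented method:

```python
import math

def obstacle_position(second_distance: float,
                      first_distance: float,
                      bearing_deg: float) -> tuple[float, float]:
    """Hypothetical superposition: the user vehicle sits at the origin, the
    target external device lies `second_distance` metres away along
    `bearing_deg`, and the blind-area obstacle a further `first_distance`
    metres along the same bearing (simplifying assumption)."""
    total = second_distance + first_distance
    rad = math.radians(bearing_deg)
    return (total * math.cos(rad), total * math.sin(rad))

# Device 30 m ahead, obstacle a further 12.5 m beyond it, straight ahead.
x, y = obstacle_position(30.0, 12.5, 0.0)
assert math.isclose(x, 42.5) and abs(y) < 1e-9
```

In effect the target external device extends the user vehicle's visible distance by the first distance it reports, which is exactly the widening effect described above.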
Further, in other possible embodiments, the current ambient brightness further includes image information collected by the image sensor in the environment sensing module 20. In step S10 above, the operation of obtaining the detection distance threshold corresponding to the current ambient brightness may further include the following implementation steps:
step S101: acquiring an environment category characteristic value corresponding to the image information, and searching an RGB color proportion pointed by the image information in a preset RGB color proportion table;
In this embodiment, the control module 10 determines the gray values corresponding to a plurality of pixel points in the image information, maps each gray value into the gray threshold range corresponding to it to obtain the total number of pixel points corresponding to each gray threshold range, and takes the gray threshold range with the largest total number of pixel points as the environment category characteristic value of the image information.
It should be noted that, the environmental category characteristic value may be understood as a characteristic value of environmental visibility, for example, a foggy environment corresponds to an environmental characteristic value, a night environment corresponds to an environmental characteristic value, and a rainy environment corresponds to an environmental characteristic value.
Step S102: and calculating the environment category characteristic value and the RGB color proportion according to a preset image characteristic extraction algorithm to obtain an environment attribute of the image information, and determining a detection distance threshold corresponding to the environment attribute.
In this embodiment, the control module 10 calculates the environmental category feature value and the RGB color ratio according to a preset image feature extraction algorithm, obtains the environmental attribute where the image information is located, and determines the detection distance threshold corresponding to the environmental attribute.
It should be noted that the environmental attribute includes, but is not limited to, a foggy environment, a night environment, and a rainy environment.
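A hypothetical lookup from environment attribute to detection distance threshold might look as follows. The attribute names and distance values are invented for illustration, since the patent does not disclose concrete figures:

```python
# Illustrative mapping only -- the patent specifies no concrete values.
DETECTION_DISTANCE_THRESHOLDS = {
    "clear_day": 100.0,  # metres
    "rainy": 60.0,
    "night": 40.0,
    "foggy": 30.0,
}

def detection_distance_threshold(environment_attribute: str) -> float:
    """Return the detection distance threshold for an environment attribute,
    falling back to the most conservative (shortest) range when unknown."""
    try:
        return DETECTION_DISTANCE_THRESHOLDS[environment_attribute]
    except KeyError:
        return min(DETECTION_DISTANCE_THRESHOLDS.values())

assert detection_distance_threshold("foggy") == 30.0
```

A table of this shape lets the control module translate the environment attribute produced by step S102 directly into the threshold sent to the target external device.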
Further, in some possible embodiments, step S101 above, the operation of acquiring the environment category characteristic value corresponding to the image information, may further include the following implementation steps:
step S1011: determining gray values corresponding to a plurality of pixel points in the image information;
in this embodiment, the control module 10 obtains the gray values corresponding to all the pixel points in the image information.
Step S1012: mapping each gray value into a gray threshold range corresponding to each gray value to obtain the total number of pixel points corresponding to each gray threshold range;
in this embodiment, the control module 10 arranges the plurality of gray values in a preset descending order, stores the arranged gray values in a preset storage table, searches the preset storage table for the maximum and minimum gray values belonging to a first gray threshold range, and, taking the maximum gray value as a starting point and the minimum gray value as an end point, grabs the pixel points falling within the first gray threshold range from the storage table, thereby obtaining the total number of pixel points in the first gray threshold range.
It should be noted that the number of gray threshold ranges is plural, including but not limited to the first gray threshold range, and may be customized according to the needs of the user, which is not limited herein.
Step S1013: and taking the gray threshold range with the maximum total number of pixel points as the environment category characteristic value of the image information.
In this embodiment, the control module 10 uses the gray threshold range with the largest total number of pixels as the environment category characteristic value of the image information.
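Steps S1011 to S1013 amount to building a histogram over gray threshold ranges and picking the most populated range. A minimal Python sketch, with the ranges user-defined as the text notes:

```python
def environment_category_feature(gray_values, ranges):
    """Count how many pixel gray values fall inside each gray threshold
    range (inclusive bounds) and return the range holding the most pixels,
    i.e. the environment category characteristic value of steps S1011-S1013."""
    counts = {r: 0 for r in ranges}
    for g in gray_values:
        for lo, hi in ranges:
            if lo <= g <= hi:
                counts[(lo, hi)] += 1
                break
    return max(counts, key=counts.get)

# Mostly dark pixels -> the low range dominates (e.g. a night scene).
pixels = [12, 25, 40, 200, 230, 18, 30, 22]
ranges = [(0, 85), (86, 170), (171, 255)]
assert environment_category_feature(pixels, ranges) == (0, 85)
```

In practice the control module would run this over all pixel points of the image information rather than a handful of sample values.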
In summary, after receiving the current ambient brightness and the orientation range of the user vehicle, the control module 10 of the present application sends the detection distance threshold corresponding to the current ambient brightness, through the communication module 30, to the target external device 50 within the orientation range; that is, the target external device 50 serves as the eyes of the user vehicle for quickly identifying the blind area obstacle, and the distance between the target external device 50 and the blind area obstacle (i.e., the first distance) is quickly and accurately determined according to the detection distance threshold of the target external device 50. When the first distance is received through the communication module 30, the second distance between the user vehicle and the target external device 50 sent by the ranging module 40 is also received, and the first distance is superimposed on the second distance, so that the visible distance of the user vehicle is accurately widened. This avoids the technical problem in the prior art that the identification accuracy of blind area obstacles is poor because the field of view of a camera is restricted, so that blind area obstacles can be accurately and effectively identified, thereby effectively improving driving safety.
In addition, the present application also provides a terminal device. Referring to fig. 6, fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present application. The terminal device in the embodiment of the present application may specifically be a device that locally runs the identification of vehicle blind area obstacles, or a device that includes an identification system of vehicle blind area obstacles.
As shown in fig. 6, the terminal device in the embodiment of the present application may include: a processor 1001, such as a CPU, a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. Wherein the communication bus 1002 is used to enable connected communication between these components. The user interface 1003 may include a Display, an input unit such as a Keyboard (Keyboard), and the optional user interface 1003 may further include a standard wired interface, a wireless interface. The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., wi-Fi interface).
A memory 1005 is provided on the terminal apparatus main body, and a program is stored in the memory 1005, which realizes a corresponding operation when executed by the processor 1001. The memory 1005 is also used to store parameters for use by the terminal device. The memory 1005 may be a high-speed RAM memory or a stable memory (non-volatile memory), such as a disk memory. The memory 1005 may also optionally be a storage device separate from the processor 1001 described above.
It will be appreciated by those skilled in the art that the terminal device structure shown in fig. 6 is not limiting of the terminal device and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
As shown in fig. 6, a memory 1005 as a storage medium may include therein an operating system, a network communication module, a user interface module, and an identification program of a vehicle blind area obstacle of a terminal device.
In the terminal device shown in fig. 6, the processor 1001 may be configured to call an identification program of a vehicle blind area obstacle of the terminal device stored in the memory 1005, and perform the steps of the identification method of a vehicle blind area obstacle as described above.
Furthermore, the application also provides a computer readable storage medium. Referring to fig. 7, fig. 7 is a schematic structural diagram of a computer readable storage medium according to an embodiment of the present application.
The present application further provides a computer readable storage medium, on which a program for identifying a blind area obstacle of a vehicle is stored, which when executed by a processor implements the steps of the method for identifying a blind area obstacle of a vehicle as described above.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The foregoing embodiment numbers of the present application are merely for describing, and do not represent advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a computer readable storage medium (e.g. ROM/RAM, magnetic disk, optical disk) as described above, including several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method described in the embodiments of the present application.
The foregoing description is only of the preferred embodiments of the present application, and is not intended to limit the scope of the claims, and all equivalent structures or equivalent processes using the descriptions and drawings of the present application, or direct or indirect application in other related technical fields are included in the scope of the claims of the present application.

Claims (10)

1. A vehicle blind zone obstacle identification system, the vehicle blind zone obstacle identification system comprising: the system comprises a control module, an environment sensing module, a communication module and a ranging module;
the control module is connected with the environment sensing module and is used for receiving the current environment brightness and the orientation range of the user vehicle sent by the environment sensing module;
the control module is connected with the communication module and is used for receiving a first distance sent by the target external equipment after determining that a detection distance threshold corresponding to the current environment brightness is sent to the target external equipment in the direction range through the communication module, wherein the first distance refers to a distance from the target external equipment to a blind area obstacle, which is acquired by the target external equipment in the detection distance threshold, and the target external equipment refers to any one of other vehicles which establish communication connection with the user vehicle through the communication module or an infrastructure which establishes communication connection with the user vehicle through the communication module;
The control module is connected with the ranging module and is used for receiving a second distance between the user vehicle and the target external equipment, which is sent by the ranging module, when the first distance is received, and overlapping the first distance to the second distance so as to identify the blind area obstacle.
2. The vehicle blind spot obstacle identification system of claim 1 wherein the environment awareness module includes: the control module is connected with the motion detection unit and is used for receiving the orientation range sent by the motion detection unit, wherein the motion detection unit comprises: accelerometers and gyroscopes.
3. The vehicle blind spot obstacle identification system of claim 2 wherein said current ambient light comprises: a low-level signal output by a brightness acquisition unit in the environment sensing module;
the brightness acquisition unit includes: a first photosensitive sensor subunit, a second photosensitive sensor subunit and an OR gate, wherein the first photosensitive sensor subunit is arranged on the right rear-view mirror housing of the user vehicle, and the second photosensitive sensor subunit is arranged on the left rear-view mirror housing of the user vehicle;
The first photosensitive sensor subunit is connected with a first input end of the OR gate, the second photosensitive sensor subunit is connected with a second input end of the OR gate, and an output end of the OR gate is connected with a logic control unit in the control module.
4. The vehicle blind spot obstacle identification system as set forth in claim 3 wherein said first photosensitive sensor subunit comprises: a first photoresistor, a first resistor, a second resistor and a first operational amplifier;
the first end of the first resistor is connected with a power supply end, the second end of the first resistor is connected with the first end of the first photoresistor, the second end of the first photoresistor is connected with a ground end, the junction between the second end of the first resistor and the first end of the first photoresistor is connected with the non-inverting input end of the first operational amplifier, and the inverting input end of the first operational amplifier is connected with the output end of the first operational amplifier through the second resistor; the output end of the first operational amplifier is connected with the first input end of the OR gate, or the output end of the first operational amplifier is connected with the second input end of the OR gate;
the second photosensitive sensor subunit comprises: a second photoresistor, a third resistor, a fourth resistor and a second operational amplifier;
the first end of the third resistor is connected with a power supply end, the second end of the third resistor is connected with the first end of the second photoresistor, the second end of the second photoresistor is connected with a ground end, the junction between the second end of the third resistor and the first end of the second photoresistor is connected with the non-inverting input end of the second operational amplifier, and the inverting input end of the second operational amplifier is connected with the output end of the second operational amplifier through the fourth resistor; the output end of the second operational amplifier is connected with the first input end of the OR gate, or the output end of the second operational amplifier is connected with the second input end of the OR gate.
5. The vehicle blind spot obstacle identification system of claim 4 wherein the logic control unit is coupled to the communication module, wherein the logic control unit comprises: selector, NOT gate, AND gate and buffer;
the output end of the OR gate is connected with the input end of the NOT gate, the output end of the NOT gate is connected with the first input end of the AND gate through the selector, the motion detection unit is connected with the second input end of the AND gate, and the output end of the AND gate is connected with the communication module through the buffer.
6. The vehicle blind spot obstacle identification system as set forth in claim 5 wherein said selector further comprises: a common terminal, a control terminal, a low level terminal and a high level terminal;
the control end is connected with the output end of the NOT gate, the common end is connected with a micro control processor in the control module, and the first input end of the AND gate is connected with the low level end and the high level end respectively.
7. A method for identifying a blind area obstacle of a vehicle, characterized in that the method for identifying a blind area obstacle of a vehicle is applied to the system for identifying a blind area obstacle of a vehicle according to any one of claims 1 to 6, and the method for identifying a blind area obstacle of a vehicle comprises the steps of:
receiving the current environment brightness and the orientation range of a user vehicle sent by an environment sensing module, and acquiring a detection distance threshold corresponding to the current environment brightness;
after the detection distance threshold is determined and sent to target external equipment in the direction range through a communication module, receiving a first distance sent by the target external equipment, wherein the first distance refers to the distance between the target external equipment and an obstacle, which is acquired by the target external equipment in the detection distance threshold, and the target external equipment refers to any one of other vehicles which establish communication connection with the user vehicle through the communication module or an infrastructure which establishes communication connection with the user vehicle through the communication module;
And when the first distance is received, receiving a second distance between the user vehicle and the target external equipment, which is sent by the ranging module, and overlapping the first distance on the second distance so as to identify the blind area obstacle.
8. The method for identifying a blind spot obstacle in a vehicle of claim 7, wherein the current ambient brightness further comprises: image information collected by an image sensor in the environment sensing module;
the step of obtaining the detection distance threshold corresponding to the current environment brightness comprises the following steps:
acquiring an environment category characteristic value corresponding to the image information, and searching an RGB color proportion pointed by the image information in a preset RGB color proportion table;
and processing the environment category characteristic value and the RGB color proportion with a preset image feature extraction algorithm to obtain an environment attribute of the image information, and determining the detection distance threshold corresponding to the environment attribute.
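Claim 8's table lookup can be pictured as matching the image's channel proportions against preset reference proportions. The attribute names, reference proportions, and thresholds below are all hypothetical; the patent only states that such a preset table exists.

```python
# Hypothetical sketch of claim 8: compute an RGB color proportion for the
# image, match it against a preset proportion table to pick an environment
# attribute, and map that attribute to a detection distance threshold.

def rgb_proportion(pixels):
    """Return (r, g, b) as fractions of the total channel sum."""
    total_r = sum(p[0] for p in pixels)
    total_g = sum(p[1] for p in pixels)
    total_b = sum(p[2] for p in pixels)
    total = (total_r + total_g + total_b) or 1
    return (total_r / total, total_g / total, total_b / total)

# Invented "preset RGB color proportion table": attribute -> reference mix.
PROPORTION_TABLE = {
    "daylight": (0.34, 0.33, 0.33),
    "dusk":     (0.45, 0.33, 0.22),
    "night":    (0.20, 0.25, 0.55),
}

# Invented attribute -> detection distance threshold (m) mapping.
THRESHOLD_BY_ATTRIBUTE = {"daylight": 100.0, "dusk": 60.0, "night": 30.0}

def environment_attribute(pixels):
    """Pick the attribute whose reference proportion is nearest (L1 norm)."""
    observed = rgb_proportion(pixels)
    return min(PROPORTION_TABLE,
               key=lambda k: sum(abs(o - r) for o, r in
                                 zip(observed, PROPORTION_TABLE[k])))

# Near-gray pixels match "daylight"; a blue-heavy frame matches "night".
attr = environment_attribute([(200, 200, 200), (180, 190, 185)])
threshold = THRESHOLD_BY_ATTRIBUTE[attr]
```

A nearest-reference lookup like this is one plausible reading of the "preset image characteristic extraction algorithm"; the claim itself does not pin the algorithm down.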
9. The method of identifying a blind spot obstacle for a vehicle according to claim 8, wherein the step of acquiring an environmental category characteristic value corresponding to the image information further comprises:
determining gray values corresponding to a plurality of pixel points in the image information;
mapping each gray value into its corresponding gray threshold range, so as to obtain the total number of pixel points falling in each gray threshold range;
and taking the gray threshold range containing the maximum total number of pixel points as the environment category characteristic value of the image information.
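The binning in claim 9 is essentially a coarse grayscale histogram whose dominant bin becomes the characteristic value. A minimal sketch, assuming fixed-width bins over the 0–255 range (the patent leaves the bin boundaries as a preset):

```python
# Sketch of claim 9: map each pixel's gray value into a gray threshold
# range (fixed-width bins here), count pixels per range, and return the
# range with the most pixels as the environment category characteristic
# value. The bin width of 64 is illustrative, not from the patent.

def environment_category(gray_values, bin_width=64):
    """Return (low, high) of the gray range holding the most pixels."""
    counts = {}
    for g in gray_values:
        low = (g // bin_width) * bin_width  # lower bound of g's bin
        counts[low] = counts.get(low, 0) + 1
    busiest = max(counts, key=counts.get)   # bin with the max pixel count
    return (busiest, min(busiest + bin_width - 1, 255))

# A mostly dark frame: four pixels fall in 0-63, two in 192-255,
# so the dominant range (the characteristic value) is (0, 63).
category = environment_category([10, 20, 30, 200, 40, 220])
```

Ties between bins resolve arbitrarily here; a production implementation would need a deterministic tie-break rule.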
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores an identification program for a vehicle blind-area obstacle which, when executed by a processor, implements the steps of the method for identifying a vehicle blind-area obstacle according to any one of claims 7 to 9.
CN202310444797.3A 2023-04-24 2023-04-24 Identification system, method and computer readable storage medium for vehicle blind area obstacle Active CN116142181B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310444797.3A CN116142181B (en) 2023-04-24 2023-04-24 Identification system, method and computer readable storage medium for vehicle blind area obstacle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310444797.3A CN116142181B (en) 2023-04-24 2023-04-24 Identification system, method and computer readable storage medium for vehicle blind area obstacle

Publications (2)

Publication Number Publication Date
CN116142181A true CN116142181A (en) 2023-05-23
CN116142181B CN116142181B (en) 2023-07-14

Family

ID=86341098

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310444797.3A Active CN116142181B (en) 2023-04-24 2023-04-24 Identification system, method and computer readable storage medium for vehicle blind area obstacle

Country Status (1)

Country Link
CN (1) CN116142181B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101673473A (en) * 2009-09-24 2010-03-17 浙江大学 Omni-directional vision parking auxiliary device based on DSP and method for generating Omni-directional vision image
CN102449673A (en) * 2009-06-03 2012-05-09 爱信精机株式会社 Method of monitoring the surroundings of a vehicle, and device for monitoring the surroundings of a vehicle
DE102011112812A1 (en) * 2011-09-07 2013-03-07 GM Global Technology Operations LLC (n. d. Gesetzen des Staates Delaware) Device for use in vehicle to assist driver by dead-angle-area warning, has surroundings detecting device for determining contour of foreign-vehicle, traveling ahead or sideways on adjacent lane
CN104157134A (en) * 2014-09-03 2014-11-19 淮南师范学院 Real-time on-line dead-zone-free street scene sharing system for vehicles
CN105857315A (en) * 2016-04-28 2016-08-17 重庆长安汽车股份有限公司 Active monitoring system and method for dead zones
WO2021196145A1 (en) * 2020-04-02 2021-10-07 华为技术有限公司 Vehicle blind spot recognition method, automatic driving assistant system and smart driving vehicle comprising same
CN113815613A (en) * 2020-06-18 2021-12-21 现代摩比斯株式会社 Vehicle sight blind area avoiding system and method using accident history information

Also Published As

Publication number Publication date
CN116142181B (en) 2023-07-14

Similar Documents

Publication Publication Date Title
CN107749194B (en) Lane changing assisting method and mobile terminal
WO2020042859A1 (en) Smart driving control method and apparatus, vehicle, electronic device, and storage medium
US20200133281A1 (en) Safety system for a vehicle
CN109949439B (en) Driving live-action information labeling method and device, electronic equipment and medium
CN110428654B (en) Traffic signal lamp intersection safe passing method and system
WO2021082735A1 (en) Fog feature recognition method, apparatus, and related device
US11840173B2 (en) External facing communications for autonomous vehicles
CN109104689B (en) Safety warning method and terminal
EP3441960A1 (en) Intelligent lighting system, intelligent vehicle and auxiliary vehicle driving system and method therefor
CN108957460B (en) Method and device for detecting vehicle distance and computer readable storage medium
CN116142181B (en) Identification system, method and computer readable storage medium for vehicle blind area obstacle
US20200294119A1 (en) Computer program product and computer-implemented method
US20160275715A1 (en) Map image display device, navigation device, and map image display method
KR20180051422A (en) Electronic apparatus and control method thereof
CN105823476A (en) Electronic device and display method of target object
CN112013847B (en) Indoor navigation method and device
JP6877651B1 (en) Visual load value estimation device, visual load value estimation system, visual load value estimation method, and visual load value estimation program
CN116109711A (en) Driving assistance method and device and electronic equipment
JP2021149641A (en) Object presentation device and object presentation method
CN115427761A (en) Vehicle driving guidance method and electronic device
CN112884220A (en) Collision prediction method, device and equipment based on association rule and storage medium
CN109559535A (en) A kind of dynamic sound-light coordination traffic signal system of integration recognition of face
CN214523763U (en) Auxiliary driving and vehicle safety management system based on Internet of vehicles
KR102030082B1 (en) Selective Crowdsourcing System for Traffic Signal Estimation and Methods Thereof
JP7428076B2 (en) Operation method of server device, control device, vehicle, and information processing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant