CN113223312A - Camera blindness prediction method and device based on map and storage medium - Google Patents
Camera blindness prediction method and device based on map and storage medium Download PDFInfo
- Publication number
- CN113223312A (application number CN202110476765.2A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- angle
- camera
- map
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096725—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C1/00—Measuring angles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/45—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/45—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
- G01S19/47—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Automation & Control Theory (AREA)
- Life Sciences & Earth Sciences (AREA)
- Atmospheric Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Navigation (AREA)
Abstract
The invention discloses a map-based camera blinding prediction method. The position of the vehicle is determined; the map is queried with this position to obtain the road heading angle and road gradient at the vehicle's current location, from which the vehicle heading angle and vehicle pitch angle are estimated; the road gradient and road heading angle of the road ahead can also be obtained. By combining the map, vehicle positioning and related information, the invention can comprehensively predict where the camera will encounter strong light.
Description
Technical Field
The invention relates to the field of automatic driving, and in particular to handling the failure of a vehicle-mounted camera when it encounters strong light.
Background
When a camera encounters strong light, overexposure can cause it to fail. If the vehicle can predict in advance that its camera will face strong sunlight, an early warning can be issued in advance so that the automatic driving system degrades or exits automatic driving.
The vehicle-mounted camera with a strong-light suppression function disclosed in publication CN107465883A analyzes the images captured by the camera and handles overexposure through four units: image acquisition, exposure analysis, backlight-compensation adjustment, and image processing and output.
The patent with publication number CN205819055U discloses a driving assistance device for overcoming transient blindness, comprising a camera unit connected to a display unit through an image interface. The camera unit contains a chip with a strong-light shielding function that converts the sensed strong-light region directly into a black image shown on the display unit; because the strong light is completely shielded, image clarity is not affected.
Both methods act only after the camera is already overexposed; neither can predict in advance whether the camera will encounter strong sunlight.
Disclosure of Invention
The invention discloses a map-based camera blinding prediction method, device and storage medium that overcome the above defects in the prior art.
In the disclosed method, a camera is mounted on a vehicle; the lateral visible range of the camera is affected by the vehicle heading angle, and the longitudinal visible range by the vehicle pitch angle. The prediction comprises the following steps:
and judging the position of the vehicle, and determining a corresponding map according to the position of the vehicle.
And detecting a vehicle course angle H through road information on the map, and detecting a vehicle pitch angle L through ramp information on the map.
The sun incident angle beta and the sun azimuth angle alpha are calculated according to the longitude and the latitude of the vehicle and the current time.
And judging whether the visible range of the camera is overlapped with the sun incident angle beta and the sun azimuth angle alpha or not by combining the influence of the vehicle course angle H and the vehicle pitch angle L on the visible range of the camera, wherein the confidence coefficient of the object identification information given by the camera is reduced under the condition of overlapping.
Further, the camera is judged to encounter strong light, and the confidence of the object-recognition information it gives should be reduced, when the following conditions are met:
the longitudinal view angle B satisfies H + B/2 > β, and the lateral view angle A satisfies L + A/2 > α and L − A/2 < α.
Further, the vehicle pitch angle L and vehicle heading angle H are calculated from the road gradient i and the angle Q between the road tangent at the vehicle and true north, where
L = arctan i;
H = Q.
Further, the solar incident angle β and azimuth angle α can be calculated from the vehicle's longitude JD and latitude φ and the current month Y, date D and hour S, as follows:
the solar incident angle β is calculated as follows:
sin β = sin δ × sin φ + cos δ × cos φ × cos τ,
where δ is the solar declination, τ is the solar hour angle, and φ is the latitude;
sin δ = 0.39795 × cos[0.98563 × (N − 173) × π/180]
where N is the day number of the year, counted from January 1, and π is the circular constant;
τ = (S0 − 12) × 15
S0 = Sd + Et,
where Sd is Beijing time and Et is the time correction;
Et = (JD − 120°)/15°
where JD is the longitude.
The solar azimuth angle α is calculated as follows:
cos α = (sin δ − sin β × sin φ)/(cos β × cos φ),
where δ is the solar declination, β is the solar incident angle, and φ is the latitude.
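A minimal sketch of the solar-angle formulas above, in Python (the function name and the representation of Beijing time as a decimal hour are illustrative assumptions; all angles are in degrees):

```python
import math

def solar_angles(longitude_deg, latitude_deg, day_of_year, beijing_hour):
    """Solar incident angle beta and azimuth angle alpha per the formulas
    in the text. day_of_year N is counted from January 1; beijing_hour Sd
    is Beijing time as a decimal hour. Returns (beta, alpha) in degrees."""
    phi = math.radians(latitude_deg)

    # Solar declination: sin(delta) = 0.39795 * cos(0.98563 * (N - 173) deg)
    sin_delta = 0.39795 * math.cos(math.radians(0.98563 * (day_of_year - 173)))
    delta = math.asin(sin_delta)

    # True solar time S0 = Sd + Et, with Et = (JD - 120 deg) / 15 deg in hours
    et = (longitude_deg - 120.0) / 15.0
    s0 = beijing_hour + et

    # Solar hour angle tau = (S0 - 12) * 15 degrees
    tau = math.radians((s0 - 12.0) * 15.0)

    # Incident angle: sin(beta) = sin(delta)sin(phi) + cos(delta)cos(phi)cos(tau)
    sin_beta = (sin_delta * math.sin(phi)
                + math.cos(delta) * math.cos(phi) * math.cos(tau))
    beta = math.asin(sin_beta)

    # Azimuth: cos(alpha) = (sin(delta) - sin(beta)sin(phi)) / (cos(beta)cos(phi))
    cos_alpha = ((sin_delta - sin_beta * math.sin(phi))
                 / (math.cos(beta) * math.cos(phi)))
    cos_alpha = max(-1.0, min(1.0, cos_alpha))  # clamp floating-point overshoot
    alpha = math.acos(cos_alpha)

    return math.degrees(beta), math.degrees(alpha)
```

For example, near longitude 121.47°, latitude 31.23° at Beijing noon around the summer solstice (day 172), the incident angle β comes out slightly above 82°, consistent with the sun standing nearly overhead at that latitude.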
Furthermore, the invention combines GNSS positioning, vehicle speed, steering-wheel angle and inertial-navigation sensors on the vehicle to determine the position of the vehicle; queries the map with this position; obtains the road heading angle H at the vehicle's current position from the map and uses it to approximate the vehicle heading angle; and obtains the slope information at the current position from the map and uses it to approximate the vehicle pitch angle L.
The invention further provides a map-based camera blinding prediction device comprising a memory and a processor, the memory storing instructions that enable the processor to execute the above map-based camera blinding prediction method.
The invention also provides a machine-readable storage medium storing instructions that enable a machine to execute the above map-based camera blinding prediction method.
The beneficial technical effects of the invention are as follows:
the invention can obtain the road course angle and the road gradient of the current position of the vehicle by judging the position information of the vehicle and inquiring the map information of the vehicle through the position information, simulate the information of the vehicle course angle and the vehicle pitch angle and also obtain the information of the road gradient and the road course angle of the road in front of the vehicle. Therefore, by combining information such as a map and vehicle positioning, the camera can be comprehensively judged to meet strong light at the position where the camera can be in front, so that the blind of the camera caused by the strong light of the sun can be avoided, targets such as lane lines, signboards, vehicles, pedestrians and the like can not be output, and the camera or an automatic driving system is given early warning or corresponding measures in advance.
Drawings
FIG. 1 is a schematic view of a vehicle coordinate system;
FIG. 2 is a schematic view of a lateral view of a camera;
FIG. 3 is a schematic view of a longitudinal view of a camera;
FIG. 4 is a schematic diagram of solar azimuth angle and incident angle;
FIG. 5 is a schematic view of a heading angle of a road.
Detailed Description
To illustrate the technical solution more clearly, the invention is described in further detail below with reference to the embodiments and the accompanying drawings.
Referring to fig. 1, a vehicle coordinate system is defined: forward is the X-axis direction, left is the Y-axis direction, and vertical is the Z-axis direction. Rotation about the Z-axis is the vehicle heading angle H, ranging from 0 to 360 degrees. Rotation about the Y-axis is the vehicle pitch angle L, ranging from −45 to 45 degrees, with counterclockwise positive.
Referring to figs. 2 and 3, which show a top view and a side view of the camera, the field of view (FOV) of the camera mounted on the vehicle consists of a lateral view angle A ranging from 0 to 90 degrees and a longitudinal view angle B ranging from 0 to 90 degrees.
Referring to fig. 4, a schematic diagram of the solar azimuth and incident angles: the direction of sunlight is described by an azimuth angle and an incident angle. The incident angle β in the pitch direction ranges from 0 to 90 degrees; the azimuth angle α in the horizontal direction ranges from 0 to 360 degrees.
Referring to fig. 5, the road heading angle is the angle between the road tangent and true north.
The invention involves a camera, a vehicle-mounted map, a vehicle positioning system and a processing device.
The vehicle-mounted map contains road shape, gradient, weather and other information; combined with the vehicle position, the road heading angle and longitudinal gradient can be derived. The weather information distinguishes sunny, rainy and cloudy days.
The vehicle positioning system includes a GNSS sensor, a vehicle speed sensor, a steering-wheel angle sensor, an inertial navigation sensor and other equipment. The vehicle position is determined by fusing GNSS positioning, vehicle speed, steering-wheel angle and inertial navigation information, and the longitude and latitude of the position can be obtained by combining the vehicle position with the map.
The processing device acquires and stores the map information, the vehicle position information and the solar incident-angle information. It queries the map with the vehicle position, obtains the road heading angle and road gradient at the current position, approximates the vehicle heading angle from the road heading angle, and approximates the vehicle pitch angle from the road gradient. Depending on its computing capacity, it can also obtain the road gradient and heading angle of the road ahead. With this information, the processing device comprehensively judges whether the camera will encounter strong light, the distance to the strong-light point, and so on.
The map-based camera blinding prediction method makes its comprehensive judgment as follows:
The camera is mounted on the vehicle, so its visible range changes as the vehicle moves: the lateral visible range is affected by the vehicle heading angle, and the longitudinal visible range by the vehicle pitch angle. To predict whether the camera will encounter strong light at a position ahead on the road, the vehicle is assumed to travel along the road direction with a simplified chassis model, so that the vehicle heading angle equals the angle between the road tangent and true north and the vehicle pitch follows the road surface. If the heading and pitch angles of the vehicle at a specific point ahead must be calculated accurately, a vehicle chassis dynamics model must be added; the related calculation is not described in detail here.
The method specifically comprises the following steps:
determine the position of the vehicle, and select the corresponding map according to that position;
derive the vehicle heading angle H from road information on the map, and the vehicle pitch angle L from ramp information on the map;
calculate the solar incident angle β and azimuth angle α from the vehicle's longitude and latitude and the current time;
taking into account the influence of the vehicle heading angle H and pitch angle L on the camera's visible range, judge whether the solar incident angle β and azimuth angle α fall within that range; if they do, strong sunlight can blind the camera, so the confidence of the camera is reduced.
Specifically, the camera is judged to encounter strong light, and the confidence of the object-recognition information it gives should be reduced, when the following conditions are met:
the longitudinal view angle B satisfies H + B/2 > β, and the lateral view angle A satisfies L + A/2 > α and L − A/2 < α.
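Stated as code, the strong-light test reads as below (a hedged sketch: the function name is an illustrative assumption, and the pairing of H with β and of L with α follows the text verbatim; all angles in degrees):

```python
def camera_blinded(heading_h, pitch_l, lateral_fov_a, longitudinal_fov_b,
                   sun_incident_beta, sun_azimuth_alpha):
    """Judge strong light per the conditions in the text:
    H + B/2 > beta, and L + A/2 > alpha > L - A/2."""
    longitudinal_hit = heading_h + longitudinal_fov_b / 2.0 > sun_incident_beta
    lateral_hit = (pitch_l + lateral_fov_a / 2.0 > sun_azimuth_alpha
                   and pitch_l - lateral_fov_a / 2.0 < sun_azimuth_alpha)
    return longitudinal_hit and lateral_hit
```

When this returns True, the confidence of the camera's object-recognition output would be reduced.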
If the pitch and heading angles of the vehicle at a point on the road ahead cannot be obtained directly, the vehicle pitch angle L and heading angle H can be calculated from the road gradient i and the angle Q between the road tangent and true north, where
L = arctan i;
H = Q.
The solar incident angle β and azimuth angle α can be calculated from the vehicle's longitude and latitude and the current month Y, date D, hour S and minute F, as follows:
The solar incident angle β is calculated as follows:
sin β = sin δ × sin φ + cos δ × cos φ × cos τ,
where δ is the solar declination, τ is the solar hour angle, and φ is the latitude;
sin δ = 0.39795 × cos[0.98563 × (N − 173) × π/180]
where N is the day number of the year, counted from January 1, and π is the circular constant;
τ = (S0 − 12) × 15
S0 = Sd + Et,
where Sd is Beijing time and Et is the time correction;
Et = (JD − 120°)/15°
where JD is the longitude.
The solar azimuth angle α is calculated as follows:
cos α = (sin δ − sin β × sin φ)/(cos β × cos φ),
where δ is the solar declination, β is the solar incident angle, and φ is the latitude.
According to the method, the vehicle position is determined by combining GNSS positioning, vehicle speed, steering-wheel angle and inertial navigation sensors; the map is queried with this position to obtain the road heading angle, road gradient, vehicle heading angle and vehicle pitch angle at the current position; and the road gradient and heading angle of the road ahead can also be obtained (how far ahead depends on the computing capacity of the control system). The positions where the camera will encounter strong light can thus be comprehensively predicted.
A further embodiment of the invention is a map-based camera blinding prediction device comprising a memory and a processor, the memory storing instructions that enable the processor to perform the above map-based camera blinding prediction method.
Another embodiment of the invention is a machine-readable storage medium storing instructions that enable a machine to perform the above map-based camera blinding prediction method.
The processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, and the like.
The memory may be used to store the computer program instructions; the processor implements the functions of the device by running or executing the instructions stored in the memory and calling the data stored there. The memory may include high-speed random access memory and may also include non-volatile memory such as a hard disk, a plug-in hard disk, a smart memory card (SMC), a secure digital (SD) card, a flash memory card, at least one magnetic disk storage device, a flash memory device, or other solid-state storage device.
Although the embodiments of the invention have been described in detail with reference to the accompanying drawings, they are not limited to the details above; various simple modifications can be made to the technical solutions within the technical idea of the embodiments, and such simple modifications all fall within the protection scope of the embodiments of the invention.
Claims (7)
1. A map-based camera blinding prediction method, characterized by comprising the following steps:
determining the position of the vehicle, and selecting the corresponding map according to that position;
deriving a vehicle heading angle H from road information on the map, and a vehicle pitch angle L from ramp information on the map;
calculating a solar incident angle β and a solar azimuth angle α from the vehicle's longitude and latitude and the current time;
taking into account the influence of the vehicle heading angle H and pitch angle L on the camera's visible range, judging whether the visible range overlaps the solar incident angle β and azimuth angle α, and reducing the confidence of the object-recognition information given by the camera in the case of overlap.
2. The map-based camera blinding prediction method of claim 1, characterized in that:
the camera is judged to encounter strong light, and the confidence of the object-recognition information given by the camera is reduced, when the following conditions are met:
the longitudinal view angle B satisfies H + B/2 > β, and the lateral view angle A satisfies L + A/2 > α and L − A/2 < α.
3. The map-based camera blinding prediction method of claim 2, characterized in that: the vehicle pitch angle L and vehicle heading angle H are calculated from the road gradient i and the angle Q between the road tangent at the vehicle and true north, where
L = arctan i;
H = Q.
4. The map-based camera blinding prediction method of claim 3, characterized in that: the solar incident angle β and azimuth angle α are calculated from the vehicle's longitude JD and latitude φ and the current month Y, date D and hour S, as follows:
The solar incident angle β is calculated as follows:
sin β = sin δ × sin φ + cos δ × cos φ × cos τ,
where δ is the solar declination, τ is the solar hour angle, and φ is the latitude;
sin δ = 0.39795 × cos[0.98563 × (N − 173) × π/180]
where N is the day number of the year, counted from January 1, and π is the circular constant;
τ = (S0 − 12) × 15
S0 = Sd + Et,
where Sd is Beijing time and Et is the time correction;
Et = (JD − 120°)/15°
where JD is the longitude.
The solar azimuth angle α is calculated as follows:
cos α = (sin δ − sin β × sin φ)/(cos β × cos φ),
where δ is the solar declination, β is the solar incident angle, and φ is the latitude.
5. The map-based camera blinding prediction method of claim 4, characterized in that: the position of the vehicle is determined by combining GNSS positioning, vehicle speed, steering-wheel angle and inertial navigation sensors on the vehicle; the map is queried with this position; the road heading angle H at the vehicle's current position is obtained from the map and used to approximate the vehicle heading angle; and the slope information at the current position is obtained from the map and used to approximate the vehicle pitch angle L.
6. A map-based camera blinding prediction device, comprising a memory and a processor, the memory storing instructions that enable the processor to perform the map-based camera blinding prediction method of any of claims 1 to 5.
7. A machine-readable storage medium storing instructions that enable a machine to perform the map-based camera blinding prediction method of any of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110476765.2A CN113223312B (en) | 2021-04-29 | 2021-04-29 | Camera blindness prediction method and device based on map and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113223312A true CN113223312A (en) | 2021-08-06 |
CN113223312B CN113223312B (en) | 2022-10-11 |
Family
ID=77090068
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110476765.2A Active CN113223312B (en) | 2021-04-29 | 2021-04-29 | Camera blindness prediction method and device based on map and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113223312B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114526732A (en) * | 2022-01-25 | 2022-05-24 | 岚图汽车科技有限公司 | Vehicle positioning method and system |
CN115146014A (en) * | 2022-06-20 | 2022-10-04 | 岚图汽车科技有限公司 | Sunlight dazzling scene integration method and system for high-precision map |
CN115278095A (en) * | 2022-05-11 | 2022-11-01 | 岚图汽车科技有限公司 | Vehicle-mounted camera control method and device based on fusion perception |
Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050117027A1 (en) * | 2003-10-16 | 2005-06-02 | Masaaki Fukuhara | Imaging apparatus and a camera for vehicles |
JP2008137494A (en) * | 2006-12-01 | 2008-06-19 | Denso Corp | Vehicular visual field assistance device |
CN102137231A (en) * | 2010-12-31 | 2011-07-27 | 天津市亚安科技电子有限公司 | Method and device for preventing highlights from directly irradiating light sensitive device of vidicon |
CN102509067A (en) * | 2011-09-22 | 2012-06-20 | 西北工业大学 | Detection method for lane boundary and main vehicle position |
CN103604410A (en) * | 2013-11-27 | 2014-02-26 | 东北电力大学 | Sun direction detection sensor capable of outputting digital signals |
2021-04-29: Application CN202110476765.2A filed in China; granted as CN113223312B (status: Active)
Patent Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050117027A1 (en) * | 2003-10-16 | 2005-06-02 | Masaaki Fukuhara | Imaging apparatus and a camera for vehicles |
JP2008137494A (en) * | 2006-12-01 | 2008-06-19 | Denso Corp | Vehicular visual field assistance device |
CN102137231A (en) * | 2010-12-31 | 2011-07-27 | Tianjin Yaan Technology Electronic Co., Ltd. | Method and device for preventing strong light from directly irradiating the photosensitive device of a camera |
CN102509067A (en) * | 2011-09-22 | 2012-06-20 | Northwestern Polytechnical University | Detection method for lane boundary and host vehicle position |
CN103604410A (en) * | 2013-11-27 | 2014-02-26 | Northeast Electric Power University | Sun direction detection sensor capable of outputting digital signals |
US20190384294A1 (en) * | 2015-02-10 | 2019-12-19 | Mobileye Vision Technologies Ltd. | Crowd sourcing data for autonomous vehicle navigation |
CN105988128A (en) * | 2015-03-20 | 2016-10-05 | Ford Global Technologies LLC | Vehicle location accuracy |
CN106043300A (en) * | 2015-04-13 | 2016-10-26 | Ford Global Technologies LLC | Method and system for vehicle cruise control |
CN107531181A (en) * | 2015-04-30 | 2018-01-02 | Robert Bosch GmbH | Method and system for controlling a vehicle's exterior mirror replacement system in the region of an intersection of two roads, and exterior mirror replacement system with such a system |
JP2017181588A (en) * | 2016-03-28 | 2017-10-05 | Dai Nippon Printing Co., Ltd. | Projection device, shield, projection method and projection program |
CN112654836A (en) * | 2019-02-04 | 2021-04-13 | Mobileye Vision Technologies Ltd. | System and method for vehicle navigation |
CN112216097A (en) * | 2019-07-09 | 2021-01-12 | Huawei Technologies Co., Ltd. | Method and device for detecting a vehicle blind area |
CN112365544A (en) * | 2019-07-26 | 2021-02-12 | Beijing Baidu Netcom Science and Technology Co., Ltd. | Image recognition interference detection method and device, computer equipment and storage medium |
CN110450718A (en) * | 2019-08-27 | 2019-11-15 | Guangzhou Xiaopeng Motors Technology Co., Ltd. | Visual assistance method, apparatus, medium, control terminal and vehicle for slope driving |
CN110779618A (en) * | 2019-08-28 | 2020-02-11 | Tencent Technology (Shenzhen) Co., Ltd. | Method, apparatus, device and storage medium for backlight detection during vehicle driving |
CN110794848A (en) * | 2019-11-27 | 2020-02-14 | Beijing Sankuai Online Technology Co., Ltd. | Unmanned vehicle control method and device |
CN110979183A (en) * | 2019-12-07 | 2020-04-10 | Geely Automobile Research Institute (Ningbo) Co., Ltd. | Vehicle blind-area detection device, method and vehicle |
CN111144467A (en) * | 2019-12-19 | 2020-05-12 | Huizhou Desay SV Intelligent Transportation Technology Research Institute Co., Ltd. | Method and system for scene factor acquisition |
CN111536931A (en) * | 2020-04-21 | 2020-08-14 | Hanteng Automobile Co., Ltd. | Method for calculating sun illumination intensity, direction and angle based on EPS, TBOX and camera |
CN111982147A (en) * | 2020-08-26 | 2020-11-24 | Shanghai Pateo Yuezhen Network Technology Service Co., Ltd. | Vehicle-mounted instrument shadow effect display method and system, storage medium and vehicle-mounted terminal |
CN112140897A (en) * | 2020-09-30 | 2020-12-29 | Shandong University of Technology | Multi-sensor-fusion-based vertical position control of a solar panel for a solar electric vehicle |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114526732A (en) * | 2022-01-25 | 2022-05-24 | Lantu Automobile Technology Co., Ltd. | Vehicle positioning method and system |
CN114526732B (en) * | 2022-01-25 | 2024-01-16 | Lantu Automobile Technology Co., Ltd. | Vehicle positioning method and system |
CN115278095A (en) * | 2022-05-11 | 2022-11-01 | Lantu Automobile Technology Co., Ltd. | Vehicle-mounted camera control method and device based on fusion perception |
CN115146014A (en) * | 2022-06-20 | 2022-10-04 | Lantu Automobile Technology Co., Ltd. | Sunlight glare scene integration method and system for high-precision maps |
Also Published As
Publication number | Publication date |
---|---|
CN113223312B (en) | 2022-10-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113223312B (en) | Camera blindness prediction method and device based on map and storage medium | |
CN109946732B (en) | Unmanned vehicle positioning method based on multi-sensor data fusion | |
US20220024473A1 (en) | Generating Testing Instances for Autonomous Vehicles | |
WO2018196391A1 (en) | Method and device for calibrating external parameters of vehicle-mounted camera | |
CN107229063A (en) | Driverless vehicle navigation and positioning accuracy correction method based on fusion of GNSS and visual odometry | |
CN111311902B (en) | Data processing method, device, equipment and machine readable medium | |
CN109313646B (en) | Method and apparatus for creating an optimized location map and machine-readable storage medium | |
CN101609149A (en) | Method for improving attitude determination precision of airborne laser radar | |
CN103438887A (en) | Absolute coordinate obtaining method used for positioning mobile robot and reconstructing environment | |
CN110444044B (en) | Vehicle pose detection system based on ultrasonic sensor, terminal and storage medium | |
CN103438906A (en) | Vision and satellite positioning sensor joint calibrating method suitable for robot navigation | |
CN114088114B (en) | Vehicle pose calibration method and device and electronic equipment | |
JP2015102449A (en) | Vehicle self position estimation apparatus and vehicle self position estimation method | |
US20190005814A1 (en) | Vehicle determination apparatus, vehicle determination method, and computer readable medium | |
CN110103829B (en) | Display method and device of vehicle-mounted display screen, vehicle-mounted display screen and vehicle | |
CN110751693A (en) | Method, device, equipment and storage medium for camera calibration | |
CN113405555B (en) | Automatic driving positioning sensing method, system and device | |
CN113819904A (en) | Polarization/VIO three-dimensional attitude determination method based on zenith vector | |
CN114241062A (en) | Camera external parameter determination method and device for automatic driving and computer readable storage medium | |
KR20170134893A (en) | Advanced driver assistance system for real-time estimation of relative position of the sun and method thereby | |
CN114694107A (en) | Image processing method and device, electronic equipment and storage medium | |
CN111238524B (en) | Visual positioning method and device | |
Ragab et al. | The utilization of DNN-based semantic segmentation for improving low-cost integrated stereo visual odometry in challenging urban environments | |
CN106441310A (en) | Method for calculating solar azimuth based on CMOS (Complementary Metal Oxide Semiconductor) | |
CN112874300A (en) | Backlight adjusting method and system for liquid crystal instrument and vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||