CN113361453A - Traffic light identification system and method under visual fuzzy scene - Google Patents

Traffic light identification system and method under visual fuzzy scene

Info

Publication number
CN113361453A
Authority
CN
China
Prior art keywords
traffic light
chromaticity
image
light
red
Prior art date
Legal status: Pending
Application number
CN202110710218.6A
Other languages
Chinese (zh)
Inventor
文翊
李红林
方强
高广博
刘帅
Current Assignee
Dongfeng Motor Corp
Original Assignee
Dongfeng Motor Corp
Priority date
Filing date
Publication date
Application filed by Dongfeng Motor Corp filed Critical Dongfeng Motor Corp
Priority to CN202110710218.6A
Publication of CN113361453A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a traffic light identification method under a visually blurred scene, which comprises the following steps: S1, collecting traffic light images and segmenting them; S2, calibrating the chromaticity direction angle of the traffic light images; S3, acquiring a traffic light chromaticity movement direction map; and S4, performing traffic light identification. The invention can identify the color of a traffic light even when the captured image is blurred by camera contamination, partial occlusion, or similar factors, thereby improving the reliability of the traffic light identification system.

Description

Traffic light identification system and method under visual fuzzy scene
Technical Field
The invention relates to the technical field of vehicle environment perception and detection, and in particular to a traffic light identification system and method under a visually blurred scene.
Background
Current traffic light recognition systems generally follow one of two technical routes: either the camera recognizes the traffic light directly, or the vehicle receives the signal state broadcast by V2X equipment mounted on the traffic light. Although camera-based direct recognition is relatively mature, the traffic light occupies only a small part of the image, so extremely high recognition precision is required, and recognition performance drops sharply in rain or when the camera is slightly occluded. The V2X route, in which equipment on the traffic light broadcasts the signal state for the vehicle to receive, is still rarely deployed: it requires both roadside infrastructure support and V2X equipment on the vehicle, so at present it can neither improve the customer experience nor reduce cost.
In existing traffic light recognition systems, when no traffic light is recognized or the recognition is unreliable, automatic restart of the vehicle is prohibited for safety reasons; this ensures safety but greatly reduces functional availability. Even the most advanced companies in the field, such as Tesla, currently use traffic light recognition only for display and driver prompts, with no subsequent control function.
Even where camera-based direct recognition of traffic lights is mature, the limitation of its operating principle means it cannot overcome the sharp drop in recognition performance in rain or when the camera is slightly occluded.
Relying on V2X equipment on traffic lights to broadcast the signal state inevitably adds significant cost, depends on roadside traffic infrastructure, and is difficult for consumers to accept.
Disclosure of Invention
The invention aims to provide a traffic light identification system and method under a visually blurred scene, so that traffic lights can be recognized even when the image captured by the camera is blurred.
To solve this technical problem, the invention provides the following technical scheme: a traffic light identification method under a visually blurred scene, comprising the following steps:
S1, collecting traffic light images and segmenting them; the traffic light images comprise a clear image sequence and a blurred image sequence;
S2, performing traffic light recognition on the segmented clear image sequence, classifying by red and green color, and then calibrating the traffic light chromaticity direction angle;
S3, for the blurred image sequence, obtaining the traffic light chromaticity movement direction from the last several frames of the blurred image sequence using the geometric mean logarithmic chromaticity space, and building a traffic light chromaticity movement direction map;
S4, building a traffic light classifier from the chromaticity movement direction map, performing traffic light recognition on the blurred image sequence, and checking the validity of the recognition result.
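The four steps above can be read as a small processing pipeline. The sketch below only illustrates how the steps might be wired together; every function name, signature, and the use of NumPy arrays for frames are assumptions for illustration, not the patent's reference implementation (the per-step details follow in the embodiments).

```python
from typing import Sequence, Tuple
import numpy as np


def calibrate_direction_angles(clear_roi: Sequence[np.ndarray]) -> Tuple[float, float]:
    """S2 stub: minimum-entropy calibration of the red/green chromaticity
    direction angles (sketched later in the embodiments)."""
    raise NotImplementedError("illustrative stub")


def chroma_motion_map(blurred_roi: Sequence[np.ndarray],
                      theta_red: float, theta_green: float) -> np.ndarray:
    """S3 stub: chromaticity movement direction map from the tail blurred frames."""
    raise NotImplementedError("illustrative stub")


def classify_and_check(motion_map: np.ndarray) -> str:
    """S4 stub: confidence-interval classification plus circularity validity check."""
    raise NotImplementedError("illustrative stub")


def recognize_traffic_light(clear_frames: Sequence[np.ndarray],
                            blurred_frames: Sequence[np.ndarray],
                            horizon_row: int) -> str:
    """Wire steps S1-S4 together for one clear/blurred pair of sequences."""
    # S1: keep only the region above the horizon (the vanishing-point row).
    clear_roi = [f[:horizon_row] for f in clear_frames]
    blurred_roi = [f[:horizon_row] for f in blurred_frames]

    # S2: calibrate chromaticity direction angles on the clear sequence.
    theta_red, theta_green = calibrate_direction_angles(clear_roi)

    # S3: movement direction map of the last few blurred frames.
    motion_map = chroma_motion_map(blurred_roi[-3:], theta_red, theta_green)

    # S4: classify the blurred sequence and validity-check the result.
    return classify_and_check(motion_map)
```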
According to the above scheme, the image segmentation in S1 is specifically performed as follows:
identify the road in the traffic light image, calibrate the vanishing point of the road, divide the image along the horizontal line through the vanishing point as the horizon, and select the area above the horizon as the traffic light recognition area.
According to the above scheme, S2 includes acquiring the standard two-dimensional chromaticity of the clear image sequence; the clear image sequence is classified into red light images and green light images, and the chromaticity direction angles corresponding to the red and green light images are obtained by computing their minimum entropy curves, thereby completing the traffic light chromaticity direction angle calibration.
According to the above scheme, the standard two-dimensional chromaticity of the clear image sequence is denoted $P_k$ [the defining formula appears only as an image in the source and is not reproduced here], where k indexes the pixels;
the red light images and green light images are expressed as $P_R = (R_{Ri}, G_{Ri}, B_{Ri})$ and $P_G = (R_{Gi}, G_{Gi}, B_{Gi})$ respectively, where $i = 1, 2, 3, \ldots, N$;
the minimum entropy curve is solved as follows:
1) compute the entropy:
$$\eta = -\sum_i P_i(P_\theta)\,\ln\big(P_i(P_\theta)\big) \qquad (1)$$
where $\eta$ is the entropy value, $P_\theta$ is the projection of the two-dimensional logarithmic chromaticity $P_k$ onto the projection direction, and $P_i$ is the probability that a gray value of $P_\theta$ falls into the i-th bin of its gray histogram;
[equation (2), which fixes the histogram binning from the image size, appears only as an image in the source and is not reproduced here]
where W is the size of the image;
the more bins the gray histogram of $P_\theta$ contains, the smaller each $P_i$ and the larger the entropy $\eta$; the chromaticity directions and minimum entropy curves of the red light images and green light images are obtained from equations (1) and (2), which yields the chromaticity direction angle $\theta$ corresponding to the red light images and green light images.
According to the above scheme, the geometric mean logarithmic chromaticity space in S3 is obtained as follows: take the geometric mean of the R, G and B channels of the blurred image sequence, compute the chromaticity from this geometric mean, and then take the logarithm of the chromaticity to obtain the geometric mean logarithmic chromaticity space;
the traffic light chromaticity movement direction map is obtained as follows: represent the geometric mean logarithmic chromaticity space as a ρ-space vector, transform the ρ space into a two-dimensional χ space through a coordinate transformation, and thereby obtain the motion speed of each pixel of the blurred image sequence perpendicular to the chromaticity direction angle as well as the pixel's motion speed, from which the traffic light chromaticity movement direction map is obtained.
According to the above scheme, the geometric mean is $C_{ref}$:
$$C_{ref} = \sqrt[3]{R \cdot G \cdot B} \qquad (3)$$
where R, G, B are the RGB channel values of a traffic light image in the blurred image sequence;
the chromaticity is $C_k$:
$$C_k = \frac{(R, G, B)}{C_{ref}} \qquad (4)$$
taking the logarithm of the chromaticity $C_k$ gives the geometric mean logarithmic chromaticity space $P_k$:
$$P_k = \ln(C_k) \qquad (5)$$
the ρ-space vector of the matrix $P_k$ is denoted ρ and is orthogonal to the vector $\tfrac{1}{\sqrt{3}}(1, 1, 1)^T$;
the coordinate transformation maps the ρ space into the two-dimensional χ space:
$$\chi = U\rho \qquad (6)$$
where χ is a two-dimensional column vector and $U = [v_1, v_2]^T$;
[equations (7) and (8), which define $v_1$ and $v_2$, appear only as images in the source and are not reproduced here]
where $v_1$, $v_2$ are the motion speeds of a pixel parallel and perpendicular to the chromaticity direction angle, and $v_X$, $v_Y$ are the motion speeds of the pixel;
the traffic light chromaticity movement direction map is then expressed by equation (9), which likewise appears only as an image in the source.
according to the scheme, the traffic light classifier establishing method in the S4 comprises the following steps: fitting a manually calibrated region sample set such as red, green and the like by adopting a normal distribution model to obtain confidence intervals corresponding to the red and green lights respectively so as to obtain a traffic light classifier;
and (5) checking validity, and judging the circularity of the specific position.
According to the above scheme, the traffic light classifier is obtained as follows:
taking the central 90% of the fitted normal distribution of the red light region as the red light chromaticity feature, the red light confidence interval is
$$[\lambda_1, \lambda_2] = [u_R - 1.65\sigma_R,\; u_R + 1.65\sigma_R] \qquad (10)$$
taking the central 90% of the fitted normal distribution of the green light region as the green light chromaticity feature, the green light confidence interval is
$$[\lambda_3, \lambda_4] = [u_G - 1.65\sigma_G,\; u_G + 1.65\sigma_G] \qquad (11)$$
where $u_R$, $u_G$ are the movement starting points of the pixel movement direction maps corresponding to the red light images and green light images respectively, and $\sigma_R$, $\sigma_G$ are the standard deviations of the corresponding fitted distributions;
the traffic light identification classifier is then given by equation (12), which appears only as an image in the source and is not reproduced here.
the circularity determination is specifically as follows:
the circularity $S_C$ is defined as
$$S_C = \frac{4\pi s}{p^2} \qquad (13)$$
where s is the area of a red or green region block in the traffic light image and p is the perimeter of that block;
let the circularity threshold be $T_C$; equation (14), which appears only as an image in the source, compares $S_C$ with $T_C$ to decide whether the region block is a valid traffic light.
a vision-blurred scene traffic light recognition system for implementing the vision-blurred scene traffic light recognition method described above, comprising:
an image segmentation module, used for segmenting the acquired traffic light images;
a traffic light chromaticity direction angle calibration module, used for performing traffic light recognition on the segmented clear image sequence, classifying by red and green color, and then calibrating the traffic light chromaticity direction angle;
a traffic light chromaticity movement direction map acquisition module, used for obtaining the traffic light chromaticity movement direction of the blurred image sequence from several frames at the end of the clear image sequence using the geometric mean logarithmic chromaticity space, and building the traffic light chromaticity movement direction map;
and a traffic light identification module, used for building a traffic light classifier from the chromaticity movement direction map, performing traffic light recognition on the blurred image sequence, and checking the validity of the recognition result.
An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method as described above when executing the program.
The invention has the following beneficial effects: by calibrating the chromaticity direction angle on the clear image sequence and acquiring the traffic light chromaticity movement direction map of the blurred image sequence, traffic lights can be recognized accurately in rain or in other blurred scenes such as a contaminated camera.
Drawings
Fig. 1 is a schematic diagram of traffic light identification in a visual fuzzy scene according to an embodiment of the present invention;
fig. 2 is a flow chart of traffic light identification in a visual fuzzy scene according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings of the embodiments of the present disclosure. It is to be understood that the described embodiments are only a few embodiments of the present disclosure, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the disclosure without any inventive step, are within the scope of protection of the disclosure.
Referring to fig. 1 and fig. 2, a traffic light recognition method under a visually blurred scene includes the following steps:
S1, collecting traffic light images and segmenting them; the traffic light images comprise a clear image sequence and a blurred image sequence;
S2, performing traffic light recognition on the segmented clear image sequence, classifying by red and green color, and then calibrating the traffic light chromaticity direction angle;
S3, for the blurred image sequence, obtaining the traffic light chromaticity movement direction from the last several frames of the blurred image sequence using the geometric mean logarithmic chromaticity space, and building a traffic light chromaticity movement direction map;
S4, building a traffic light classifier from the chromaticity movement direction map, performing traffic light recognition on the blurred image sequence, and checking the validity of the recognition result.
Further, the image segmentation in S1 is specifically performed as follows:
identify the road in the traffic light image, calibrate the vanishing point of the road, divide the image along the horizontal line through the vanishing point as the horizon, and select the area above the horizon as the traffic light recognition area.
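As a concrete illustration of this segmentation step, the short sketch below crops everything above a given horizon row. It assumes the vanishing-point row has already been estimated by some upstream step (for example from lane-line intersections), which is not shown; the function name and signature are illustrative.

```python
import numpy as np


def traffic_light_roi(image: np.ndarray, horizon_row: int) -> np.ndarray:
    """Return the traffic light recognition area: the part of the image
    above the horizontal line through the road vanishing point."""
    # Clamp the horizon row so the crop always keeps at least one row.
    horizon_row = int(max(1, min(horizon_row, image.shape[0])))
    return image[:horizon_row]
```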
Further, S2 includes acquiring the standard two-dimensional chromaticity of the clear image sequence; the clear image sequence is classified into red light images and green light images, and the chromaticity direction angles corresponding to the red and green light images are obtained by computing their minimum entropy curves, thereby completing the traffic light chromaticity direction angle calibration.
Further, the standard two-dimensional chromaticity of the clear image sequence is denoted $P_k$ [the defining formula appears only as an image in the source and is not reproduced here], where k indexes the pixels;
the red light images and green light images are expressed as $P_R = (R_{Ri}, G_{Ri}, B_{Ri})$ and $P_G = (R_{Gi}, G_{Gi}, B_{Gi})$ respectively, where $i = 1, 2, 3, \ldots, N$;
the minimum entropy curve is solved as follows:
1) compute the entropy:
$$\eta = -\sum_i P_i(P_\theta)\,\ln\big(P_i(P_\theta)\big) \qquad (1)$$
where $\eta$ is the entropy value, $P_\theta$ is the projection of the two-dimensional logarithmic chromaticity $P_k$ onto the projection direction, and $P_i$ is the probability that a gray value of $P_\theta$ falls into the i-th bin of its gray histogram;
[equation (2), which fixes the histogram binning from the image size, appears only as an image in the source and is not reproduced here]
where W is the size of the image;
the more bins the gray histogram of $P_\theta$ contains, the smaller each $P_i$ and the larger the entropy $\eta$; the chromaticity directions and minimum entropy curves of the red light images and green light images are obtained from equations (1) and (2), which yields the chromaticity direction angle $\theta$ corresponding to the red light images and green light images.
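A minimal sketch of this entropy-minimisation search is shown below, assuming the two-dimensional log-chromaticity points are already available as an N by 2 NumPy array. The fixed bin count stands in for the bin rule of equation (2), which is not reproduced in the source; the function and parameter names are illustrative.

```python
import numpy as np


def min_entropy_angle(log_chroma_2d: np.ndarray,
                      n_angles: int = 180,
                      n_bins: int = 64) -> tuple:
    """Sweep candidate projection directions over the 2-D log-chromaticity
    points and return (theta_deg, entropy_curve), where theta_deg is the
    angle whose projected histogram has minimum Shannon entropy (eq. (1))."""
    angles = np.deg2rad(np.arange(n_angles))
    entropies = np.empty(n_angles)
    for i, theta in enumerate(angles):
        direction = np.array([np.cos(theta), np.sin(theta)])
        projected = log_chroma_2d @ direction            # P_theta: 1-D projection
        hist, _ = np.histogram(projected, bins=n_bins)
        p = hist[hist > 0] / hist.sum()                  # P_i: bin probabilities
        entropies[i] = -np.sum(p * np.log(p))            # eta = -sum P_i ln P_i
    theta_deg = float(np.rad2deg(angles[int(np.argmin(entropies))]))
    return theta_deg, entropies
```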
Further, the geometric mean logarithmic chromaticity space in S3 is obtained as follows: take the geometric mean of the R, G and B channels of the blurred image sequence, compute the chromaticity from this geometric mean, and then take the logarithm of the chromaticity to obtain the geometric mean logarithmic chromaticity space;
the traffic light chromaticity movement direction map is obtained as follows: represent the geometric mean logarithmic chromaticity space as a ρ-space vector, transform the ρ space into a two-dimensional χ space through a coordinate transformation, and thereby obtain the motion speed of each pixel of the blurred image sequence perpendicular to the chromaticity direction angle as well as the pixel's motion speed, from which the traffic light chromaticity movement direction map is obtained.
Further, the geometric mean is $C_{ref}$:
$$C_{ref} = \sqrt[3]{R \cdot G \cdot B} \qquad (3)$$
where R, G, B are the RGB channel values of a traffic light image in the blurred image sequence;
the chromaticity is $C_k$:
$$C_k = \frac{(R, G, B)}{C_{ref}} \qquad (4)$$
taking the logarithm of the chromaticity $C_k$ gives the geometric mean logarithmic chromaticity space $P_k$:
$$P_k = \ln(C_k) \qquad (5)$$
the ρ-space vector of the matrix $P_k$ is denoted ρ and is orthogonal to the vector $\tfrac{1}{\sqrt{3}}(1, 1, 1)^T$;
the coordinate transformation maps the ρ space into the two-dimensional χ space:
$$\chi = U\rho \qquad (6)$$
where χ is a two-dimensional column vector and $U = [v_1, v_2]^T$;
[equations (7) and (8), which define $v_1$ and $v_2$, appear only as images in the source and are not reproduced here]
where $v_1$, $v_2$ are the motion speeds of a pixel parallel and perpendicular to the chromaticity direction angle, and $v_X$, $v_Y$ are the motion speeds of the pixel;
the traffic light chromaticity movement direction map is then expressed by equation (9), which likewise appears only as an image in the source.
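The sketch below illustrates equations (3) to (6) under stated assumptions: the per-pixel geometric mean, the chromaticity, its logarithm, and a projection to two dimensions. Since the matrix U = [v1, v2]^T is shown only as an image in the source, the particular orthonormal basis used here is an assumption (any orthonormal basis of the plane orthogonal to (1, 1, 1) differs from it only by a rotation); the function names and the epsilon guard are likewise illustrative.

```python
import numpy as np


def geometric_mean_log_chroma(image: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Per-pixel geometric-mean log chromaticity P_k = ln(C / C_ref),
    where C_ref is the cube root of R*G*B (cf. equations (3)-(5))."""
    rgb = image.astype(np.float64) + eps                 # avoid log(0)
    c_ref = np.cbrt(rgb[..., 0] * rgb[..., 1] * rgb[..., 2])
    return np.log(rgb / c_ref[..., None])                # shape (H, W, 3)


def project_to_chi(log_chroma: np.ndarray) -> np.ndarray:
    """Project the 3-D log-chromaticity onto the plane orthogonal to
    (1, 1, 1)/sqrt(3), giving 2-D chi coordinates (cf. equation (6));
    this basis is an assumed stand-in for the patent's U matrix."""
    u = np.array([[1 / np.sqrt(2), -1 / np.sqrt(2), 0.0],
                  [1 / np.sqrt(6),  1 / np.sqrt(6), -2 / np.sqrt(6)]])
    return log_chroma @ u.T                              # shape (H, W, 2)
```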
further, the traffic light classifier establishing method in S4 includes: fitting a manually calibrated region sample set such as red, green and the like by adopting a normal distribution model to obtain confidence intervals corresponding to the red and green lights respectively so as to obtain a traffic light classifier;
and (5) checking validity, and judging the circularity of the specific position.
Further, the traffic light classifier is obtained as follows:
taking the central 90% of the fitted normal distribution of the red light region as the red light chromaticity feature, the red light confidence interval is
$$[\lambda_1, \lambda_2] = [u_R - 1.65\sigma_R,\; u_R + 1.65\sigma_R] \qquad (10)$$
taking the central 90% of the fitted normal distribution of the green light region as the green light chromaticity feature, the green light confidence interval is
$$[\lambda_3, \lambda_4] = [u_G - 1.65\sigma_G,\; u_G + 1.65\sigma_G] \qquad (11)$$
where $u_R$, $u_G$ are the movement starting points of the pixel movement direction maps corresponding to the red light images and green light images respectively, and $\sigma_R$, $\sigma_G$ are the standard deviations of the corresponding fitted distributions;
the traffic light identification classifier is then given by equation (12), which appears only as an image in the source and is not reproduced here.
the circularity determination is specifically as follows:
the circularity $S_C$ is defined as
$$S_C = \frac{4\pi s}{p^2} \qquad (13)$$
where s is the area of a red or green region block in the traffic light image and p is the perimeter of that block;
let the circularity threshold be $T_C$; equation (14), which appears only as an image in the source, compares $S_C$ with $T_C$ to decide whether the region block is a valid traffic light.
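A minimal sketch of the classification and validity check follows, assuming the per-class chromaticity-shift samples have already been collected from manually calibrated regions. The interval test stands in for equation (12), whose exact decision rule appears only as an image in the source; the "unknown" fallback and the default circularity threshold are assumptions.

```python
import numpy as np


def fit_confidence_interval(samples: np.ndarray, z: float = 1.65) -> tuple:
    """Fit a normal distribution and return the central ~90% interval
    [u - 1.65*sigma, u + 1.65*sigma] (cf. equations (10)-(11))."""
    u, sigma = float(samples.mean()), float(samples.std())
    return u - z * sigma, u + z * sigma


def classify_light(value: float, red_iv: tuple, green_iv: tuple) -> str:
    """Interval test standing in for the classifier of equation (12)."""
    if red_iv[0] <= value <= red_iv[1]:
        return "red"
    if green_iv[0] <= value <= green_iv[1]:
        return "green"
    return "unknown"


def is_circular(area: float, perimeter: float, threshold: float = 0.7) -> bool:
    """Validity check: circularity S_C = 4*pi*s / p**2 against a threshold T_C
    (cf. equations (13)-(14)); the default threshold is illustrative."""
    s_c = 4.0 * np.pi * area / (perimeter ** 2)
    return s_c >= threshold
```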
a vision-blurred scene traffic light recognition system for implementing the vision-blurred scene traffic light recognition method described above, comprising:
an image segmentation module, used for segmenting the acquired traffic light images;
a traffic light chromaticity direction angle calibration module, used for performing traffic light recognition on the segmented clear image sequence, classifying by red and green color, and then calibrating the traffic light chromaticity direction angle;
a traffic light chromaticity movement direction map acquisition module, used for obtaining the traffic light chromaticity movement direction of the blurred image sequence from several frames at the end of the clear image sequence using the geometric mean logarithmic chromaticity space, and building the traffic light chromaticity movement direction map;
and a traffic light identification module, used for building a traffic light classifier from the chromaticity movement direction map, performing traffic light recognition on the blurred image sequence, and checking the validity of the recognition result.
An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method as described above when executing the program.
The above description is only an embodiment of the present invention and is not intended to limit the scope of the present invention; any equivalent structural or process modification made using the contents of this specification and the accompanying drawings, whether applied directly or indirectly in other related technical fields, likewise falls within the scope of protection of the present invention.

Claims (10)

1. A traffic light identification method under a visually blurred scene, characterized in that the method comprises the following steps:
S1, collecting traffic light images and segmenting them; the traffic light images comprise a clear image sequence and a blurred image sequence;
S2, performing traffic light recognition on the segmented clear image sequence, classifying by red and green color, and then calibrating the traffic light chromaticity direction angle;
S3, for the blurred image sequence, obtaining the traffic light chromaticity movement direction from the last several frames of the blurred image sequence using the geometric mean logarithmic chromaticity space, and building a traffic light chromaticity movement direction map;
S4, building a traffic light classifier from the chromaticity movement direction map, performing traffic light recognition on the blurred image sequence, and checking the validity of the recognition result.
2. The method for identifying traffic lights in a visually blurred scene as claimed in claim 1, characterized in that the image segmentation in S1 is specifically performed as follows:
identifying the road in the traffic light image, calibrating the vanishing point of the road, dividing the image along the horizontal line through the vanishing point as the horizon, and selecting the area above the horizon as the traffic light recognition area.
3. The method for identifying traffic lights in a visually blurred scene as claimed in claim 1, characterized in that S2 includes acquiring the standard two-dimensional chromaticity of the clear image sequence; the clear image sequence is classified into red light images and green light images, and the chromaticity direction angles corresponding to the red and green light images are obtained by computing their minimum entropy curves, thereby completing the traffic light chromaticity direction angle calibration.
4. The method for identifying traffic lights in a visually blurred scene as claimed in claim 3, characterized in that the standard two-dimensional chromaticity of the clear image sequence is denoted $P_k$ [the defining formula appears only as an image in the source and is not reproduced here], where k indexes the pixels;
the red light images and green light images are expressed as $P_R = (R_{Ri}, G_{Ri}, B_{Ri})$ and $P_G = (R_{Gi}, G_{Gi}, B_{Gi})$ respectively, where $i = 1, 2, 3, \ldots, N$;
the minimum entropy curve is solved as follows:
1) compute the entropy:
$$\eta = -\sum_i P_i(P_\theta)\,\ln\big(P_i(P_\theta)\big) \qquad (1)$$
where $\eta$ is the entropy value, $P_\theta$ is the projection of the two-dimensional logarithmic chromaticity $P_k$ onto the projection direction, and $P_i$ is the probability that a gray value of $P_\theta$ falls into the i-th bin of its gray histogram;
[equation (2), which fixes the histogram binning from the image size, appears only as an image in the source and is not reproduced here]
where W is the size of the image;
the more bins the gray histogram of $P_\theta$ contains, the smaller each $P_i$ and the larger the entropy $\eta$; the chromaticity directions and minimum entropy curves of the red light images and green light images are obtained from equations (1) and (2), which yields the chromaticity direction angle $\theta$ corresponding to the red light images and green light images.
5. The method for identifying traffic lights in a visually blurred scene as claimed in claim 1, characterized in that the geometric mean logarithmic chromaticity space in S3 is obtained as follows: taking the geometric mean of the R, G and B channels of the blurred image sequence, computing the chromaticity from this geometric mean, and then taking the logarithm of the chromaticity to obtain the geometric mean logarithmic chromaticity space;
the traffic light chromaticity movement direction map is obtained as follows: representing the geometric mean logarithmic chromaticity space as a ρ-space vector, transforming the ρ space into a two-dimensional χ space through a coordinate transformation, and thereby obtaining the motion speed of each pixel of the blurred image sequence perpendicular to the chromaticity direction angle as well as the pixel's motion speed, from which the traffic light chromaticity movement direction map is obtained.
6. The method for identifying traffic lights in a visually blurred scene as claimed in claim 5, characterized in that the geometric mean is $C_{ref}$:
$$C_{ref} = \sqrt[3]{R \cdot G \cdot B} \qquad (3)$$
where R, G, B are the RGB channel values of a traffic light image in the blurred image sequence;
the chromaticity is $C_k$:
$$C_k = \frac{(R, G, B)}{C_{ref}} \qquad (4)$$
taking the logarithm of the chromaticity $C_k$ gives the geometric mean logarithmic chromaticity space $P_k$:
$$P_k = \ln(C_k) \qquad (5)$$
the ρ-space vector of the matrix $P_k$ is denoted ρ and is orthogonal to the vector $\tfrac{1}{\sqrt{3}}(1, 1, 1)^T$;
the coordinate transformation maps the ρ space into the two-dimensional χ space:
$$\chi = U\rho \qquad (6)$$
where χ is a two-dimensional column vector and $U = [v_1, v_2]^T$;
[equations (7) and (8), which define $v_1$ and $v_2$, appear only as images in the source and are not reproduced here]
where $v_1$, $v_2$ are the motion speeds of a pixel parallel and perpendicular to the chromaticity direction angle, and $v_X$, $v_Y$ are the motion speeds of the pixel;
the traffic light chromaticity movement direction map is then expressed by equation (9), which likewise appears only as an image in the source.
7. The method for identifying traffic lights in a visually blurred scene as claimed in claim 1, characterized in that the traffic light classifier in S4 is built as follows: fitting a normal distribution model to the manually calibrated sample sets of the red and green regions to obtain the confidence intervals corresponding to the red and green lights, which yields the traffic light classifier;
the validity check judges the circularity of the detected region.
8. The method as claimed in claim 7, characterized in that the traffic light classifier is obtained as follows:
taking the central 90% of the fitted normal distribution of the red light region as the red light chromaticity feature, the red light confidence interval is
$$[\lambda_1, \lambda_2] = [u_R - 1.65\sigma_R,\; u_R + 1.65\sigma_R] \qquad (10)$$
taking the central 90% of the fitted normal distribution of the green light region as the green light chromaticity feature, the green light confidence interval is
$$[\lambda_3, \lambda_4] = [u_G - 1.65\sigma_G,\; u_G + 1.65\sigma_G] \qquad (11)$$
where $u_R$, $u_G$ are the movement starting points of the pixel movement direction maps corresponding to the red light images and green light images respectively, and $\sigma_R$, $\sigma_G$ are the standard deviations of the corresponding fitted distributions;
the traffic light identification classifier is then given by equation (12), which appears only as an image in the source and is not reproduced here.
the circularity determination is specifically as follows:
the circularity $S_C$ is defined as
$$S_C = \frac{4\pi s}{p^2} \qquad (13)$$
where s is the area of a red or green region block in the traffic light image and p is the perimeter of that block;
let the circularity threshold be $T_C$; equation (14), which appears only as an image in the source, compares $S_C$ with $T_C$ to decide whether the region block is a valid traffic light.
9. A traffic light recognition system under a visually blurred scene for implementing the traffic light recognition method under a visually blurred scene according to any one of claims 1 to 8, characterized in that the system comprises:
an image segmentation module, used for segmenting the acquired traffic light images;
a traffic light chromaticity direction angle calibration module, used for performing traffic light recognition on the segmented clear image sequence, classifying by red and green color, and then calibrating the traffic light chromaticity direction angle;
a traffic light chromaticity movement direction map acquisition module, used for obtaining the traffic light chromaticity movement direction of the blurred image sequence from several frames at the end of the clear image sequence using the geometric mean logarithmic chromaticity space, and building the traffic light chromaticity movement direction map;
and a traffic light identification module, used for building a traffic light classifier from the chromaticity movement direction map, performing traffic light recognition on the blurred image sequence, and checking the validity of the recognition result.
10. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, implements the steps of the method as claimed in any one of claims 1 to 8.
CN202110710218.6A 2021-06-25 2021-06-25 Traffic light identification system and method under visual fuzzy scene Pending CN113361453A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110710218.6A CN113361453A (en) 2021-06-25 2021-06-25 Traffic light identification system and method under visual fuzzy scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110710218.6A CN113361453A (en) 2021-06-25 2021-06-25 Traffic light identification system and method under visual fuzzy scene

Publications (1)

Publication Number Publication Date
CN113361453A true CN113361453A (en) 2021-09-07

Family

ID=77536450

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110710218.6A Pending CN113361453A (en) 2021-06-25 2021-06-25 Traffic light identification system and method under visual fuzzy scene

Country Status (1)

Country Link
CN (1) CN113361453A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102663345A (en) * 2012-03-07 2012-09-12 中盟智能科技(苏州)有限公司 Method and apparatus for automatic identification of traffic lights
CN112989956A (en) * 2021-02-20 2021-06-18 潍柴动力股份有限公司 Traffic light identification method and system based on region of interest and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
宋永超 (Song Yongchao) et al., "基于色度方向角的离线城市红绿灯识别算法" [Off-line urban traffic light recognition algorithm based on chromaticity direction angle], 《北京交通大学学报》 (Journal of Beijing Jiaotong University), no. 02, 15 April 2019, pages 75-77 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210907