CN110858281B - Image processing method, image processing device, electronic eye and storage medium - Google Patents

Image processing method, image processing device, electronic eye and storage medium

Info

Publication number
CN110858281B
Authority
CN
China
Prior art keywords
characteristic
characteristic value
chrominance
pixel point
preset
Prior art date
Legal status
Active
Application number
CN201810961993.7A
Other languages
Chinese (zh)
Other versions
CN110858281A (en)
Inventor
成东峻
Current Assignee
Zhejiang Uniview Technologies Co Ltd
Original Assignee
Zhejiang Uniview Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Uniview Technologies Co Ltd filed Critical Zhejiang Uniview Technologies Co Ltd
Priority to CN201810961993.7A
Publication of CN110858281A
Application granted
Publication of CN110858281B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54 Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141 Control of illumination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/24 Aligning, centring, orientation detection or correction of the image
    • G06V10/245 Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/62 Text, e.g. of license plates, overlay texts or captions on TV images
    • G06V20/625 License plates

Abstract

The embodiments of the invention provide an image processing method, an image processing device, an electronic eye and a storage medium, relating to the field of image processing. The method comprises the following steps: calculating the HSV components of each pixel point in a target image; obtaining a luminance characteristic value and a chrominance characteristic value for each pixel point from its HSV components; generating an original characteristic distribution matrix for the target image from the luminance and chrominance characteristic values of all pixel points, the original characteristic distribution matrix being a two-dimensional matrix formed over the luminance and chrominance characteristic values; and processing the original characteristic distribution matrix with a preset template matrix and a preset normalization scale to obtain a target characteristic distribution matrix. The image processing method, image processing device, electronic eye and storage medium provided by the embodiments of the invention can improve the efficiency of penalty judgment during violation detection.

Description

Image processing method, image processing device, electronic eye and storage medium
Technical Field
The present invention relates to the field of image processing, and in particular, to an image processing method, an image processing apparatus, an electronic eye, and a storage medium.
Background
In intelligent transportation schemes, a large number of electronic eyes must be deployed for automatic snapshotting of vehicle violations at intersections. The intelligent algorithm of an electronic eye generally comprises a vehicle capturing module, a tracking module, a traffic light identification module and a behavior analysis module. In a typical application scene, a vehicle capturing area is arranged at the bottom of the camera view; the tracking module then tracks the passing track in real time, and the behavior analysis module judges whether a violation has occurred by combining the traffic light state and selects a reasonable moment to shoot evidence pictures, which form the basis for actual punishment.
Disclosure of Invention
The invention aims to provide an image processing method, an image processing device, an electronic eye and a storage medium, which can improve the efficiency of penalty judgment during violation detection.
In order to achieve the above object, the embodiments of the present invention adopt the following technical solutions:
in a first aspect, an embodiment of the present invention provides an image processing method, where the method includes: calculating HSV components corresponding to each pixel point in the target image; obtaining a brightness characteristic value and a chromaticity characteristic value corresponding to each pixel point according to the HSV component corresponding to each pixel point; generating an original characteristic distribution matrix corresponding to the target image according to the brightness characteristic values and the chrominance characteristic values of all the pixel points, wherein the original characteristic distribution matrix represents a two-dimensional matrix formed by the brightness characteristic values and the chrominance characteristic values; and processing the original characteristic distribution matrix by using a preset template matrix and a preset normalization scale to obtain a target characteristic distribution matrix.
In a second aspect, an embodiment of the present invention provides an image processing apparatus, including: the HSV component calculation module is used for calculating the HSV component corresponding to each pixel point in the target image; the characteristic value generation module is used for obtaining a brightness characteristic value and a chroma characteristic value which correspond to each pixel point according to the HSV component which corresponds to each pixel point; the original characteristic matrix generating module is used for generating an original characteristic distribution matrix corresponding to the target image according to the brightness characteristic value and the chrominance characteristic value of all the pixel points; and the target characteristic matrix generation module is used for processing the original characteristic distribution matrix by using a preset template matrix and a preset normalization scale to obtain a target characteristic distribution matrix.
In a third aspect, an embodiment of the present invention provides an electronic eye, including a memory for storing one or more programs, and a processor; the one or more programs, when executed by the processor, implement the image processing method described above.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the above-mentioned image processing method.
Compared with the prior art, the image processing method, image processing device, electronic eye and storage medium provided by the embodiments of the invention generate the target characteristic distribution matrix of the target image by combining the luminance and chrominance characteristic values of all pixel points in the target image, so that the electronic eye can check whether a tracking error exists according to the target characteristic distribution matrix.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
FIG. 1 is a typical set of misjudgment evidence pictures;
FIG. 2 shows a schematic block diagram of an electronic eye provided by an embodiment of the invention;
FIG. 3 shows a schematic flow chart of an image processing method provided by an embodiment of the invention;
FIG. 4 is a schematic diagram of target image extraction;
FIG. 5 is a schematic flow chart of the substeps of step S400 in FIG. 3;
FIG. 6 is a schematic flow diagram of sub-steps of sub-step S420 of FIG. 5;
FIG. 7 is a schematic diagram of an original feature distribution matrix;
FIG. 8 is a schematic flow chart of the substeps of step S700 in FIG. 3;
FIG. 9 is a schematic diagram of a target feature distribution matrix;
FIG. 10 is a schematic diagram of the segmentation of the flaring region;
fig. 11 is a schematic configuration diagram showing an image processing apparatus according to an embodiment of the present invention;
FIG. 12 is a schematic block diagram of an object feature matrix generation module of an image processing apparatus according to an embodiment of the present invention;
fig. 13 is a schematic block diagram showing a data normalization processing module of an image processing apparatus according to an embodiment of the present invention;
fig. 14 is a schematic structural diagram illustrating a chromaticity standard processing unit of an image processing apparatus according to an embodiment of the present invention.
In the figure: 10-an electronic eye; 20-an image processing device; 110-a memory; 120-a processor; 130-a memory controller; 140-peripheral interfaces; 150-a radio frequency unit; 160-communication bus/signal line; 170-a camera unit; 200-an image dividing module; 300-HSV component calculation module; 400-a feature value generation module; 500-data standardization processing module; 510-a brightness criterion processing unit; 520-a chrominance standard processing unit; 521-a judgment subunit; 522-a first chrominance processing subunit; 523-second chroma processing subunit; 600-a eigenvalue update module; 700-original feature matrix generation module; 800-a target feature matrix generation module; 810-a convolution calculation unit; 820-a normalization scale calculation unit.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of protection of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present invention, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
It should be noted that, in this document, relational terms such as first and second are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of another identical element in the process, method, article, or apparatus that comprises it.
Some embodiments of the invention are described in detail below with reference to the accompanying drawings. The embodiments and features of the embodiments described below can be combined with each other without conflict.
In the prior art, violation judgments such as running a red light or not driving in the designated lane require determining the final driving direction of the vehicle, so the vehicle must be tracked until it drives away from the intersection, making the tracking distance long. On the other hand, intersection scenes are complex: uneven illumination, pedestrian occlusion, occlusion by vehicles behind or crossing, traffic signs, irregular intersections and so on may all cause tracking errors. Overall, the risk of tracking errors is high, easily causing violation misjudgments or discarded evidence pictures.
For example, fig. 1 is a typical set of misjudgment evidence pictures: the car in the target frame travels in the middle straight lane in fig. 1 (a), 1 (b) and 1 (c). However, due to a tracking error, the tracking target of the electronic eye changes from the car shown in fig. 1 (a) and fig. 1 (b) to the bus shown in fig. 1 (c). Since the bus in fig. 1 (c) is turning left at that moment, the electronic eye, combining the three pictures of fig. 1, misjudges that the car turned left from a straight lane.
Such misjudgment arises because, in the prior art, the feature matrix of a picture is generated using only the luminance characteristic value or only the chrominance characteristic value. As a result, when the step size is reduced to improve target resolution, the ability to resist global changes in luminance or chrominance is poor; conversely, when the anti-interference requirement is met, the target resolution is low.
Based on the above drawbacks of the prior art, the inventor provides the following solution: a target characteristic distribution matrix for the target image is generated by combining the luminance and chrominance characteristic values of all pixel points in the target image. On the premise of meeting the set target resolution, longer luminance and chrominance characteristic step sizes can then be adopted, improving resistance to illumination interference and thus the validity of penalty decisions during violation checking.
Specifically, referring to fig. 2, fig. 2 shows a schematic structural diagram of an electronic eye 10 according to an embodiment of the present invention, in the embodiment of the present invention, the electronic eye 10 includes a memory 110, a storage controller 130, one or more processors (only one is shown in the figure) 120, a peripheral interface 140, a radio frequency unit 150, a camera unit 170, and the like. These components communicate with each other via one or more communication buses/signal lines 160.
The memory 110 can be used for storing software programs and modules, such as program instructions/modules corresponding to the image processing apparatus 20 provided by the embodiment of the present invention, and the processor 120 executes various functional applications and image processing, such as the image processing method provided by the embodiment of the present invention, by running the software programs and modules stored in the memory 110.
The Memory 110 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The processor 120 may be an integrated circuit chip having signal processing capabilities. The Processor 120 may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), a voice Processor, a video Processor, and so on; but also be a digital signal processor, an application specific integrated circuit, a field programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components. The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor or the processor 120 may be any conventional processor or the like.
The peripherals interface 140 couples various input/output devices to the processor 120 and to the memory 110. In some embodiments, the peripheral interface 140, the processor 120, and the memory controller 130 may be implemented in a single chip. In other embodiments of the present invention, they may be implemented by separate chips.
The rf unit 150 is configured to receive and transmit electromagnetic waves, and achieve interconversion between the electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices.
The camera unit 170 is used to take a picture so that the processor 120 processes the taken picture.
It will be appreciated that the configuration shown in fig. 2 is merely illustrative and that the electronic eye 10 may include more or fewer components than shown in fig. 2, or have a different configuration than shown in fig. 2. The components shown in fig. 2 may be implemented in hardware, software, or a combination thereof.
After the electronic eye 10 analyzes and judges that a motor vehicle has committed a violation, it selects a reasonable moment to snapshot evidence pictures, such as the three shown in fig. 1. After the evidence pictures are captured, they are tracked and verified: it is checked whether the three snapshot evidence pictures capture the same motor vehicle, so as to avoid misjudging a violation. The snapshot evidence pictures are retained only when the violation is confirmed, forming the basis for actual punishment.
The tracking and checking method for the obtained evidence graph generally comprises the following steps: firstly, image processing is carried out on images in the target frames in the three images to obtain feature distribution matrixes in the target frames in the three images, and then whether the target frames of the three images track the same motor vehicle or not is judged according to the feature distribution matrixes of the target frames in the three images.
Specifically, referring to fig. 3, fig. 3 shows a schematic flowchart of an image processing method according to an embodiment of the present invention, in which the image processing method includes the following steps:
step S200, calculating HSV components corresponding to each pixel point in the target image.
When the captured evidence graph is processed, firstly, a target image is extracted from an original image by using a target frame, as shown in fig. 4, and then HSV components corresponding to each pixel point in the target image are calculated.
In the schematic diagram shown in fig. 4, a two-dimensional coordinate system is established in the original image, with the upper left corner of the original image as the origin of coordinates, the positive x axis pointing right along the drawing and the positive y axis pointing down. The coordinates of the target box can then be expressed as: A = {(x0, y0), (x1, y1)}.
In the tracking algorithm, if the electronic eye 10 uses an algorithm that tracks a target frame, such as KCF or MOT, the target frame can be used directly; if it uses a point-tracking algorithm such as LK, the license plate width needs to be estimated according to the position, in which case:

x0 = x - k1 * L(y),    x1 = x + k1 * L(y),
y0 = y - k2 * L(y),    y1 = y + k3 * L(y),

where (x, y) is the tracking point coordinate of the electronic eye 10, L(y) is the license plate width at row y, which can be estimated from the calibrated lane lines and the license plate detection and recognition results, and k1, k2 and k3 are the preset horizontal, upper-edge and lower-edge expansion ratios respectively. The values of k1, k2 and k3 depend on the actual application scene; taking the LK algorithm as an example, the tracking point is generally the license plate center.
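As an illustrative sketch, the expansion of a license-plate tracking point into a target frame described above can be written as follows; the function and parameter names are assumptions, not the patent's implementation:

```python
def estimate_target_box(x, y, plate_width, k1, k2, k3):
    """Expand a license-plate tracking point (x, y) into a target box.

    plate_width is L(y), the estimated license plate width at row y;
    k1, k2 and k3 are the preset horizontal, upper-edge and lower-edge
    expansion ratios. All names here are illustrative assumptions.
    """
    x0 = x - k1 * plate_width  # expand horizontally around the plate center
    x1 = x + k1 * plate_width
    y0 = y - k2 * plate_width  # expand upward from the plate
    y1 = y + k3 * plate_width  # expand downward (vehicle body extends below)
    return (x0, y0), (x1, y1)
```

For instance, with k1 = 2.0, k2 = 1.5 and k3 = 3.0, a tracking point at (100, 200) with a 40-pixel plate width yields the box {(20, 140), (180, 320)}.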
Generally, the camera of the electronic eye 10 outputs an 8-bit YUV picture, so the target image is first converted into 8-bit RGB format and then into HSV format to obtain the HSV components.
Specifically, assume the coordinates of any pixel point in the target frame are (x, y), its YUV components are Y_YUV(x, y), U_YUV(x, y) and V_YUV(x, y), and its HSV components are H_HSV(x, y), S_HSV(x, y) and V_HSV(x, y). The HSV components are obtained as follows:
firstly, converting into RGB format:

R_rgb(x, y) = Y_YUV(x, y) + 1.402 * (V_YUV(x, y) - 128),
G_rgb(x, y) = Y_YUV(x, y) - 0.344 * (U_YUV(x, y) - 128) - 0.714 * (V_YUV(x, y) - 128),
B_rgb(x, y) = Y_YUV(x, y) + 1.772 * (U_YUV(x, y) - 128),

with each result clipped to the 8-bit range [0, 255].
then, the intermediate quantities TmpMax, TmpMin and TmpDelta are calculated as follows:
TmpMax = max(R_rgb(x, y), G_rgb(x, y), B_rgb(x, y)),
TmpMin = min(R_rgb(x, y), G_rgb(x, y), B_rgb(x, y)),
TmpDelta = TmpMax - TmpMin,
then the HSV components are obtained as:

H_HSV(x, y) = 0, if TmpDelta = 0;
H_HSV(x, y) = (60 * (G_rgb(x, y) - B_rgb(x, y)) / TmpDelta) mod 360, if TmpMax = R_rgb(x, y);
H_HSV(x, y) = 60 * ((B_rgb(x, y) - R_rgb(x, y)) / TmpDelta + 2), if TmpMax = G_rgb(x, y);
H_HSV(x, y) = 60 * ((R_rgb(x, y) - G_rgb(x, y)) / TmpDelta + 4), if TmpMax = B_rgb(x, y);

S_HSV(x, y) = 0 if TmpMax = 0, otherwise S_HSV(x, y) = TmpDelta / TmpMax;

V_HSV(x, y) = TmpMax.
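The conversion chain above can be sketched per pixel as follows. The BT.601 full-range YUV-to-RGB coefficients are an assumption, since the patent text only names the conversion steps, and the hue is returned in degrees:

```python
def yuv_to_hsv(y, u, v):
    """Convert one 8-bit YUV pixel to HSV.

    Returns (H in degrees, S in [0, 1], V in [0, 255]). The YUV-to-RGB
    matrix is the common BT.601 full-range form, used here as an
    assumption about the patent's unstated coefficients.
    """
    r = min(max(y + 1.402 * (v - 128), 0), 255)
    g = min(max(y - 0.344 * (u - 128) - 0.714 * (v - 128), 0), 255)
    b = min(max(y + 1.772 * (u - 128), 0), 255)
    tmp_max = max(r, g, b)
    tmp_min = min(r, g, b)
    delta = tmp_max - tmp_min
    if delta == 0:
        h = 0.0                                  # achromatic pixel
    elif tmp_max == r:
        h = (60 * (g - b) / delta) % 360
    elif tmp_max == g:
        h = 60 * ((b - r) / delta + 2)
    else:
        h = 60 * ((r - g) / delta + 4)
    s = 0.0 if tmp_max == 0 else delta / tmp_max  # saturation in [0, 1]
    return h, s, tmp_max
```

A neutral gray pixel (Y = 128, U = V = 128) maps to hue 0, saturation 0 and value 128, as expected for an achromatic input.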
referring to fig. 3, in step S300, the brightness characteristic value and the chromaticity characteristic value corresponding to each pixel point are obtained according to the HSV component corresponding to each pixel point.
After obtaining the HSV components of each pixel point according to step S200, consider for example a pixel point (x, y) with HSV components H_HSV(x, y), S_HSV(x, y) and V_HSV(x, y). Among these, V_HSV(x, y) is the luminance characteristic value of the pixel point, and H_HSV(x, y) is its chrominance characteristic value.
Correspondingly, after traversing respective HSV components of all the pixel points, the brightness characteristic value and the chromaticity characteristic value corresponding to each pixel point are obtained.
Generally, the luminance and chrominance characteristic values obtained in step S300 are spread over a large range of values because no preprocessing has been performed. To this end, as an embodiment, after performing step S300, the image processing method further includes the following steps:
step S400, respectively carrying out data standardization processing on the brightness characteristic value and the chrominance characteristic value which respectively correspond to each pixel point to obtain a processed brightness characteristic value and a processed chrominance characteristic value.
Since each pixel includes a luminance characteristic value and a chrominance characteristic value, as an implementation manner, please refer to fig. 5, fig. 5 is a schematic flow chart of the sub-steps of step S400 in fig. 3, in an embodiment of the present invention, step S400 includes the following sub-steps:
and a substep S410, processing the brightness characteristic value corresponding to each pixel point according to a preset brightness characteristic step length to obtain a processed brightness characteristic value.
When performing data standardization on the luminance characteristic value, assume the luminance characteristic value of the pixel point (x, y) is V_HSV(x, y) and the preset luminance characteristic step is FtStepV. The luminance characteristic value V_HSV(x, y) is then processed as:

FV(x, y) = floor(V_HSV(x, y) / FtStepV),

where FV(x, y) represents the processed luminance characteristic value. In general, the maximum value of the processed luminance characteristic values FV(x, y) is floor(255 / FtStepV), since the 8-bit luminance value is at most 255.
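A minimal sketch of this quantization step, with assumed names:

```python
def quantize_luminance(v_hsv, ft_step_v):
    """Sub-step S410 (illustrative): map an 8-bit luminance value
    V_HSV(x, y) to its quantized feature FV(x, y) = floor(V / FtStepV)."""
    return int(v_hsv // ft_step_v)
```

With FtStepV = 32, for example, the 256 possible luminance levels collapse into floor(255 / 32) + 1 = 8 bins, which is the coarser step size the scheme relies on for illumination robustness.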
In the substep S420, the chrominance characteristic value corresponding to each pixel point is processed according to the preset chrominance characteristic step length, the preset saturation characteristic threshold value and the saturation characteristic value corresponding to each pixel point, so as to obtain the processed chrominance characteristic value.
The chrominance characteristic value of an image is related to the saturation characteristic value: generally, when the saturation characteristic value is small, the chromaticity is not obvious and the chrominance characteristic value is unreliable; that is, when the saturation is too low, the color is not distinct. As an implementation, in the embodiment of the present invention, when the saturation characteristic value of a pixel point is small, its chrominance characteristic value is directly replaced with a preset chrominance characteristic value.
Assume the preset chrominance characteristic step is FtStepH. The preset chrominance characteristic value FtNumH is the maximum chrominance characteristic value over all pixel points, i.e.

FtNumH = floor(H_max / FtStepH),

where H_max is the maximum hue value (360 in the formulation above), so that low-saturation pixels are gathered at the top bin of the chrominance axis.
Specifically, referring to fig. 6, fig. 6 is a schematic flow chart of sub-steps of sub-step S420 in fig. 5, and in an embodiment of the present invention, sub-step S420 includes the following sub-steps:
in sub-step S421, it is determined whether the saturation characteristic value corresponding to the target pixel point is smaller than a preset saturation characteristic value? When no, perform substep S422; when yes, sub-step S423 is performed.
For the target pixel point (x, y), the corresponding saturation characteristic value among the HSV components is S_HSV(x, y). Assuming the preset saturation characteristic threshold is denoted as S_TH, sub-step S421 compares S_HSV(x, y) against S_TH.
And a substep S422, processing the chrominance characteristic value corresponding to the target pixel point by using a preset chrominance characteristic step length to obtain a processed chrominance characteristic value.
When it is determined according to sub-step S421 that the saturation characteristic value of the target pixel point is greater than or equal to the preset saturation characteristic threshold, the chrominance characteristic value of the target pixel point is processed with the preset chrominance characteristic step FtStepH to obtain the processed chrominance characteristic value, i.e. at this time

FH(x, y) = floor(H_HSV(x, y) / FtStepH).
In the substep S423, the predetermined chrominance characteristic value is used as the processed chrominance characteristic value.
Correspondingly, when it is determined according to the sub-step S421 that the saturation characteristic value corresponding to the target pixel point is smaller than the preset saturation characteristic value, it indicates that the color characteristic of the target pixel point is not obvious at this time, that is, the preset chrominance characteristic value FtNumH is directly used as the processed chrominance characteristic value, that is, the processed chrominance characteristic value FH (x, y) = FtNumH at this time.
In summary, the processing procedure of the sub-step S420 is represented as:
FH(x, y) = floor(H_HSV(x, y) / FtStepH), if S_HSV(x, y) >= S_TH;
FH(x, y) = FtNumH, otherwise,

where FH(x, y) represents the processed chrominance characteristic value of the pixel point (x, y), S_HSV(x, y) represents its saturation characteristic value, and S_TH represents the preset saturation characteristic threshold.
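The branching in sub-step S420 can be sketched as follows; this is a hedged reading of the patent text in which FtNumH serves as a dedicated value for low-saturation pixels, and all names are assumptions:

```python
def quantize_chroma(h_hsv, s_hsv, ft_step_h, s_th, ft_num_h):
    """Sub-step S420 (illustrative): quantize the hue H_HSV(x, y) with
    step FtStepH, unless the saturation is below the threshold S_TH,
    in which case the hue is unreliable and the preset chrominance
    characteristic value FtNumH is used instead."""
    if s_hsv < s_th:
        return ft_num_h  # low saturation: color not obvious
    return int(h_hsv // ft_step_h)
```

With FtStepH = 30 and a saturation threshold of 0.1, a well-saturated hue of 120 degrees lands in bin 4, while the same hue at saturation 0.05 is routed to the preset value instead.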
And step S500, updating the respective brightness characteristic value and the chrominance characteristic value of each pixel point by the processed brightness characteristic value and the processed chrominance characteristic value.
After the processed luminance and chrominance characteristic values are obtained according to step S400, they replace the original luminance and chrominance characteristic values of each pixel point respectively, completing the update.
Referring to fig. 3, in step S600, an original feature distribution matrix corresponding to the target image is generated according to the luminance feature values and the chrominance feature values of all the pixel points.
After the respective luminance characteristic value and chrominance characteristic value of all the pixel points are obtained, since each pixel point includes characteristic values of at least two dimensions (luminance and chrominance), an original characteristic distribution matrix is constructed, as shown in fig. 7, wherein the original characteristic distribution matrix represents a two-dimensional matrix composed of the luminance characteristic values and the chrominance characteristic values, and all the pixel points are arranged in it.
Specifically, in the embodiment of the present invention, the original feature distribution matrix corresponding to the target image is generated from the luminance and chrominance feature values of all the pixel points as follows: each of the pixel points is arranged in the original characteristic distribution matrix according to its respective luminance characteristic value and chrominance characteristic value.
For example, as shown in fig. 7, a two-dimensional characteristic distribution matrix coordinate system is established, in which the horizontal axis represents the chrominance characteristic value and the vertical axis represents the luminance characteristic value; the positive direction of the horizontal axis (to the right) is the direction in which the chrominance characteristic value gradually increases, and the positive direction of the vertical axis (upward) is the direction in which the luminance characteristic value gradually increases. Because any pixel point includes the characteristic values of these two dimensions, the original characteristic distribution matrix corresponding to the target image is obtained after all the pixel points are filled into the coordinate system according to their respective luminance and chrominance characteristic values.
It should be noted that, among the updated characteristic values, the pixel points whose saturation characteristic value is smaller than the preset saturation characteristic threshold take the preset chrominance characteristic value directly as the processed chrominance characteristic value; therefore, in the original characteristic distribution matrix shown in fig. 7, all such pixel points fall in the last column of the matrix, that is, the rightmost column shown in fig. 7.
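A minimal sketch of how such a matrix could be accumulated, assuming the quantized feature values are already available; the bin counts (16 luminance bins, 18 chrominance bins plus the extra preset-value column) are hypothetical, not values from the filing:

```python
import numpy as np

def original_feature_matrix(fy, fh, n_luma_bins=16, n_chroma_bins=19):
    """Accumulate pixels into a 2-D (luminance x chrominance) histogram.

    fy, fh: arrays of quantized luminance / chrominance feature values.
    Column n_chroma_bins - 1 collects the low-saturation pixels that
    received the preset chrominance value (the rightmost column in fig. 7).
    """
    t0 = np.zeros((n_luma_bins, n_chroma_bins), dtype=np.int64)
    for y_val, h_val in zip(fy.ravel(), fh.ravel()):
        t0[y_val, h_val] += 1  # one count per pixel in its (Y, H) cell
    return t0
```

Each pixel contributes exactly one count, so the matrix sums to the number of pixels in the target region.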
And S700, processing the original characteristic distribution matrix by using a preset template matrix and a preset normalization scale to obtain a target characteristic distribution matrix.
Since the original feature distribution matrix obtained in step S600 is generated from the luminance and chrominance feature values of all the pixel points, it is susceptible to the overall luminance and chrominance of the target image, and so is the final image matching result; the original feature distribution matrix therefore needs to be processed to obtain the final target feature distribution matrix.
Specifically, referring to fig. 8, fig. 8 is a schematic flowchart of the sub-steps of step S700 in fig. 3, in an embodiment of the present invention, step S700 includes the following sub-steps:
and a substep S710 of convolving the original feature distribution matrix with a preset template matrix to generate an intermediate feature distribution matrix.
In order to increase the fault tolerance of the target image, a preset template matrix and the original characteristic distribution matrix are adopted for convolution processing so as to generate an intermediate characteristic distribution matrix.
Specifically, assume that the preset template matrix is Tmpl and that the original feature distribution matrix obtained through the optimization of steps S400 and S500 is T_0, as shown in fig. 7. Since the chrominance characteristic values of the last column of pixel points are all the preset chrominance characteristic value, no fault-tolerance problem exists for that column, and it therefore does not participate in the convolution processing.
At this time, the convolution processing is expressed as:

T_1 = T_0 ⊗ Tmpl (applied to all columns of T_0 except the last)

wherein T_1 represents the intermediate feature distribution matrix and ⊗ denotes two-dimensional convolution.
As an embodiment, the preset template matrix Tmpl may take the following values:
(the specific values of Tmpl are given as a formula image in the original publication)
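Since the actual Tmpl values are not recoverable from the text, the sketch below uses a hypothetical 3×3 smoothing template and shows the convolution of sub-step S710 with the last (preset-chrominance) column excluded, as described above:

```python
import numpy as np

# Hypothetical 3x3 smoothing template; the actual Tmpl values are given
# only as a formula image in the original publication.
TMPL = np.array([[1, 2, 1],
                 [2, 4, 2],
                 [1, 2, 1]], dtype=float) / 16.0

def smooth_feature_matrix(t0, tmpl=TMPL):
    """Convolve the feature matrix with the template, keeping the last
    (preset-chrominance) column out of the convolution."""
    core = t0[:, :-1].astype(float)          # last column excluded
    kh, kw = tmpl.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(core, ((ph, ph), (pw, pw)))  # zero padding keeps size
    t1 = np.empty_like(core)
    for i in range(core.shape[0]):
        for j in range(core.shape[1]):
            # sliding-window correlation (equals convolution for a
            # symmetric template)
            t1[i, j] = np.sum(padded[i:i + kh, j:j + kw] * tmpl)
    return np.hstack([t1, t0[:, -1:].astype(float)])
```

Passing a delta template (1 at the center, 0 elsewhere) returns the matrix unchanged, which is a convenient sanity check for the padding and indexing.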
and a substep S720, processing the intermediate characteristic distribution matrix according to a preset normalization scale to obtain a target characteristic distribution matrix.
Because the actual size of the vehicle differs at different positions in the image, the lengths and widths of different target frames also differ. For the convenience of calculation, the intermediate feature distribution matrix therefore needs to be processed with a normalization scale.
Specifically, the intermediate feature distribution matrix is processed with the preset normalization scale as follows:

T = T_1 × HI / (w × h)

wherein w = x_1 − x_0; h = y_1 − y_0; T_1 represents the intermediate feature distribution matrix; HI represents the preset normalization scale; and T represents the target feature distribution matrix. The final target feature distribution matrix obtained is shown in fig. 9.
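Under the reading above (scaling by HI over the target-frame area, so matrices extracted from frames of different sizes become comparable), the normalization reduces to one line; the formula itself is an assumed reconstruction of the filing's formula image:

```python
def normalize_feature_matrix(t1, hi, x0, y0, x1, y1):
    """Scale the intermediate matrix by the preset normalization scale HI
    divided by the target-frame area w*h (assumed reading of the
    filing's normalization formula)."""
    w = x1 - x0  # target-frame width
    h = y1 - y0  # target-frame height
    return t1 * hi / (w * h)
```

When HI equals the frame area, the matrix passes through unchanged, which makes the scaling direction easy to verify.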
Based on the above design, in the image processing method provided in the embodiment of the present invention, a target feature distribution matrix corresponding to a target image is generated by combining luminance feature values and chrominance feature values of all pixels in the target image, so that the electronic eye 10 checks whether a tracking error occurs according to the target feature distribution matrix.
As described above, the electronic eye 10 analyzes and checks whether the snapshot objects in the captured evidence pictures are the same vehicle, generally using three evidence pictures, such as those shown in fig. 1. In the checking process, the feature distribution matrix of the target image in the first evidence picture is first used as the template feature matrix, and the feature distribution matrix of the target image in the second evidence picture is compared with it; when the comparison succeeds, the first evidence picture and the second evidence picture track the same motor vehicle. The comparison succeeds when the absolute value of the difference between the feature distribution matrix of the target image in the second evidence picture and the template feature matrix is smaller than a preset threshold value. Correspondingly, the comparison of the third evidence picture with the second evidence picture proceeds in the same way as that of the second evidence picture with the first.
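One plausible reading of the comparison criterion — the summed absolute difference between the two matrices falling below the preset threshold — can be sketched as:

```python
import numpy as np

def same_vehicle(t_template, t_candidate, threshold):
    """Match succeeds when the summed absolute difference between the two
    target feature distribution matrices is below the preset threshold."""
    return float(np.abs(t_template - t_candidate).sum()) < threshold
```

The threshold value itself is a tuning parameter of the deployment; it is not specified in this part of the text.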
The image processing procedure described above applies generally to the first evidence picture. In general, however, in order to tolerate a certain tracking error, when extracting the feature distribution matrices of the second and third evidence pictures, the target region needs to be extended by a certain amount; the region after extension is referred to as the extension region, as shown in fig. 10, where the region within the extension frame is the extension region and the target region is contained in it.
Therefore, when processing the second evidence graph and the third evidence graph, as an implementation manner, please continue to refer to fig. 3, in an embodiment of the present invention, before performing step S200, the image processing method further includes:
step S100, dividing an original image into a plurality of sub-images.
As shown in the schematic diagram of fig. 10, the area defined by the extension frame is the extension area; during image processing, all the image content included in the extension area is referred to as the original image. When the second evidence picture and the third evidence picture are processed, the original image is first divided into a plurality of sub-images, and the steps S200 to S700 are then performed on each sub-image.
Specifically, as shown in fig. 10, taking the image processing of the second evidence picture as an example, the process of dividing the original image into a plurality of sub-images may be: first, the image range included in the target frame is equally divided into a plurality of sub-images, each of which is referred to as a cell unit; in the schematic diagram shown in fig. 10, the target area includes 3 × 3 cell units. Then, the range of the target area is extended by one cell to obtain the extension area; for example, in fig. 10, each side of the target frame is extended by the distance of one cell in the direction away from the frame, finally yielding an extension area of 5 × 5 cell units, and the frame formed by all the cell units of the extension area is the extension frame.
After the first evidence picture (fig. 1(a)), the second evidence picture (fig. 1(b)) and the third evidence picture (fig. 1(c)) are processed through the above steps, one cell divided in step S100 is used as the step length, and the subsequent matching process is performed on different cell regions in a sliding-window manner, so as to detect whether the target frames of the three pictures track the same motor vehicle.
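The sliding-window enumeration over cell units can be sketched as follows, assuming a 5 × 5-cell extension region and a 3 × 3-cell target frame as in fig. 10:

```python
def sliding_cell_windows(n_ext=5, n_target=3):
    """Enumerate the top-left cell offsets of every n_target x n_target
    window inside an n_ext x n_ext extension region, stepping one cell
    at a time (the step length of step S100)."""
    return [(row, col)
            for row in range(n_ext - n_target + 1)
            for col in range(n_ext - n_target + 1)]
```

For the 5 × 5 / 3 × 3 case this yields nine candidate windows, each of which would be matched against the template feature matrix.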
Referring to fig. 11, fig. 11 shows a schematic structural diagram of an image processing apparatus 20 according to an embodiment of the present invention, in which the image processing apparatus 20 includes an HSV component calculating module 300, a feature value generating module 400, an original feature matrix generating module 700, and a target feature matrix generating module 800.
The HSV component calculating module 300 is configured to calculate HSV components corresponding to each pixel point in the target image.
The characteristic value generating module 400 is configured to obtain a luminance characteristic value and a chrominance characteristic value corresponding to each pixel point according to the HSV component corresponding to each pixel point.
The original feature matrix generating module 700 is configured to generate an original feature distribution matrix corresponding to the target image according to the luminance feature values and the chrominance feature values of all the pixel points. The original feature matrix generating module 700 generates this matrix by arranging each of the pixel points in the original characteristic distribution matrix according to its respective luminance characteristic value and chrominance characteristic value.
The target feature matrix generation module 800 is configured to process the original feature distribution matrix according to a preset template matrix and a preset normalization scale to obtain a target feature distribution matrix.
Specifically, referring to fig. 12, fig. 12 is a schematic structural diagram of a target feature matrix generating module 800 of an image processing apparatus 20 according to an embodiment of the present invention, in which the target feature matrix generating module 800 includes a convolution calculating unit 810 and a normalization scale calculating unit 820.
The convolution calculating unit 810 is configured to convolve the original feature distribution matrix with a preset template matrix to generate an intermediate feature distribution matrix.
The normalization scale calculation unit 820 is configured to process the intermediate feature distribution matrix according to a preset normalization scale to obtain a target feature distribution matrix.
As an embodiment, referring to fig. 11, in an embodiment of the present invention, the image processing apparatus 20 further includes a data normalization processing module 500 and a feature value updating module 600.
The data normalization processing module 500 is configured to perform data normalization processing on the luminance characteristic value and the chrominance characteristic value corresponding to each pixel point, respectively, to obtain a processed luminance characteristic value and a processed chrominance characteristic value.
Specifically, referring to fig. 13, fig. 13 shows a schematic structural diagram of a data normalization processing module 500 of an image processing apparatus 20 according to an embodiment of the present invention, where in the embodiment of the present invention, the data normalization processing module 500 includes a luminance standard processing unit 510 and a chrominance standard processing unit 520.
The brightness standard processing unit 510 is configured to process the brightness characteristic value corresponding to each pixel point according to a preset brightness characteristic step length, so as to obtain the processed brightness characteristic value.
The chromaticity standard processing unit 520 is configured to process the chromaticity characteristic value corresponding to each pixel point according to a preset chromaticity characteristic step length, a preset saturation characteristic threshold, and a saturation characteristic value corresponding to each pixel point, so as to obtain the processed chromaticity characteristic value.
Specifically, referring to fig. 14, fig. 14 shows a schematic structure diagram of a chrominance standard processing unit 520 of an image processing apparatus 20 according to an embodiment of the present invention, in which the chrominance standard processing unit 520 includes a determining subunit 521, a first chrominance processing subunit 522, and a second chrominance processing subunit 523.
The determining subunit 521 is configured to determine whether the saturation characteristic value corresponding to the target pixel point is smaller than a preset saturation characteristic value.
The first chrominance processing subunit 522 is configured to, when the determining subunit 521 determines that the saturation characteristic value corresponding to the target pixel point is greater than or equal to the preset saturation characteristic threshold, process the chrominance characteristic value corresponding to the target pixel point according to a preset chrominance characteristic step length, to obtain a processed chrominance characteristic value.
The second chrominance processing subunit 523 is configured to, when the determining subunit 521 determines that the saturation characteristic value corresponding to the target pixel point is smaller than the preset saturation characteristic value, use the preset chrominance characteristic value as the processed chrominance characteristic value.
Referring to fig. 11, the eigenvalue updating module 600 is configured to update the luminance eigenvalue and the chrominance eigenvalue of each pixel point by using the processed luminance eigenvalue and the processed chrominance eigenvalue.
As an embodiment, with continuing reference to fig. 11, in an embodiment of the present invention, the image processing apparatus 20 further includes an image dividing module 200, where the image dividing module 200 is configured to divide the original image into a plurality of sub-images.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative and, for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, each functional module in the embodiments of the present invention may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiment of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In summary, according to the image processing method, the image processing device, the electronic eye and the storage medium provided by the embodiments of the present invention, a target feature distribution matrix corresponding to a target image is generated by combining luminance feature values and chrominance feature values of all pixels in the target image, so that the electronic eye 10 checks whether a tracking error occurs according to the target feature distribution matrix.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (9)

1. An image processing method, characterized in that the method comprises:
calculating HSV components corresponding to all pixel points in the target image;
obtaining a brightness characteristic value and a chromaticity characteristic value corresponding to each pixel point according to the HSV component corresponding to each pixel point;
generating an original characteristic distribution matrix corresponding to the target image according to the brightness characteristic values and the chrominance characteristic values of all the pixel points, wherein the original characteristic distribution matrix represents a two-dimensional matrix formed by the brightness characteristic values and the chrominance characteristic values;
processing the original characteristic distribution matrix by a preset template matrix and a preset normalization scale to obtain a target characteristic distribution matrix;
the step of processing the original characteristic distribution matrix by a preset template matrix and a preset normalization scale to obtain a target characteristic distribution matrix comprises the following steps:
convolution processing the original characteristic distribution matrix by a preset template matrix to generate an intermediate characteristic distribution matrix;
and processing the intermediate characteristic distribution matrix according to a preset normalization scale to obtain a target characteristic distribution matrix.
2. The method of claim 1, wherein the step of generating the original feature distribution matrix corresponding to the target image according to the luminance feature value and the chrominance feature value of all the pixel points comprises:
and generating an original characteristic distribution matrix corresponding to the target image for each pixel point in all the pixel points according to the respective brightness characteristic value and the chrominance characteristic value.
3. The method according to claim 1, wherein after the step of obtaining the corresponding luminance characteristic value and the corresponding chrominance characteristic value of each pixel point according to the corresponding HSV component of each pixel point, the method further comprises:
respectively carrying out data standardization processing on the brightness characteristic value and the chromaticity characteristic value corresponding to each pixel point to obtain a processed brightness characteristic value and a processed chromaticity characteristic value;
and updating the respective brightness characteristic value and the chrominance characteristic value of each pixel point by using the processed brightness characteristic value and the processed chrominance characteristic value.
4. The method of claim 3, wherein the step of respectively performing data normalization processing on the luminance eigenvalue and the chrominance eigenvalue corresponding to each pixel point to obtain a processed luminance eigenvalue and a processed chrominance eigenvalue comprises:
processing the brightness characteristic value corresponding to each pixel point according to a preset brightness characteristic step length to obtain the processed brightness characteristic value;
and processing the chrominance characteristic value corresponding to each pixel point according to a preset chrominance characteristic step length, a preset saturation characteristic threshold value and the saturation characteristic value corresponding to each pixel point to obtain the processed chrominance characteristic value.
5. The method according to claim 4, wherein the step of processing the chrominance characteristic value corresponding to each pixel point according to a preset chrominance characteristic step size, a preset saturation characteristic threshold and the saturation characteristic value corresponding to each pixel point to obtain the processed chrominance characteristic value comprises:
when the saturation characteristic value corresponding to the target pixel point is larger than or equal to a preset saturation characteristic threshold value, processing the chrominance characteristic value corresponding to the target pixel point by a preset chrominance characteristic step length to obtain the processed chrominance characteristic value;
and when the saturation characteristic value corresponding to the target pixel point is smaller than the preset saturation characteristic value, taking the preset chroma characteristic value as the processed chroma characteristic value.
6. The method of claim 1, wherein prior to the step of calculating HSV components corresponding to a target region in a target image, the method further comprises:
the original image is divided into a plurality of sub-images.
7. An image processing apparatus, characterized in that the apparatus comprises:
the HSV component calculation module is used for calculating HSV components corresponding to all pixel points in the target image;
the characteristic value generation module is used for obtaining a brightness characteristic value and a chroma characteristic value which correspond to each pixel point according to the HSV component which corresponds to each pixel point;
the original characteristic matrix generating module is used for generating an original characteristic distribution matrix corresponding to the target image according to the brightness characteristic values and the chrominance characteristic values of all the pixel points;
the target characteristic matrix generation module is used for processing the original characteristic distribution matrix according to a preset template matrix and a preset normalization scale to obtain a target characteristic distribution matrix;
the target characteristic matrix generation module comprises a convolution calculation unit and a normalization scale calculation unit;
the convolution calculation unit is used for carrying out convolution processing on the original characteristic distribution matrix by using a preset template matrix to generate an intermediate characteristic distribution matrix;
and the normalization scale calculation unit is used for processing the intermediate characteristic distribution matrix according to a preset normalization scale to obtain a target characteristic distribution matrix.
8. An electronic eye, comprising:
a memory for storing one or more programs;
a processor;
the one or more programs, when executed by the processor, implement the method of any of claims 1-6.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-6.
CN201810961993.7A 2018-08-22 2018-08-22 Image processing method, image processing device, electronic eye and storage medium Active CN110858281B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810961993.7A CN110858281B (en) 2018-08-22 2018-08-22 Image processing method, image processing device, electronic eye and storage medium

Publications (2)

Publication Number Publication Date
CN110858281A CN110858281A (en) 2020-03-03
CN110858281B true CN110858281B (en) 2022-10-04

Family

ID=69634969

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810961993.7A Active CN110858281B (en) 2018-08-22 2018-08-22 Image processing method, image processing device, electronic eye and storage medium

Country Status (1)

Country Link
CN (1) CN110858281B (en)

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101080025A (en) * 2007-06-18 2007-11-28 东莞黄江达裕科技电子厂 A white balance adjustment method and system
CN101345820A (en) * 2008-08-01 2009-01-14 中兴通讯股份有限公司 Image brightness reinforcing method
CN101399919A (en) * 2007-09-25 2009-04-01 展讯通信(上海)有限公司 Method for automatic exposure and automatic gain regulation and method thereof
CN101511033A (en) * 2008-12-18 2009-08-19 昆山锐芯微电子有限公司 Image processing process for CMOS image sensor
CN101814179A (en) * 2009-02-19 2010-08-25 富士通株式会社 Image enhancement method and image enhancement device
CN101854536A (en) * 2009-04-01 2010-10-06 深圳市融创天下科技发展有限公司 Method for improving image visual effect for video encoding and decoding
CN102298781A (en) * 2011-08-16 2011-12-28 长沙中意电子科技有限公司 Motion shadow detection method based on color and gradient characteristics
CN102780889A (en) * 2011-05-13 2012-11-14 中兴通讯股份有限公司 Video image processing method, device and equipment
CN103077537A (en) * 2013-01-15 2013-05-01 北京工业大学 Novel L1 regularization-based real-time moving target tracking method
CN103971347A (en) * 2014-06-04 2014-08-06 深圳市赛为智能股份有限公司 Method and device for treating shadow in video image
CN104144343A (en) * 2014-07-11 2014-11-12 东北大学 Digital image compressing, encrypting and encoding combined method
CN105184824A (en) * 2015-09-30 2015-12-23 重庆师范大学 Intelligent agricultural bird repelling system and method based on image sensing network
CN106131412A (en) * 2016-07-12 2016-11-16 浙江宇视科技有限公司 A kind of image processing method, system and electronic equipment
CN106604024A (en) * 2016-12-14 2017-04-26 北京集创北方科技股份有限公司 Image data processing method and apparatus
CN107071417A (en) * 2017-04-10 2017-08-18 电子科技大学 A kind of intra-frame prediction method for Video coding
CN107256543A (en) * 2017-06-21 2017-10-17 深圳市万普拉斯科技有限公司 Image processing method, device, electronic equipment and storage medium
CN107871303A (en) * 2016-09-26 2018-04-03 北京金山云网络技术有限公司 A kind of image processing method and device
CN108288253A (en) * 2018-01-08 2018-07-17 厦门美图之家科技有限公司 HDR image generation method and device

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101080025A (en) * 2007-06-18 2007-11-28 东莞黄江达裕科技电子厂 A white balance adjustment method and system
CN101399919A (en) * 2007-09-25 2009-04-01 展讯通信(上海)有限公司 Method for automatic exposure and automatic gain regulation and method thereof
CN101345820A (en) * 2008-08-01 2009-01-14 中兴通讯股份有限公司 Image brightness reinforcing method
CN101511033A (en) * 2008-12-18 2009-08-19 昆山锐芯微电子有限公司 Image processing process for CMOS image sensor
CN101814179A (en) * 2009-02-19 2010-08-25 富士通株式会社 Image enhancement method and image enhancement device
CN101854536A (en) * 2009-04-01 2010-10-06 深圳市融创天下科技发展有限公司 Method for improving image visual effect for video encoding and decoding
CN102780889A (en) * 2011-05-13 2012-11-14 中兴通讯股份有限公司 Video image processing method, device and equipment
CN102298781A (en) * 2011-08-16 2011-12-28 长沙中意电子科技有限公司 Motion shadow detection method based on color and gradient characteristics
CN103077537A (en) * 2013-01-15 2013-05-01 北京工业大学 Novel L1 regularization-based real-time moving target tracking method
CN103971347A (en) * 2014-06-04 2014-08-06 深圳市赛为智能股份有限公司 Method and device for treating shadow in video image
CN104144343A (en) * 2014-07-11 2014-11-12 东北大学 Digital image compressing, encrypting and encoding combined method
CN105184824A (en) * 2015-09-30 2015-12-23 重庆师范大学 Intelligent agricultural bird repelling system and method based on image sensing network
CN106131412A (en) * 2016-07-12 2016-11-16 浙江宇视科技有限公司 A kind of image processing method, system and electronic equipment
CN107871303A (en) * 2016-09-26 2018-04-03 北京金山云网络技术有限公司 A kind of image processing method and device
CN106604024A (en) * 2016-12-14 2017-04-26 北京集创北方科技股份有限公司 Image data processing method and device
CN107071417A (en) * 2017-04-10 2017-08-18 电子科技大学 Intra-frame prediction method for video coding
CN107256543A (en) * 2017-06-21 2017-10-17 深圳市万普拉斯科技有限公司 Image processing method, device, electronic equipment and storage medium
CN108288253A (en) * 2018-01-08 2018-07-17 厦门美图之家科技有限公司 HDR image generation method and device

Also Published As

Publication number Publication date
CN110858281A (en) 2020-03-03

Similar Documents

Publication Publication Date Title
US7664315B2 (en) Integrated image processor
Wu et al. Lane-mark extraction for automobiles under complex conditions
Rezaei et al. Robust vehicle detection and distance estimation under challenging lighting conditions
Son et al. Real-time illumination invariant lane detection for lane departure warning system
US20210117704A1 (en) Obstacle detection method, intelligent driving control method, electronic device, and non-transitory computer-readable storage medium
JP4416039B2 (en) Striped pattern detection system, striped pattern detection method, and striped pattern detection program
US8599257B2 (en) Vehicle detection device, vehicle detection method, and vehicle detection program
Kim et al. Rear obstacle detection system with fisheye stereo camera using HCT
WO2023082784A1 (en) Person re-identification method and apparatus based on local feature attention
WO2020258077A1 (en) Pedestrian detection method and device
Chang et al. An efficient method for lane-mark extraction in complex conditions
CN112597846A (en) Lane line detection method, lane line detection device, computer device, and storage medium
KR20220049864A (en) Method of recognizing license number of vehicle based on angle of recognized license plate
WO2020238073A1 (en) Method for determining orientation of target object, intelligent driving control method and apparatus, and device
US8681221B2 (en) Vehicular image processing device and vehicular image processing program
CN110858281B (en) Image processing method, image processing device, electronic eye and storage medium
CN109785367B (en) Method and device for filtering foreign points in three-dimensional model tracking
WO2019085929A1 (en) Image processing method, device for same, and method for safe driving
CN114898306B (en) Method and device for detecting target orientation and electronic equipment
CN110633705A (en) Low-illumination imaging license plate recognition method and device
CN108259819B (en) Dynamic image feature enhancement method and system
CN112446230B (en) Lane line image recognition method and device
CN111461128A (en) License plate recognition method and device
US20240037976A1 (en) Information processing device, information processing method, and computer-readable recording medium
CN114119609B (en) Method, device and equipment for detecting image stain concentration and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant