CN105404856A - Public traffic vehicle seat occupied state detection method - Google Patents

Publication number
CN105404856A
Authority
CN
China
Prior art keywords
frame
value
region
seat occupancy
difference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510732676.4A
Other languages
Chinese (zh)
Other versions
CN105404856B (en)
Inventor
肖梅
黄颖
张雷
颜建强
张慧铭
王杏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changan University
Original Assignee
Changan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changan University
Priority to CN201510732676.4A
Publication of CN105404856A
Application granted
Publication of CN105404856B
Legal status: Expired - Fee Related

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 — Scenes; Scene-specific elements
    • G06V20/50 — Context or environment of the image
    • G06V20/59 — Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/593 — Recognising seat occupancy

Abstract

Disclosed is a method for detecting the seat occupancy state in a public transit vehicle. A monitored seat region is selected first; video image frames are acquired while the vehicle travels between stops, based on the bus GPS data; abnormally illuminated areas of the seat region are removed using the gray-scale image; blue filtering is performed in HSV color space and a chroma filter coefficient is computed. For the initial frame, a vacant seat is judged to exist if the chroma filter coefficient exceeds a high threshold, and not to exist otherwise. For a non-initial frame, a vacant seat is judged to exist if the coefficient exceeds the high threshold, and not to exist if it falls below a low threshold; if the coefficient lies between the two thresholds, the difference overlap region between the current and previous frames is extracted and a difference overlap coefficient is computed. If that coefficient exceeds a difference threshold, a vacant seat is judged to exist; otherwise a similar overlap region is extracted and a similar-region overlap coefficient is computed, and a vacant seat is judged to exist only if that coefficient exceeds a similarity threshold. The method has the advantage of high detection accuracy.

Description

Method for detecting the seat occupancy state of a public transit vehicle
Technical field:
The present invention relates to an extended application of bus camera surveillance systems, and in particular to a method for detecting the seat occupancy state of a public transit vehicle based on image processing techniques.
Background technology:
Urban public transport, with its large carrying capacity, safety and high operational efficiency, is an important component of urban transportation and is regarded as the best approach to relieving urban traffic congestion. Punctual and comfortable public transit vehicles increase the attractiveness of public transport to passengers. Whether vacant seats are available on a bus is one of the important indices for evaluating transit comfort.
At present, vacant seats on a bus are mainly detected by comparing the number of passengers on board with the number of seats in the vehicle. The main methods for automatically counting bus passengers are: passenger-flow statistics based on pressure pedals, infrared detector counting, and fare-card counting. The pressure-pedal and infrared methods install pressure-pedal sensors or infrared detectors at the entrance and exit doors to measure boarding and alighting passengers in real time and thereby count the ridership. When passengers crowd through the doors, the count is prone to error, and the sensors are expensive and easily damaged. Fare-card counting obtains passenger-flow data from passengers swiping their cards when boarding at the front door and alighting at the rear door; the method is simple and feasible, but cannot count passengers who board without a card and pay in cash. All three methods must combine the counted passenger flow with the vehicle's actual seating capacity to judge seat vacancy, and the judgment inherits any error in the passenger count. There is also direct human monitoring, in which staff watch the bus video feed in real time to judge ridership and seat occupancy; this method is inefficient, labor-intensive and unsuitable for large-scale deployment.
Summary of the invention:
The object of the present invention is to provide a method for detecting the seat occupancy state of a public transit vehicle.
To achieve the above object, the present invention adopts the following technical scheme:
A method for detecting the seat occupancy state of a public transit vehicle comprises the following steps:
Step 0: select the seat region Ba to be monitored. The selected seat region Ba is a binary map: the region formed by all pixels with Ba(x, y) = 1 represents the seat region, and the region formed by all pixels with Ba(x, y) = 0 represents the non-seat region;
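As a concrete illustration of Step 0, the manually delimited seat region can be stored as a binary map in a NumPy array. This is a sketch only: the rectangle coordinates are invented for illustration, while the frame size (576 × 704) follows the embodiment.

```python
import numpy as np

# Hypothetical sketch of Step 0: a binary seat map Ba
# (1 = seat region, 0 = non-seat region).
M, N = 576, 704                  # frame rows and columns, as in the embodiment
Ba = np.zeros((M, N), dtype=np.uint8)
Ba[200:400, 100:300] = 1         # assumed front-row seat rectangle (illustrative)
seat_pixels = int(Ba.sum())      # area of the seat region, in pixels
```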
Step 1: vehicle arrival detection
Read the GPS position data of the vehicle. If the current GPS position coincides with the position of a bus stop, reset the frame counter, i.e. k = 0, and return to step 1; otherwise proceed to step 2;
Step 2: read and store the video frames captured between stops
The sampling frequency is one frame every 5 s. A captured frame is denoted E_k, where k is the index of the current frame; each time a frame is read, the counter is incremented, i.e. k = k + 1. The image size is M × N, where M and N are the numbers of rows and columns of E_k, respectively;
Step 3: gray-scale conversion
Convert the frame E_k obtained in step 2 to a gray-scale image f_k;
Step 4: remove abnormally illuminated areas
Within the seat region Ba selected in step 0, remove from f_k the areas of abnormal illumination, i.e. areas that are too bright or too dark, yielding the seat region to be measured Se_k;
Step 5: convert from RGB to HSV color space
Convert the RGB image E_k captured in step 2 to HSV color space, yielding the HSV image Be_k;
Step 6: color-filter the HSV image Be_k
Within the region Se_k obtained in step 4, apply color filtering to the HSV image Be_k obtained in step 5, yielding the vacant-seat region Ca_k;
Step 7: compute the chroma filter coefficient a
Step 8: initial-frame test
Judge whether the current frame is the initial frame, i.e. whether k equals 1; if k = 1, proceed to step 9; if k ≠ 1, proceed to step 10;
Step 9: compute the seat occupancy flag of the initial frame, then proceed to step 21
Let fg_k denote the seat occupancy flag of the k-th (k = 1) frame, computed as

fg_k = 0 if a > Ta_1, otherwise fg_k = 1 (8)

where Ta_1 is the high threshold, with a value between 0 and 1;
Step 10: compute the seat occupancy flag of a non-initial frame
The seat occupancy flag of a non-initial frame is computed as

fg_k = 0 if a > Ta_1; fg_k = 1 if a < Ta_2; fg_k = 2 otherwise (9)

where fg_k = 2 means the occupancy state of frame k is still unknown and requires further analysis, and Ta_2 is the low threshold, with a value between 0 and 1 satisfying Ta_2 ≤ Ta_1;
Step 11: occupancy-state decision
If fg_k ≠ 2, i.e. fg_k = 0 or fg_k = 1, proceed to step 21; if fg_k = 2, proceed to step 12;
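Steps 8 through 11 reduce to a small threshold cascade on the chroma filter coefficient. The sketch below assumes the embodiment's threshold values (Ta_1 = 0.85, Ta_2 = 0.55); the flag semantics follow step 21 (0 = vacant seat, 1 = no vacant seat, 2 = undecided).

```python
def occupancy_flag(a, k, Ta1=0.85, Ta2=0.55):
    """Sketch of Steps 8-11: map the chroma filter coefficient `a`
    to the occupancy flag fg_k (0 = vacant, 1 = occupied, 2 = undecided)."""
    if k == 1:                  # initial frame: binary decision (formula 8)
        return 0 if a > Ta1 else 1
    if a > Ta1:                 # non-initial frame (formula 9)
        return 0
    if a < Ta2:
        return 1
    return 2                    # between thresholds: needs frame differencing

print(occupancy_flag(0.567, 12))
```

With the embodiment's a = 0.567 and k = 12, the coefficient falls between the two thresholds and the flag is 2, triggering the frame-differencing steps.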
Step 12: compute the difference values between the current and previous frames
Compute the hue difference, saturation difference, brightness difference and fused difference between the current frame and the previous frame;
Step 13: detect the difference region and the similar region between the current and previous frames
For any pixel of frame k: if a difference value between the current and previous frames exceeds the corresponding threshold, the pixel is a difference point, and the region formed by all difference points is the difference region; if the difference values fall below the thresholds, the pixel is a similar point, and the region formed by all similar points is the similar region;
Within the region Se_k, threshold the difference values to obtain the segmentation map Mf_k. In Mf_k, the region formed by pixels of value 1 is the difference region, denoted MD_k; the region formed by pixels of value 0 is the similar region, denoted MI_k;
Step 14: extract the difference overlap region
Extract the overlap of the difference region MD_k with the vacant-seat region Ca_k, yielding the vacant-seat difference map HD_k:

HD_k(x, y) = 1 if MD_k(x, y) = 1 and Ca_k(x, y) = 1, otherwise 0 (15)

Step 15: compute the difference-region overlap coefficient af
The difference-region overlap coefficient af is computed as

af = S_{HD_k} / S_{MD_k} (16)

where S_{HD_k} is the area of the vacant-seat difference map HD_k obtained in step 14, i.e. the number of pixels of value 1 in that map, and S_{MD_k} is the area of the difference region MD_k obtained in step 13, i.e. the number of pixels of value 1 in that region;
Step 16: compute the seat occupancy flag
The seat occupancy flag fg_k is computed as

fg_k = 0 if af > Ta_3, otherwise fg_k = 2 (17)

where Ta_3 is the difference threshold, with a value between 0 and 1;
Step 17: occupancy-state decision
If fg_k = 0, proceed to step 21; if fg_k = 2, proceed to step 18;
Step 18: extract the similar overlap region
Extract the overlap of the similar region MI_k with the previous frame's vacant-seat region Ca_{k-1}, yielding the vacant-seat similarity map Ls_k:

Ls_k(x, y) = 1 if MI_k(x, y) = 1 and Ca_{k-1}(x, y) = 1, otherwise 0 (18)

Step 19: compute the similar-region overlap coefficient as
The similar-region overlap coefficient as is computed as

as = S_{Ls_k} / S_{Ca_k} (19)

where S_{Ls_k} is the area of the vacant-seat similarity map Ls_k obtained in step 18, i.e. the number of pixels of value 1 in that map, and S_{Ca_k} is the area of the vacant-seat region Ca_k obtained in step 6, i.e. the number of pixels of value 1 in that region;
Step 20: compute the seat occupancy flag fg_k
The seat occupancy flag fg_k is computed as

fg_k = 0 if as > Ta_4, otherwise fg_k = 1 (20)

where Ta_4 is the similarity threshold, with a value between 0 and 1;
Step 21: occupancy-state decision
If fg_k = 0, the seat is judged unoccupied, i.e. a vacant seat exists; if fg_k = 1, the seat is judged occupied, i.e. no vacant seat exists.
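Steps 14 through 20, which resolve the undecided case fg_k = 2, amount to two overlap ratios compared against their thresholds. The helper below is a minimal reading of that cascade on toy binary maps, assuming the embodiment's Ta_3 = 0.7 and Ta_4 = 0.56; the guards against empty regions are an added assumption the patent does not specify.

```python
import numpy as np

def resolve_uncertain(MD, MI, Ca, Ca_prev, Ta3=0.7, Ta4=0.56):
    """Sketch of Steps 14-20: 0 = vacant seat, 1 = occupied."""
    af = (MD & Ca).sum() / max(int(MD.sum()), 1)        # overlap ratio, formula (16)
    if af > Ta3:                                        # difference region overlaps vacancy
        return 0
    as_ = (MI & Ca_prev).sum() / max(int(Ca.sum()), 1)  # overlap ratio, formula (19)
    return 0 if as_ > Ta4 else 1

# Toy 3-pixel maps: the similar region overlaps the previous vacancy heavily.
MD = np.array([0, 0, 1], dtype=np.uint8)       # difference region MD_k
MI = np.array([1, 1, 0], dtype=np.uint8)       # similar region MI_k
Ca = np.array([1, 1, 0], dtype=np.uint8)       # current vacant-seat region Ca_k
Ca_prev = np.array([1, 1, 0], dtype=np.uint8)  # previous vacant-seat region
flag = resolve_uncertain(MD, MI, Ca, Ca_prev)
```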
A further improvement of the present invention is that in step 3 the frame E_k obtained in step 2 is converted to gray scale by

f_k(x, y) = 0.3 × R_k(x, y) + 0.59 × G_k(x, y) + 0.11 × B_k(x, y) (1)

where f_k(x, y) is the gray value of pixel (x, y) of the gray-scale image f_k, and R_k(x, y), G_k(x, y) and B_k(x, y) are the red, green and blue component values of pixel (x, y) of frame E_k, respectively.
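Formula (1) is a standard weighted gray-scale conversion; a minimal vectorized sketch:

```python
import numpy as np

def to_gray(E):
    """Formula (1): weighted sum of the R, G, B channels of frame E_k."""
    R = E[..., 0].astype(float)
    G = E[..., 1].astype(float)
    B = E[..., 2].astype(float)
    return 0.3 * R + 0.59 * G + 0.11 * B

E = np.full((2, 2, 3), 255, dtype=np.uint8)  # tiny all-white test frame
f = to_gray(E)                               # white maps to gray value 255
```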
A further improvement of the present invention is that in step 4 the abnormally illuminated areas of f_k are removed within the seat region Ba obtained in step 0 by

Se_k(x, y) = 1 if Ba(x, y) = 1 and δ_1 < f_k(x, y) < δ_2, otherwise 0 (2)

where δ_1 is the dark-area threshold, with a value between 0 and 50, and δ_2 is the bright-area threshold, with a value between 230 and 256.
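A minimal sketch of this illumination mask on a toy gray image, assuming the embodiment's δ_1 = 30 and δ_2 = 250 and strict inequalities:

```python
import numpy as np

def remove_abnormal(f, Ba, d1=30, d2=250):
    """Step 4 sketch: inside the seat mask Ba, keep only pixels whose gray
    value lies strictly between the two illumination thresholds."""
    return ((Ba == 1) & (f > d1) & (f < d2)).astype(np.uint8)

f = np.array([[10, 100], [240, 255]], dtype=np.uint8)  # too dark / ok / ok / too bright
Ba = np.ones((2, 2), dtype=np.uint8)
Se = remove_abnormal(f, Ba)
```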
A further improvement of the present invention is that in step 5 the frame E_k captured in step 2 is converted to HSV color space by

H_k(x, y) = arccos( ½[(R_k(x, y) − G_k(x, y)) + (R_k(x, y) − B_k(x, y))] / √[(R_k(x, y) − G_k(x, y))² + (R_k(x, y) − B_k(x, y))(G_k(x, y) − B_k(x, y))] ) (3)

S_k(x, y) = [max(R_k(x, y), G_k(x, y), B_k(x, y)) − min(R_k(x, y), G_k(x, y), B_k(x, y))] / max(R_k(x, y), G_k(x, y), B_k(x, y)) (4)

V_k(x, y) = max(R_k(x, y), G_k(x, y), B_k(x, y)) / 255 (5)

where H_k(x, y), S_k(x, y) and V_k(x, y) are the hue, saturation and brightness values of pixel (x, y), respectively.
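Formulas (3)-(5) can be checked on a single pixel. The guards against division by zero below are an added assumption, since the patent text does not spell them out:

```python
import numpy as np

def rgb_to_hsv_pixel(R, G, B):
    """Formulas (3)-(5) for one pixel; zero-denominator guards are assumed."""
    mx, mn = max(R, G, B), min(R, G, B)
    den = np.sqrt((R - G) ** 2 + (R - B) * (G - B))
    H = float(np.arccos(0.5 * ((R - G) + (R - B)) / den)) if den > 0 else 0.0
    S = (mx - mn) / mx if mx > 0 else 0.0
    V = mx / 255.0
    return H, S, V

H, S, V = rgb_to_hsv_pixel(200, 100, 50)  # an orange-ish pixel
```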
A further improvement of the present invention is that in step 6 the color filtering of the HSV image Be_k within the region Se_k obtained in step 4 is performed by

Ca_k(x, y) = 1 if Se_k(x, y) = 1 and h_l < H_k(x, y) < h_h and s_l < S_k(x, y) < s_h and v_l < V_k(x, y) < v_h, otherwise 0 (6)

where h_l and h_h are the low and high hue thresholds, s_l and s_h the low and high saturation thresholds, and v_l and v_h the low and high brightness thresholds; each ranges over 0-1 and they satisfy h_l < h_h, s_l < s_h and v_l < v_h.
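A minimal sketch of the blue filter of formula (6) on toy arrays, using the embodiment's threshold values:

```python
import numpy as np

def blue_filter(H, S, V, Se,
                hl=0.55, hh=0.7, sl=0.2, sh=0.8, vl=0.3, vh=0.9):
    """Formula (6) sketch: within Se_k, keep pixels whose H, S, V all fall
    inside the blue band; the result is the vacant-seat region Ca_k."""
    keep = ((Se == 1) & (H > hl) & (H < hh)
            & (S > sl) & (S < sh) & (V > vl) & (V < vh))
    return keep.astype(np.uint8)

H = np.array([[0.6, 0.1]])   # first pixel is in the blue hue band, second is not
S = np.array([[0.5, 0.5]])
V = np.array([[0.5, 0.5]])
Se = np.ones((1, 2), dtype=np.uint8)
Ca = blue_filter(H, S, V, Se)
```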
A further improvement of the present invention is that in step 7 the chroma filter coefficient a is computed as

a = S_{Ca_k} / S_{Se_k} (7)

where S_{Ca_k} is the area of the vacant-seat region Ca_k obtained in step 6, i.e. the number of pixels of value 1 in that region, and S_{Se_k} is the area of the region Se_k obtained in step 4, i.e. the number of pixels of value 1 in that region.
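Formula (7) is a simple area ratio over binary maps:

```python
import numpy as np

def chroma_coefficient(Ca, Se):
    """Formula (7): fraction of the measurable seat region Se_k
    that passed the blue filter (Ca_k)."""
    return Ca.sum() / Se.sum()

Se = np.ones((4, 4), dtype=np.uint8)   # toy measurable region, 16 pixels
Ca = np.zeros((4, 4), dtype=np.uint8)
Ca[:2, :] = 1                          # half the region kept by the filter
a = chroma_coefficient(Ca, Se)
```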
A further improvement of the present invention is that in step 12 the hue, saturation, brightness and fused differences between the current and previous frames are computed as

Ch_k(x, y) = |H_k(x, y) − H_{k-1}(x, y)| (10)
Cs_k(x, y) = |S_k(x, y) − S_{k-1}(x, y)| (11)
Cv_k(x, y) = |V_k(x, y) − V_{k-1}(x, y)| (12)
Cz_k(x, y) = Ch_k(x, y) + Cs_k(x, y) + Cv_k(x, y) (13)

where Ch_k, Cs_k, Cv_k and Cz_k are the hue, saturation, brightness and fused differences between frame k and the previous frame, respectively.
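Formulas (10)-(13) on toy one-pixel arrays:

```python
import numpy as np

def frame_differences(Hk, Sk, Vk, Hp, Sp, Vp):
    """Formulas (10)-(13): absolute per-pixel differences between
    consecutive HSV frames, plus their sum as the fused difference."""
    Ch = np.abs(Hk - Hp)
    Cs = np.abs(Sk - Sp)
    Cv = np.abs(Vk - Vp)
    return Ch, Cs, Cv, Ch + Cs + Cv

Hk, Sk, Vk = np.array([0.6]), np.array([0.4]), np.array([0.8])  # current frame
Hp, Sp, Vp = np.array([0.5]), np.array([0.6]), np.array([0.7])  # previous frame
Ch, Cs, Cv, Cz = frame_differences(Hk, Sk, Vk, Hp, Sp, Vp)
```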
A further improvement of the present invention is that in step 13 the segmentation map is computed as

Mf_k(x, y) = 1 if Se_k(x, y) = 1 and (Ch_k(x, y) > C_1 or Cs_k(x, y) > C_2 or Cv_k(x, y) > C_3 or Cz_k(x, y) > C_4), otherwise 0 (14)

where C_1, C_2, C_3 and C_4 are segmentation thresholds; C_1, C_2 and C_3 take values in 0.15-0.3 and C_4 takes values in 0.5-0.7.
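The patent text does not state how the four thresholds combine. The sketch below assumes a pixel becomes a difference point when ANY of the four differences exceeds its threshold, which is one plausible reading of formula (14); threshold values follow the embodiment.

```python
import numpy as np

def segment(Ch, Cs, Cv, Cz, Se, C1=0.25, C2=0.25, C3=0.25, C4=0.6):
    """Assumed formula (14): within Se_k, a pixel is a difference point
    (Mf_k = 1) if any difference value exceeds its threshold; value 0
    marks the similar region."""
    Mf = (Se == 1) & ((Ch > C1) | (Cs > C2) | (Cv > C3) | (Cz > C4))
    return Mf.astype(np.uint8)

Se = np.ones(3, dtype=np.uint8)
Mf = segment(np.array([0.3, 0.1, 0.1]),   # hue difference trips pixel 0
             np.array([0.1, 0.1, 0.1]),
             np.array([0.1, 0.1, 0.1]),
             np.array([0.5, 0.7, 0.3]),   # fused difference trips pixel 1
             Se)
```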
Compared with the prior art, the present invention acquires in-vehicle video data from the video equipment already installed on the bus, mines the video image information, and analyzes the seat occupancy of the public transit vehicle. The invention has the following practical significance. First, it improves the service level of public transport and the attractiveness of bus travel: seat occupancy is one of the important indices of riding comfort and crowding, and once the seats are occupied, further analysis of in-vehicle crowding can provide early-warning signals to the dispatching department and data support for vehicle scheduling. Second, it provides a basis for vehicle-type selection: since passenger volumes differ across bus routes, the detected seat occupancy data helps select vehicles suited to each route. The automatic seat-occupancy detection method proposed by the present invention is simple and economical, with high detection accuracy, fast running speed and good robustness.
Accompanying drawing illustrates:
Fig. 1 is seating area Ba schematic diagram.
Fig. 2 is picture frame E kschematic diagram.
Fig. 3 is gray level image f kschematic diagram.
Fig. 4 is seating area Se to be measured kschematic diagram.
Fig. 5 is room district Ca kschematic diagram.
Fig. 6 is segmentation figure Mf kschematic diagram.
Fig. 7 is room disparity map HD kschematic diagram.
Fig. 8 is room similar diagram Ls kschematic diagram.
Detailed description:
The present invention is described in further detail below with reference to the drawings and an embodiment.
To monitor different areas of the bus, current buses are generally fitted with three or more cameras at positions such as the front, middle and rear, capturing image data of the different areas. In this embodiment, the video from the front camera of the bus is selected, and the method of the present invention is used to monitor the occupancy of the front-row seats. The invention provides a method for detecting the seat occupancy state of a public transit vehicle; the concrete implementation steps are as follows.
Step S0: select the seat region Ba to be monitored.
Owing to differences in bus model and camera mounting position, the size and location of the seat region differ between the video images of different buses, so the seat region must be delimited per bus. The camera in this embodiment mainly monitors the front-row seats. The region to be monitored is selected first; common selection methods are automatic selection and manual selection. Automatic selection delimits the region from the position and color information of the seats. Manual selection has an operator delimit the region by visual judgment, after which the selected region is stored for later use. Manual selection is more precise than automatic selection and is the method adopted in this embodiment. The delimited seat region is denoted Ba and is a binary map: the region formed by all pixels with Ba(x, y) = 1 represents the seat region, and the region formed by all pixels with Ba(x, y) = 0 represents the non-seat region, as shown in Fig. 1.
Proceed to step S1.
Step S1: vehicle arrival detection.
Read the GPS position data of the vehicle. If the current GPS position coincides with the position of a bus stop, reset the frame counter, i.e. k = 0, and return to step S1; otherwise proceed to step S2.
Step S2: read and store the video frames captured between stops.
The sampling frequency is one frame every 5 s. A captured frame is denoted E_k, where k is the index of the current frame; each time a frame is read, the counter is incremented, i.e. k = k + 1. The captured image is in RGB format and of size M × N, where M and N are the numbers of rows and columns of E_k, i.e. M = 576, N = 704. In this embodiment the 12th frame is chosen for seat detection; the frame E_k is shown in Fig. 2, with k = 12.
Proceed to step S3.
Step S3: gray-scale conversion.
Convert the frame E_k obtained in step S2 to a gray-scale image f_k, as shown in Fig. 3, using formula (1):

f_k(x, y) = 0.3 × R_k(x, y) + 0.59 × G_k(x, y) + 0.11 × B_k(x, y) (1)

where f_k(x, y) is the gray value of pixel (x, y) of f_k, and R_k(x, y), G_k(x, y) and B_k(x, y) are the red, green and blue component values of pixel (x, y) of E_k, respectively.
Proceed to step S4.
Step S4: remove abnormally illuminated areas.
Because illumination inside the bus is uneven, some areas are too bright or too dark and must be removed before detection. Within the seat region Ba obtained in step S0, remove the abnormally illuminated areas, i.e. areas that are too bright or too dark, yielding the seat region to be measured Se_k, as shown in Fig. 4, using formula (2):

Se_k(x, y) = 1 if Ba(x, y) = 1 and δ_1 < f_k(x, y) < δ_2, otherwise 0 (2)

where δ_1 is the dark-area threshold, with a value between 0 and 50, and δ_2 is the bright-area threshold, with a value between 230 and 256. In this embodiment δ_1 = 30 and δ_2 = 250.
Proceed to step S5.
Step S5: convert from RGB to HSV color space.
HSV color space is convenient for color filtering, so the image E_k captured in step S2 is converted to HSV color space, yielding the HSV image Be_k:

H_k(x, y) = arccos( ½[(R_k(x, y) − G_k(x, y)) + (R_k(x, y) − B_k(x, y))] / √[(R_k(x, y) − G_k(x, y))² + (R_k(x, y) − B_k(x, y))(G_k(x, y) − B_k(x, y))] ) (3)

S_k(x, y) = [max(R_k(x, y), G_k(x, y), B_k(x, y)) − min(R_k(x, y), G_k(x, y), B_k(x, y))] / max(R_k(x, y), G_k(x, y), B_k(x, y)) (4)

V_k(x, y) = max(R_k(x, y), G_k(x, y), B_k(x, y)) / 255 (5)

where H_k(x, y), S_k(x, y) and V_k(x, y) are the hue, saturation and brightness values of pixel (x, y), respectively.
Proceed to step S6.
Step S6: color-filter the HSV image Be_k.
Within the region Se_k obtained in step S4, apply color filtering to the HSV image Be_k obtained in step S5, yielding the vacant-seat region Ca_k, as shown in Fig. 5. In this embodiment the bus seats are blue, so blue filtering is applied to Be_k within Se_k, using formula (6):

Ca_k(x, y) = 1 if Se_k(x, y) = 1 and h_l < H_k(x, y) < h_h and s_l < S_k(x, y) < s_h and v_l < V_k(x, y) < v_h, otherwise 0 (6)

Here h_l is the low hue threshold (0.5-0.6), h_h the high hue threshold (0.65-0.75), s_l the low saturation threshold (0.15-0.25), s_h the high saturation threshold (0.8-0.9), v_l the low brightness threshold (0.2-0.35) and v_h the high brightness threshold (0.9-1.0). In this embodiment h_l = 0.55, h_h = 0.7, s_l = 0.2, s_h = 0.8, v_l = 0.3 and v_h = 0.9.
Proceed to step S7.
Step S7: compute the chroma filter coefficient a.
The chroma filter coefficient characterizes the chroma ratio of the seats: the larger the ratio, the smaller the probability that a seat is occupied. It is computed by formula (7):

a = S_{Ca_k} / S_{Se_k} (7)

where S_{Ca_k} is the area of the vacant-seat region Ca_k obtained in step S6, i.e. the number of pixels of value 1 in that region, and S_{Se_k} is the area of the region Se_k obtained in step S4, i.e. the number of pixels of value 1 in that region. In this embodiment the area of Ca_k is 5680, the area of Se_k is 10021, and the coefficient a is 0.567.
Proceed to step S8.
Step S8: initial-frame test.
Judge whether the current frame is the initial frame, i.e. whether k equals 1. If k = 1, proceed to step S9; if k ≠ 1, proceed to step S10.
In this embodiment k = 12, so the procedure proceeds to step S10.
Step S9: compute the seat occupancy flag of the initial frame.
Let fg_k denote the seat occupancy flag of the k-th (k = 1) frame, computed by formula (8):

fg_k = 0 if a > Ta_1, otherwise fg_k = 1 (8)

where Ta_1 is the high threshold, with a value between 0 and 1; in this embodiment Ta_1 = 0.85.
Proceed to step S21.
Step S10: compute the seat occupancy flag of a non-initial frame.
For k ≠ 1, the seat occupancy flag is computed by formula (9):

fg_k = 0 if a > Ta_1; fg_k = 1 if a < Ta_2; fg_k = 2 otherwise (9)

where fg_k = 2 means the occupancy state of frame k is still unknown and requires further analysis, and Ta_2 is the low threshold, with a value between 0 and 1 satisfying Ta_2 ≤ Ta_1; in this embodiment Ta_2 = 0.55. The chroma filter coefficient from step S7 is a = 0.567, which is less than 0.85 and greater than 0.55, so the occupancy flag fg_k is 2.
Proceed to step S11.
Step S11: occupancy-state decision.
If fg_k ≠ 2, i.e. fg_k = 0 or fg_k = 1, proceed to step S21; if fg_k = 2, proceed to step S12.
Step S12: compute the difference values between the current and previous frames.
Compute the hue, saturation, brightness and fused differences between the current frame and the previous frame:

Ch_k(x, y) = |H_k(x, y) − H_{k-1}(x, y)| (10)
Cs_k(x, y) = |S_k(x, y) − S_{k-1}(x, y)| (11)
Cv_k(x, y) = |V_k(x, y) − V_{k-1}(x, y)| (12)
Cz_k(x, y) = Ch_k(x, y) + Cs_k(x, y) + Cv_k(x, y) (13)

where Ch_k, Cs_k, Cv_k and Cz_k are the hue, saturation, brightness and fused differences between frame k and the previous frame, respectively.
Proceed to step S13.
Step S13: detect the difference region and the similar region between the current and previous frames.
For any pixel of frame k: if its difference values between the current and previous frames are large, the pixel is a difference point, and the region formed by all difference points is the difference region; if its difference values are small, the pixel is a similar point, and the region formed by all similar points is the similar region.
Within the region Se_k, apply suitable segmentation thresholds to the difference values to obtain the segmentation map Mf_k, as shown in Fig. 6. In Mf_k the region of pixels of value 1 is the difference region, denoted MD_k, and the region of pixels of value 0 is the similar region, denoted MI_k, computed by formula (14):

Mf_k(x, y) = 1 if Se_k(x, y) = 1 and (Ch_k(x, y) > C_1 or Cs_k(x, y) > C_2 or Cv_k(x, y) > C_3 or Cz_k(x, y) > C_4), otherwise 0 (14)

where C_1, C_2, C_3 and C_4 are segmentation thresholds; C_1, C_2 and C_3 take values in 0.15-0.3 and C_4 takes values in 0.5-0.7. In this embodiment C_1 = C_2 = C_3 = 0.25 and C_4 = 0.6.
Proceed to step S14.
Step S14: extract the difference overlap region.
Extract the overlap of the difference region MD_k (obtained in step S13) with the vacant-seat region Ca_k (obtained in step S6), yielding the vacant-seat difference map HD_k, as shown in Fig. 7, computed by formula (15):

HD_k(x, y) = 1 if MD_k(x, y) = 1 and Ca_k(x, y) = 1, otherwise 0 (15)

Proceed to step S15.
Step S15: compute the difference-region overlap coefficient af.
The difference-region overlap coefficient af is computed as

af = S_{HD_k} / S_{MD_k} (16)

where S_{HD_k} is the area of the vacant-seat difference map HD_k obtained in step S14, i.e. the number of pixels of value 1 in that map, and S_{MD_k} is the area of the difference region MD_k in the segmentation map Mf_k obtained in step S13, i.e. the number of pixels of value 1 in Mf_k. In this embodiment the area of HD_k is 3, the area of MD_k is 232, and the overlap coefficient af is 0.012.
Proceed to step S16.
Step S16: compute the seat occupancy flag.
The seat occupancy flag fg_k is computed by formula (17):

fg_k = 0 if af > Ta_3, otherwise fg_k = 2 (17)

where Ta_3 is the difference threshold, with a value between 0 and 1; in this embodiment Ta_3 = 0.7. The overlap coefficient from step S15 is af = 0.012, which is less than 0.7, so the occupancy flag fg_k is 2.
Proceed to step S17.
Step S17: occupancy-state decision.
If fg_k = 0, proceed to step S21; if fg_k = 2, proceed to step S18.
Step S18: extract the similar overlap region.
Extract the overlap of the similar region MI_k in the segmentation map Mf_k (obtained in step S13) with the previous frame's vacant-seat region Ca_{k-1} (obtained in step S6), yielding the vacant-seat similarity map Ls_k, as shown in Fig. 8, computed by formula (18):

Ls_k(x, y) = 1 if MI_k(x, y) = 1 and Ca_{k-1}(x, y) = 1, otherwise 0 (18)

Proceed to step S19.
Step S19: compute the similar-region overlap coefficient as.
The similar-region overlap coefficient as is computed by formula (19):

as = S_{Ls_k} / S_{Ca_k} (19)

where S_{Ls_k} is the area of the vacant-seat similarity map Ls_k obtained in step S18, i.e. the number of pixels of value 1 in that map, and S_{Ca_k} is the area of the vacant-seat region Ca_k obtained in step S6, i.e. the number of pixels of value 1 in that region. In this embodiment the area of Ls_k is 4740, the area of Ca_k is 4989, and the overlap coefficient as is 0.95.
Proceed to step S20.
Step S20: compute the seat occupancy flag fg_k.
The seat occupancy flag fg_k is computed by formula (20):

fg_k = 0 if as > Ta_4, otherwise fg_k = 1 (20)

where Ta_4 is the similarity threshold, with a value between 0 and 1; in this embodiment Ta_4 = 0.56. The overlap coefficient from step S19 is as = 0.95, which is greater than 0.56, so the occupancy flag fg_k of this embodiment is 0.
Proceed to step S21.
Step S21: occupancy-state decision.
If fg_k = 0, the seat is judged unoccupied, i.e. a vacant seat exists; if fg_k = 1, the seat is judged occupied, i.e. no vacant seat exists.
Return to step S1.
According to the above technical scheme, video data inside the bus is collected and the detection method of the present invention realizes automatic detection of in-vehicle seat occupancy. The seat occupancy data can be used to improve the service level of public transport and the attractiveness of bus travel, and can also provide data support for vehicle-type selection.
The merits of the present scheme are illustrated in terms of running time and detection accuracy.
(1) Running time. In the embodiment, a bus ran between two stops for 2 min 3 s, during which 24 frames were collected. On a computer with an Intel i3 M350 processor and 4 GB of memory, a MATLAB simulation measured an average processing time of 0.19 s per frame on the collected frames; compared with the 5 s/frame acquisition rate, the method is fast and applicable to real-time processing systems.
(2) Detection accuracy. Let W_r denote the number of correctly detected frames, W the total number of detected frames, and η the accuracy, chosen as the evaluation index of detection precision; the higher the accuracy, the higher the detection precision. The accuracy is computed as

η = W_r / W × 100% (21)

In the embodiment, a bus ran from the first stop to the terminus in 1 h 23 min, during which 684 frames were collected between stops; 636 frames were correctly detected, giving an accuracy of 93% for the embodiment. In addition, tests on 8 further video segments all showed accuracies above 90%, indicating good robustness.
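Formula (21) with the embodiment's figures:

```python
# Accuracy per formula (21): share of correctly detected frames,
# using the embodiment's counts (636 of 684 frames).
Wr, W = 636, 684
eta = Wr / W * 100
print(round(eta))  # prints 93, matching the reported accuracy
```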

Claims (8)

1. A public transit vehicle seat-occupancy state detection method, characterized by comprising the following steps:
Step 0: select the seating area Ba to be monitored; the selected seating area Ba is a binary map, in which the region formed by all pixels with Ba(x, y) = 1 represents the seating area and the region formed by all pixels with Ba(x, y) = 0 represents the non-seating area;
Step 1: vehicle arrival detection
Read the GPS position data of the public transit vehicle; if the GPS position of the vehicle at this moment coincides with the position of a bus stop, reset the frame number, i.e. k = 0, and return to step 1; if the positions do not coincide, proceed to step 2;
Step 2: read and store the video frame images captured in the bus between stops
The sampling frequency is 5 s/frame; the collected image frame is denoted by the symbol E_k, where k is the frame number of the current image frame; each time a frame is read, the frame number is incremented by 1, i.e. k = k + 1; the image size is M × N, where M and N are respectively the total number of rows and total number of columns of the image frame E_k;
Step 3: grayscale conversion
Convert the image frame E_k obtained in step 2 to grayscale, obtaining the gray-level image f_k;
Step 4: remove illumination-abnormal regions
Within the seating area Ba obtained in step 0, remove from the gray-level image f_k the regions with abnormal illumination, i.e. regions where the illumination is too strong or too weak, obtaining the seating area to be measured Se_k;
Step 5: convert from RGB color space to HSV color space
Convert the RGB image E_k collected in step 2 to HSV color space, obtaining the HSV image Be_k;
Step 6: apply a color filter to the HSV image Be_k
Within the region of the seating area to be measured Se_k obtained in step 4, apply a color filter to the HSV image Be_k obtained in step 5, obtaining the vacant-seat region Ca_k;
Step 7: calculate the chroma filter coefficient a
Step 8: initial-frame judgment
Judge whether the current detection frame is the initial frame, i.e. whether k equals 1; if k = 1, proceed to step 9; if k ≠ 1, proceed to step 10;
Step 9: calculate the seat-occupancy flag of the initial frame, and proceed to step 21
Let fg_k denote the image seat-occupancy flag of the k-th (k = 1) frame, computed as follows:
where Ta_1 is a high threshold with a value between 0 and 1;
Step 10: calculate the seat-occupancy flag of a non-initial frame
Calculate the seat-occupancy flag of a non-initial frame as follows:
where fg_k = 2 indicates that the seat-occupancy state of the k-th frame is unknown and requires further analysis; Ta_2 is a low threshold with a value between 0 and 1, satisfying Ta_2 ≤ Ta_1;
Step 11: seat-occupancy state judgment
If fg_k ≠ 2, i.e. fg_k = 0 or fg_k = 1, proceed to step 21; if fg_k = 2, proceed to step 12;
Step 12: calculate the difference values between the current frame and the previous frame
Calculate the hue difference, saturation difference, brightness difference and fused difference between the current frame and the previous frame;
Step 13: detect the difference region and similar region between the current frame and the previous frame
For any pixel of the k-th frame, if the difference value between the current frame and the previous frame is greater than a prescribed threshold, the pixel is regarded as a difference point, and the region formed by the difference points is the difference region; if the difference value is less than the threshold, the pixel is regarded as a similar point, and the region formed by the similar points is the similar region;
Within the seating area to be measured Se_k, apply a segmentation threshold to the difference values to obtain the segmentation map Mf_k; in Mf_k, the region formed by pixels with value 1 is the difference region, denoted MD_k, and the region formed by pixels with value 0 is the similar region, denoted MI_k;
Step 14: extract the difference overlap region
Extract the overlap region of the difference region MD_k and the vacant-seat region Ca_k, obtaining the vacant-seat difference map HD_k, computed as follows:
Step 15: calculate the difference-region overlap coefficient af
The difference-region overlap coefficient af is computed as follows:
af = S_Hk / S_Mk (16)
where S_Hk is the area of the vacant-seat difference map HD_k obtained in step 14, i.e. the number of pixels with value 1 in that region; S_Mk is the area of the difference region MD_k obtained in step 13, i.e. the number of pixels with value 1 in that region;
Step 16: calculate the seat-occupancy flag
Calculate the seat-occupancy flag fg_k as follows:
where Ta_3 is the difference threshold with a value between 0 and 1;
Step 17: seat-occupancy state judgment
If fg_k = 0, proceed to step 21; if fg_k = 2, proceed to step 18;
Step 18: extract the similar overlap region
Extract the overlap region of the similar region MI_k and the vacant-seat region Ca_{k-1}, obtaining the vacant-seat similarity map Ls_k, computed as follows:
Step 19: calculate the similar-region overlap coefficient as
The similar-region overlap coefficient as is computed as follows:
as = S_Lk / S_Ck (19)
where S_Lk is the area of the vacant-seat similarity map Ls_k obtained in step 18, i.e. the number of pixels with value 1 in that region; S_Ck is the area of the vacant-seat region Ca_k obtained in step 6, i.e. the total number of pixels with value 1 in that region;
Step 20: calculate the seat-occupancy flag fg_k
Calculate the seat-occupancy flag fg_k as follows:
where Ta_4 is the similarity threshold with a value between 0 and 1;
Step 21: seat-occupancy state judgment
If fg_k = 0, the seat at this moment is judged unoccupied (vacant-seat state); if fg_k = 1, the seat at this moment is judged occupied (occupied-seat state).
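The area-ratio computations of steps 14, 15, 18 and 19 (equations 16 and 19) reduce to counting pixels in intersections of binary masks. A minimal sketch follows; the mask values and the empty-region guards (`max(..., 1)`) are illustrative assumptions, not part of the claim.

```python
import numpy as np

def overlap_coefficients(MD_k, MI_k, Ca_k, Ca_prev):
    """Overlap coefficients of steps 15 and 19, from boolean masks:
    af  = |MD_k & Ca_k|    / |MD_k|  (eq. 16)
    as_ = |MI_k & Ca_prev| / |Ca_k|  (eq. 19)
    The max(..., 1) guards against empty regions are added assumptions."""
    HD_k = MD_k & Ca_k        # step 14: vacant-seat difference map
    Ls_k = MI_k & Ca_prev     # step 18: vacant-seat similarity map
    af = HD_k.sum() / max(MD_k.sum(), 1)
    as_ = Ls_k.sum() / max(Ca_k.sum(), 1)
    return af, as_

MD = np.array([[1, 1], [0, 0]], dtype=bool)    # difference region MD_k
MI = ~MD                                        # similar region MI_k
Ca = np.array([[1, 0], [0, 0]], dtype=bool)    # vacant-seat region, frame k
Ca_p = np.array([[0, 0], [1, 0]], dtype=bool)  # vacant-seat region, frame k-1
af, as_ = overlap_coefficients(MD, MI, Ca, Ca_p)
```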
2. The public transit vehicle seat-occupancy state detection method according to claim 1, characterized in that, in step 3, the image frame E_k obtained in step 2 is converted to grayscale by the following formula:
f_k(x, y) = 0.3 × R_k(x, y) + 0.59 × G_k(x, y) + 0.11 × B_k(x, y) (1)
where f_k(x, y) is the gray value of pixel (x, y) of the gray-level image f_k, and R_k(x, y), G_k(x, y) and B_k(x, y) are respectively the red, green and blue component values of pixel (x, y) of the image frame E_k.
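The weighted sum of equation (1) vectorizes directly over an RGB image array; a minimal sketch:

```python
import numpy as np

def to_gray(frame):
    """Equation (1): f_k = 0.3*R + 0.59*G + 0.11*B, applied per pixel.
    `frame` is an (M, N, 3) array with channels ordered R, G, B."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return 0.3 * r + 0.59 * g + 0.11 * b

pixel = np.array([[[200.0, 100.0, 50.0]]])  # one RGB pixel
f = to_gray(pixel)  # 0.3*200 + 0.59*100 + 0.11*50 = 124.5
```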
3. The public transit vehicle seat-occupancy state detection method according to claim 1, characterized in that, in step 4, within the seating area Ba obtained in step 0, the illumination-abnormal regions of the gray-level image f_k are removed by the following formula:
where δ_1 is the dark-region threshold with a value of 0-50, and δ_2 is the bright-region threshold with a value of 230-256.
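The patent's formula (2) is an image and is not reproduced in this text, but the description says the step keeps only seating-area pixels whose illumination is neither too low nor too high. A sketch under that reading; the combination rule and the default thresholds are assumptions within the claimed ranges (δ_1: 0-50, δ_2: 230-256).

```python
import numpy as np

def remove_abnormal_illumination(f_k, Ba, delta1=40, delta2=240):
    """Sketch of step 4 / claim 3: within the seating area Ba, keep only
    pixels whose gray value lies in [delta1, delta2].  The exact formula
    and the defaults are assumptions, not the patent's image formula."""
    return Ba & (f_k >= delta1) & (f_k <= delta2)

f = np.array([[10, 120], [250, 200]])          # gray values
Ba = np.array([[1, 1], [1, 0]], dtype=bool)    # monitored seating area
Se = remove_abnormal_illumination(f, Ba)       # seating area to be measured
```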
4. The public transit vehicle seat-occupancy state detection method according to claim 1, characterized in that, in step 5, the image frame E_k collected in step 2 is converted to HSV color space by the following formulas:
H_k(x, y) = arccos{ [(R_k(x, y) − G_k(x, y)) + (R_k(x, y) − B_k(x, y))] / (2 · sqrt((R_k(x, y) − G_k(x, y))² + (R_k(x, y) − B_k(x, y)) · (G_k(x, y) − B_k(x, y)))) } (3)
S_k(x, y) = [max(R_k(x, y), G_k(x, y), B_k(x, y)) − min(R_k(x, y), G_k(x, y), B_k(x, y))] / max(R_k(x, y), G_k(x, y), B_k(x, y)) (4)
V_k(x, y) = max(R_k(x, y), G_k(x, y), B_k(x, y)) / 255 (5)
where H_k(x, y), S_k(x, y) and V_k(x, y) are respectively the hue, saturation and brightness values of pixel (x, y).
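Equations (3)-(5) can be sketched for a single pixel as follows; the zero-denominator guards are added assumptions, since the claim does not specify behavior for achromatic or black pixels.

```python
import math

def rgb_to_hsv_patent(r, g, b):
    """Equations (3)-(5) for one pixel, with R, G, B in 0-255:
    H = arccos( 0.5*((R-G)+(R-B)) / sqrt((R-G)^2 + (R-B)*(G-B)) )
    S = (max - min) / max,  V = max / 255.
    The `if den`/`if mx` guards are added assumptions."""
    mx, mn = max(r, g, b), min(r, g, b)
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    h = math.acos(num / den) if den else 0.0   # eq. (3), radians
    s = (mx - mn) / mx if mx else 0.0          # eq. (4)
    v = mx / 255.0                             # eq. (5)
    return h, s, v

h, s, v = rgb_to_hsv_patent(255, 0, 0)  # pure red: h = 0, s = 1, v = 1
```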
5. The public transit vehicle seat-occupancy state detection method according to claim 1, characterized in that, in step 6, within the region of the seating area to be measured Se_k obtained in step 4, the color filter is applied to the HSV image Be_k obtained in step 5 by the following formula:
where h_l is the low hue threshold, with a value range of 0-1; h_h is the high hue threshold, with a value range of 0-1 and satisfying h_l < h_h; s_l is the low saturation threshold, with a value range of 0-1; s_h is the high saturation threshold, with a value of 0-1 and satisfying s_l < s_h; v_l is the low brightness threshold, with a value of 0-1; v_h is the high brightness threshold, with a value of 0-1 and satisfying v_l < v_h.
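The filter formula (6) itself is an image in the patent; the claim only describes six band thresholds on hue, saturation and brightness. A band-pass sketch under that reading; the conjunctive combination and the default threshold values are assumptions.

```python
import numpy as np

def color_filter(H, S, V, Se, hl=0.1, hh=0.4, sl=0.2, sh=0.8, vl=0.2, vh=0.9):
    """Sketch of step 6 / claim 5: within Se_k, mark as vacant-seat pixels
    those whose hue, saturation and brightness all fall inside the bands
    [hl, hh], [sl, sh], [vl, vh].  The combination rule and defaults are
    assumptions; the claim only constrains each to 0-1 with low < high."""
    return (Se & (H >= hl) & (H <= hh)
               & (S >= sl) & (S <= sh)
               & (V >= vl) & (V <= vh))

H = np.array([[0.2, 0.5]])
S = np.array([[0.5, 0.5]])
V = np.array([[0.5, 0.5]])
Se = np.ones((1, 2), dtype=bool)   # whole region under test
Ca = color_filter(H, S, V, Se)     # vacant-seat region Ca_k
```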
6. The public transit vehicle seat-occupancy state detection method according to claim 1, characterized in that, in step 7, the chroma filter coefficient a is computed as follows:
a = S_Ck / S_Sk (7)
where S_Ck is the area of the vacant-seat region Ca_k obtained in step 6, i.e. the total number of pixels with value 1 in that region; S_Sk is the area of the seating area to be measured Se_k obtained in step 4, i.e. the total number of pixels with value 1 in that region.
7. The public transit vehicle seat-occupancy state detection method according to claim 1, characterized in that, in step 12, the hue difference, saturation difference, brightness difference and fused difference between the current frame and the previous frame are computed as follows:
Ch_k(x, y) = |H_k(x, y) − H_{k−1}(x, y)| (10)
Cs_k(x, y) = |S_k(x, y) − S_{k−1}(x, y)| (11)
Cv_k(x, y) = |V_k(x, y) − V_{k−1}(x, y)| (12)
Cz_k(x, y) = Ch_k(x, y) + Cs_k(x, y) + Cv_k(x, y) (13)
where Ch_k is the hue difference between the k-th frame and the previous frame; Cs_k is the saturation difference; Cv_k is the brightness difference; and Cz_k is the fused difference between the k-th frame and the previous frame.
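Equations (10)-(13) vectorize directly over per-channel arrays of the current and previous frames:

```python
import numpy as np

def frame_differences(H_k, S_k, V_k, H_prev, S_prev, V_prev):
    """Equations (10)-(13): per-pixel absolute hue, saturation and
    brightness differences against the previous frame, plus their fused sum."""
    Ch = np.abs(H_k - H_prev)   # eq. (10)
    Cs = np.abs(S_k - S_prev)   # eq. (11)
    Cv = np.abs(V_k - V_prev)   # eq. (12)
    Cz = Ch + Cs + Cv           # eq. (13)
    return Ch, Cs, Cv, Cz

Ch, Cs, Cv, Cz = frame_differences(
    np.array([0.5]), np.array([0.4]), np.array([0.9]),
    np.array([0.2]), np.array([0.6]), np.array([0.7]))
# Ch ~ 0.3, Cs ~ 0.2, Cv ~ 0.2, Cz ~ 0.7
```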
8. The public transit vehicle seat-occupancy state detection method according to claim 1, characterized in that, in step 13, the computing formula is as follows:
where C_1, C_2, C_3 and C_4 are segmentation thresholds; the values of C_1, C_2 and C_3 are 0.15-0.3, and the value of C_4 is 0.5-0.7.
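The patent's segmentation formulas (14)-(15) are images and not reproduced in this text; only the four thresholds and their ranges survive. A sketch of step 13 under one plausible reading, in which a pixel is a difference point when any single-channel difference exceeds its threshold or the fused difference exceeds C_4; this combination rule and the default values are assumptions.

```python
import numpy as np

def segment_differences(Ch, Cs, Cv, Cz, C1=0.2, C2=0.2, C3=0.2, C4=0.6):
    """Sketch of step 13 / claim 8: build the segmentation map Mf_k from the
    difference values.  The combination rule used here is an assumption;
    defaults lie in the claimed ranges (C1-C3: 0.15-0.3, C4: 0.5-0.7)."""
    Mf = (Ch > C1) | (Cs > C2) | (Cv > C3) | (Cz > C4)
    return Mf.astype(np.uint8)  # 1: difference region MD_k, 0: similar region MI_k

Mf = segment_differences(np.array([0.25, 0.05]), np.array([0.1, 0.1]),
                         np.array([0.1, 0.1]), np.array([0.45, 0.25]))
```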
CN201510732676.4A 2015-11-02 2015-11-02 A kind of public transit vehicle seat occupancy states detection method Expired - Fee Related CN105404856B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510732676.4A CN105404856B (en) 2015-11-02 2015-11-02 A kind of public transit vehicle seat occupancy states detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510732676.4A CN105404856B (en) 2015-11-02 2015-11-02 A kind of public transit vehicle seat occupancy states detection method

Publications (2)

Publication Number Publication Date
CN105404856A true CN105404856A (en) 2016-03-16
CN105404856B CN105404856B (en) 2018-08-24

Family

ID=55470333

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510732676.4A Expired - Fee Related CN105404856B (en) 2015-11-02 2015-11-02 A kind of public transit vehicle seat occupancy states detection method

Country Status (1)

Country Link
CN (1) CN105404856B (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070289799A1 (en) * 2006-06-20 2007-12-20 Takata Corporation Vehicle occupant detecting system
CN102867188A (en) * 2012-07-26 2013-01-09 中国科学院自动化研究所 Method for detecting seat state in meeting place based on cascade structure
CN104504377A (en) * 2014-12-25 2015-04-08 中邮科通信技术股份有限公司 Bus passenger crowding degree identification system and method


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106127124A (en) * 2016-06-17 2016-11-16 长安大学 The automatic testing method of the abnormal image signal in region, taxi front row
CN106127127A (en) * 2016-06-17 2016-11-16 长安大学 The taxi monitoring abnormal image signal detection method of static zones characteristic matching
CN106127127B (en) * 2016-06-17 2019-05-14 长安大学 The taxi of static zones characteristic matching monitors abnormal image signal detection method
CN106503668A (en) * 2016-11-02 2017-03-15 四川长虹电器股份有限公司 Status inquiry system and method in seat in bus arrival front truck
CN108090411A (en) * 2016-11-23 2018-05-29 福特全球技术公司 Traffic lights detection and classification are carried out using computer vision and deep learning
CN109544659A (en) * 2018-10-15 2019-03-29 阿里巴巴集团控股有限公司 The generation method and device of schematic diagram
CN109544659B (en) * 2018-10-15 2023-07-18 创新先进技术有限公司 Schematic diagram generation method and device
CN113361315A (en) * 2021-02-23 2021-09-07 仲恺农业工程学院 Banana string identification method based on background saturation compression and difference threshold segmentation fusion
CN113361315B (en) * 2021-02-23 2021-12-07 仲恺农业工程学院 Banana string identification method based on background saturation compression and difference threshold segmentation fusion

Also Published As

Publication number Publication date
CN105404856B (en) 2018-08-24

Similar Documents

Publication Publication Date Title
CN105404856A (en) Public traffic vehicle seat occupied state detection method
CN106373426B (en) Parking stall based on computer vision and violation road occupation for parking monitoring method
TWI475524B (en) System and method for inspection of cars that violate traffic regulations using images
CN105744232B (en) A kind of method of the transmission line of electricity video external force damage prevention of Behavior-based control analytical technology
Bibi et al. Automatic parking space detection system
CN107665603A (en) A kind of real-time detection method for judging parking stall and taking
CN105913685A (en) Video surveillance-based carport recognition and intelligent guide method
CN109670404A (en) A kind of road ponding image detection method for early warning based on mixed model
CN102496058B (en) Passenger flow density detection method
CN106056968B (en) A kind of method for detecting parking stalls based on optical imagery
CN105702048A (en) Automobile-data-recorder-based illegal lane occupation identification system and method for automobile on highway
CN105118305B (en) Motor pool outlet vehicle management platform
CN105243701A (en) Driving information reporting method and driving recording terminal
CN104464290A (en) Road traffic parameter collecting and rule violation snapshot system based on embedded double-core chip
CN103021179B (en) Based on the Safe belt detection method in real-time monitor video
CN106127124A (en) The automatic testing method of the abnormal image signal in region, taxi front row
CN104805784A (en) Gate fare evasion detection system and gate fare evasion detection method
CN104966049A (en) Lorry detection method based on images
CN109559519A (en) Monitoring device and its parking offense detection method, device, readable storage medium storing program for executing
CN105740836B (en) A kind of illegal detection method for occupying Emergency Vehicle Lane
CN102024148A (en) Method for identifying green mark of taxi
CN103164958A (en) Method and system for vehicle monitoring
CN112988830A (en) People flow statistical method, device, system, storage medium and computer equipment
CN104517444B (en) A kind of detecting system driving to use mobile phone illegal activities
CN111984806A (en) Method, device and storage medium for determining association degree of vehicle and terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180824

Termination date: 20191102