CN107169918A - Method, device and system for determining a moving target within a target range - Google Patents

Method, device and system for determining a moving target within a target range

Info

Publication number
CN107169918A
CN107169918A (application number CN201610130964.7A; granted publication CN107169918B)
Authority
CN
China
Prior art keywords
pixel
vertical
longitude
latitude
horizontal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610130964.7A
Other languages
Chinese (zh)
Other versions
CN107169918B (en)
Inventor
彭利 (Peng Li)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ricoh Co Ltd
Original Assignee
Ricoh Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ricoh Co Ltd
Priority to CN201610130964.7A
Publication of CN107169918A
Application granted
Publication of CN107169918B
Active legal status (current)
Anticipated expiration

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/18 Image warping, e.g. rearranging pixels individually
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10052 Images from lightfield camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30232 Surveillance

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)

Abstract

The present invention provides a method, device and system for determining a moving target within a target range. The method includes: obtaining a fisheye panoramic image of the target range; converting the coordinates of each fisheye panoramic pixel into spherical coordinates as corresponding sphere pixels, which constitute a spherical image; converting the spherical coordinates of at least a portion of the sphere pixels into horizontal longitude-latitude coordinates and vertical longitude-latitude coordinates, respectively, to constitute a horizontal longitude-latitude image and a vertical longitude-latitude image; dividing the horizontal longitude-latitude image and the vertical longitude-latitude image into multiple horizontal longitude-latitude regions and multiple vertical longitude-latitude regions, respectively; for each horizontal longitude-latitude region, calculating the horizontal maximum gray difference and a horizontal threshold, and for each vertical longitude-latitude region, calculating the vertical maximum gray difference and a vertical threshold; marking each horizontal longitude-latitude region and each vertical longitude-latitude region accordingly; marking the sphere pixels; and determining the sphere pixels marked with a first symbol as the moving target.

Description

Method, device and system for determining a moving target within a target range
Technical field
The present invention relates to photographing a panoramic image of a target range, and more particularly to a method, device and system for determining a moving target within the target range.
Background technology
Shooting 360-degree panoramic images (including photos and video) has recently become popular, and dual-fisheye panoramic cameras capable of capturing 360-degree panoramic images are already on the market. They can be applied in many fields such as exhibition, film and television, gaming, and surveillance. In the surveillance field, tracking dynamic objects (vehicles, people, pets) is one of the most important functions. A dual-fisheye panoramic camera is small and easy to conceal, and a single camera can monitor a 360-degree field of view, reducing the number of monitoring devices required.
Most existing dynamic-object tracking algorithms are designed for ordinary flat images; few algorithms detect moving targets (moving objects) in 360-degree panoramic images. For 360-degree dual-fisheye images, the traditional longitude-latitude unwrapping algorithm unwraps at only one angle, and the top and bottom of the unwrapped image are severely distorted (magnified many times), which causes false detections. Differential comparison algorithms use a fixed threshold, so false detections and missed detections also occur under different illumination intensities.
Summary of the invention
To address the problems in the prior art, the present invention provides a method, device and system for determining a moving target within a target range. It improves the traditional longitude-latitude unwrapping algorithm by performing the unwrapping at both a horizontal angle and a vertical angle simultaneously, which improves the detection accuracy for moving objects at the top and bottom of the image. The present invention also sets the threshold dynamically using the gray values of the image pixels, thereby reducing false detections and missed detections.
The present invention provides a determination method for determining a moving target within a target range, the determination method comprising the following steps:
a) shooting the target range with a panoramic camera to obtain a fisheye panoramic image of the target range and the gray value of each fisheye panoramic pixel in the fisheye panoramic image at time t1 and at time t2;
b) converting the coordinates of each fisheye panoramic pixel in the fisheye panoramic image into spherical coordinates as corresponding sphere pixels, so as to constitute a spherical image;
c) converting the spherical coordinates of at least a portion of the sphere pixels in the spherical image into horizontal longitude-latitude coordinates and vertical longitude-latitude coordinates, respectively, as corresponding horizontal pixels and corresponding vertical pixels, so as to constitute a horizontal longitude-latitude image and a vertical longitude-latitude image;
d) dividing the horizontal longitude-latitude image and the vertical longitude-latitude image into multiple horizontal longitude-latitude regions and multiple vertical longitude-latitude regions, respectively, each horizontal longitude-latitude region having multiple horizontal pixels and each vertical longitude-latitude region having multiple vertical pixels;
e) for each horizontal longitude-latitude region, calculating the gray difference between time t1 and time t2 of each of the multiple horizontal pixels to obtain multiple horizontal gray differences, taking the maximum of the multiple horizontal gray differences as the horizontal maximum gray difference, and calculating a horizontal threshold from the average of the multiple horizontal gray differences; and
for each vertical longitude-latitude region, calculating the gray difference between time t1 and time t2 of each of the multiple vertical pixels to obtain multiple vertical gray differences, taking the maximum of the multiple vertical gray differences as the vertical maximum gray difference, and calculating a vertical threshold from the average of the multiple vertical gray differences;
f) for each horizontal longitude-latitude region, comparing the horizontal maximum gray difference with the horizontal threshold; if the horizontal maximum gray difference is greater than the horizontal threshold, marking the corresponding horizontal longitude-latitude region with a first symbol, otherwise with a second symbol different from the first symbol; and
for each vertical longitude-latitude region, comparing the vertical maximum gray difference with the vertical threshold; if the vertical maximum gray difference is greater than the vertical threshold, marking the corresponding vertical longitude-latitude region with the first symbol, otherwise with the second symbol;
g) marking each sphere pixel in the spherical image with the first symbol or the second symbol according to the symbols with which each horizontal longitude-latitude region and each vertical longitude-latitude region are marked;
h) determining the sphere pixels marked with the first symbol in the spherical image as the moving target.
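The core of steps d) through f), dividing an unwrapped image into regions, computing the per-region maximum and average gray difference, and comparing the maximum against a dynamic threshold, can be sketched as follows. This is a minimal NumPy illustration under stated assumptions: the function name and toy frames are invented here, and T = 50, λ = 2 are only example coefficients, not fixed by the patent.

```python
import numpy as np

def mark_moving_regions(img_t1, img_t2, block=16, T=50, lam=2.0):
    """Mark each block x block region whose maximum gray difference
    between t1 and t2 exceeds the dynamic threshold T + lam * mean."""
    diff = np.abs(img_t1.astype(float) - img_t2.astype(float))
    h, w = diff.shape
    nh, nw = h // block, w // block
    blocks = diff[:nh * block, :nw * block].reshape(nh, block, nw, block)
    max_d = blocks.max(axis=(1, 3))      # per-region maximum gray difference
    mean_d = blocks.mean(axis=(1, 3))    # per-region average gray difference
    return (max_d > T + lam * mean_d).astype(int)  # 1 = first symbol

# Toy 32x32 frames: a small 4x4 patch changes between t1 and t2.
t1 = np.zeros((32, 32), dtype=np.uint8)
t2 = t1.copy()
t2[:4, :4] = 200
marks = mark_moving_regions(t1, t2)
print(marks.tolist())   # [[1, 0], [0, 0]]
```

Only the region containing the changed patch exceeds its own dynamic threshold (50 + 2 × 12.5 = 75 < 200), so only it is marked with the first symbol.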
The panoramic camera has two fisheye lenses, so that two partial fisheye panoramic images are formed in the fisheye panoramic image; the two partial fisheye panoramic images correspond respectively to two hemispherical images in the spherical image.
With two fisheye lenses, a 360-degree panoramic shot of the target range can be taken.
In step b), the coordinates of the a-th fisheye panoramic pixel in the fisheye panoramic image are (x_a, y_a); converting (x_a, y_a) according to the following equation relation yields the spherical coordinates (r_sphere, θ_a, φ_a) of the corresponding sphere pixel a_sphere:
x_a = r_sphere × θ_a × cos φ_a
y_a = r_sphere × θ_a × sin φ_a
where 1 ≤ a ≤ b, b is the number of fisheye panoramic pixels, a and b are integers, r_sphere is the spherical radius of the spherical image, determined by the pixel resolution of the fisheye panoramic image, θ_a is the elevation angle of the sphere pixel a_sphere, and φ_a is the azimuth angle of the sphere pixel a_sphere.
In this way, a corresponding three-dimensional spherical image can be obtained from the fisheye panoramic image.
In step c), the sphere pixels whose elevation angle θ lies within a predetermined angular range are taken as the at least a portion of sphere pixels. The predetermined angular range is greater than 30 degrees and less than 150 degrees.
In this way, only the undistorted region is analyzed and processed, which improves the detection accuracy for the moving target. In addition, since the number of converted pixels is reduced, the processing speed of the system can also be improved.
In step c), the horizontal longitude-latitude coordinates (x_a_h, y_a_h) of the corresponding horizontal pixel a_h are obtained by equation relation 1 below, and the vertical longitude-latitude coordinates (x_a_v, y_a_v) of the corresponding vertical pixel a_v are obtained by equation relation 2 below.
In this way, the corresponding horizontal longitude-latitude image and vertical longitude-latitude image can be obtained, which facilitates subsequent processing.
In step d), there are n × 2n horizontal longitude-latitude regions and n × 2n vertical longitude-latitude regions; each horizontal longitude-latitude region has m × m horizontal pixels and each vertical longitude-latitude region has m × m vertical pixels, where n and m are integers greater than 1. The horizontal threshold and the vertical threshold are obtained by equation relation 3 below:
C1 = T + λ × A1
C2 = T + λ × A2
where C1 is the horizontal threshold, C2 is the vertical threshold, A1 is the average of the m × m horizontal gray differences, A2 is the average of the m × m vertical gray differences, and T and λ are predetermined coefficients.
In this way, the threshold can be set dynamically, reducing false detections and missed detections.
In step g), the sphere pixels in the spherical image whose coordinates correspond to horizontal pixels in the horizontal longitude-latitude image are marked with the symbol of the horizontal longitude-latitude region containing those horizontal pixels; the sphere pixels whose coordinates correspond to vertical pixels in the vertical longitude-latitude image are marked with the symbol of the vertical longitude-latitude region containing those vertical pixels; and the sphere pixels whose coordinates correspond both to a horizontal pixel in the horizontal longitude-latitude image and to a vertical pixel in the vertical longitude-latitude image are marked with the symbol of the horizontal longitude-latitude region containing the horizontal pixel, so that each sphere pixel is marked with either the first symbol or the second symbol.
In this way, the sphere pixels in the spherical image can be marked, which facilitates subsequent processing.
The present invention also provides a determining device for determining a moving target within a target range, the determining device comprising:
an acquiring unit, which shoots the target range with a panoramic camera and obtains the fisheye panoramic image of the target range and the gray value of each fisheye panoramic pixel in the fisheye panoramic image at time t1 and at time t2;
a spherical-image conversion unit, which converts the coordinates of each fisheye panoramic pixel in the fisheye panoramic image into spherical coordinates as corresponding sphere pixels, to constitute a spherical image;
a longitude-latitude image conversion unit, which converts the spherical coordinates of at least a portion of the sphere pixels in the spherical image into horizontal longitude-latitude coordinates and vertical longitude-latitude coordinates, respectively, as corresponding horizontal pixels and corresponding vertical pixels, to constitute a horizontal longitude-latitude image and a vertical longitude-latitude image;
a division unit, which divides the horizontal longitude-latitude image and the vertical longitude-latitude image into multiple horizontal longitude-latitude regions and multiple vertical longitude-latitude regions, respectively, each horizontal longitude-latitude region having multiple horizontal pixels and each vertical longitude-latitude region having multiple vertical pixels;
a computing unit, which, for each horizontal longitude-latitude region, calculates the gray difference between time t1 and time t2 of each of the multiple horizontal pixels to obtain multiple horizontal gray differences, takes the maximum of the multiple horizontal gray differences as the horizontal maximum gray difference, and calculates a horizontal threshold from the average of the multiple horizontal gray differences; and
which, for each vertical longitude-latitude region, calculates the gray difference between time t1 and time t2 of each of the multiple vertical pixels to obtain multiple vertical gray differences, takes the maximum of the multiple vertical gray differences as the vertical maximum gray difference, and calculates a vertical threshold from the average of the multiple vertical gray differences;
a comparing and marking unit, which, for each horizontal longitude-latitude region, compares the horizontal maximum gray difference with the horizontal threshold, marking the corresponding horizontal longitude-latitude region with a first symbol if the horizontal maximum gray difference is greater than the horizontal threshold and otherwise with a second symbol different from the first symbol; and
which, for each vertical longitude-latitude region, compares the vertical maximum gray difference with the vertical threshold, marking the corresponding vertical longitude-latitude region with the first symbol if the vertical maximum gray difference is greater than the vertical threshold and otherwise with the second symbol;
a sphere-pixel marking unit, which marks each sphere pixel in the spherical image with the first symbol or the second symbol according to the symbols with which each horizontal longitude-latitude region and each vertical longitude-latitude region are marked;
a moving-target determination unit, which determines the sphere pixels marked with the first symbol in the spherical image as the moving target.
The present invention also provides a system for determining a moving target within a target range, the system comprising:
a panoramic camera, which shoots the target range;
the device described above, which determines the moving target according to the fisheye panoramic image of the target range shot by the panoramic camera; and
a display device for displaying the moving target determined by the determining device.
Brief description of the drawings
Fig. 1 is a schematic diagram of a system for determining a moving target within a target range according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of a device for determining a moving target within a target range according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the flow of a method for determining a moving target within a target range according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of a fisheye panoramic image of a target range according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a horizontal longitude-latitude image according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of a vertical longitude-latitude image according to an embodiment of the present invention.
Embodiments
Embodiments of the invention are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of a system 1 for determining a moving target within a target range according to an embodiment of the present invention. As shown in Fig. 1, the system 1 includes a panoramic camera 10, a determining device 11 for determining the moving target within the target range, and a display device 12. The panoramic camera 10 shoots the target range; the determining device 11 determines the moving target according to the fisheye panoramic image of the target range shot by the panoramic camera 10; and the display device 12 displays the moving target determined by the determining device 11. The determining device 11 may be, for example, a computer, a smartphone, or a tablet computer. The display device 12 is, for example, a liquid crystal display.
Fig. 2 is a schematic diagram of the determining device 11 for determining the moving target within the target range according to an embodiment of the present invention. As shown in Fig. 2, the determining device 11 includes an acquiring unit 110, a spherical-image conversion unit 112, a longitude-latitude image conversion unit 114, a division unit 116, a computing unit 118, a comparing and marking unit 120, a sphere-pixel marking unit 122, and a moving-target determination unit 124.
Fig. 3 is a schematic diagram of the flow of the determination method for determining the moving target within the target range according to an embodiment of the present invention.
As shown in Fig. 3, in step S31, the target range is shot with the panoramic camera 10, and the acquiring unit 110 obtains the fisheye panoramic image of the target range and the gray value of each fisheye panoramic pixel in the fisheye panoramic image at time t1 and at time t2.
The panoramic camera 10 has two fisheye lenses, so two partial fisheye panoramic images are formed in the fisheye panoramic image; the two partial fisheye panoramic images are two circular two-dimensional fisheye panoramic images. Fig. 4 is a schematic diagram of the fisheye panoramic image of the target range according to an embodiment of the present invention, with two partial fisheye panoramic images.
In step S32, the spherical-image conversion unit 112 converts the coordinates of each fisheye panoramic pixel in the fisheye panoramic image into spherical coordinates as corresponding sphere pixels, to constitute a spherical image.
Here, the coordinates of the a-th fisheye panoramic pixel in the fisheye panoramic image are (x_a, y_a). Using equation relation 1 below, the spherical-image conversion unit 112 converts (x_a, y_a) to obtain the spherical coordinates (r_sphere, θ_a, φ_a) of the corresponding sphere pixel a_sphere:
x_a = r_sphere × θ_a × cos φ_a
y_a = r_sphere × θ_a × sin φ_a
where 1 ≤ a ≤ b, b is the number of fisheye panoramic pixels, a and b are integers, r_sphere is the spherical radius of the spherical image, determined by the pixel resolution of the fisheye panoramic image, θ_a is the elevation angle of the sphere pixel a_sphere, and φ_a is the azimuth angle of the sphere pixel a_sphere.
In the present embodiment, for example, the pixel resolution of the fisheye panoramic image is 2048 × 1024, where π × r_sphere = 1024, so r_sphere is calculated to be 326. Suppose the coordinates of the a-th fisheye panoramic pixel are (-100, 200); substituting into equation relation 1 above gives
-100 = 326 × θ_a × cos φ_a
200 = 326 × θ_a × sin φ_a
Solving yields θ_a = 0.686 and φ_a = 0.805, i.e. the spherical coordinates (r_sphere, θ_a, φ_a) of the corresponding sphere pixel a_sphere are (326, 0.686, 0.805). That is, the spherical coordinates of the sphere pixel a_sphere on the spherical image are (326, 0.686, 0.805). The gray value of the corresponding sphere pixel a_sphere is simply the gray value of the a-th fisheye panoramic pixel.
In this way, the coordinates of each fisheye panoramic pixel can be converted into the spherical coordinates of the corresponding sphere pixel on the spherical image. As described above, the two partial fisheye panoramic images correspond respectively to two hemispherical images in the spherical image.
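The conversion of step S32 can be sketched by inverting equation relation 1, as in the following minimal Python illustration. The function name is invented here, and the `atan2`-based azimuth is an assumption: the patent's exact angular convention for φ is not spelled out, so only the elevation angle is checked against the worked example.

```python
import math

def fisheye_to_sphere(x, y, r_sphere):
    """Invert x = r*theta*cos(phi), y = r*theta*sin(phi)
    (equation relation 1) for the sphere pixel's angles."""
    theta = math.hypot(x, y) / r_sphere   # elevation angle (radians)
    phi = math.atan2(y, x)                # azimuth (radians); the patent's
                                          # angular convention may differ
    return theta, phi

# Worked example from the embodiment: 2048 x 1024 fisheye image,
# pi * r_sphere = 1024, pixel (-100, 200).
r_sphere = round(1024 / math.pi)          # -> 326
theta, phi = fisheye_to_sphere(-100, 200, r_sphere)
print(r_sphere, round(theta, 3))          # 326 0.686
```

The recovered elevation angle matches the embodiment's value of 0.686.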
Next, in step S33, the longitude-latitude image conversion unit 114 converts the spherical coordinates of at least a portion of the sphere pixels in the spherical image into horizontal longitude-latitude coordinates and vertical longitude-latitude coordinates, respectively, as corresponding horizontal pixels and corresponding vertical pixels, to constitute a horizontal longitude-latitude image and a vertical longitude-latitude image.
The longitude-latitude image conversion unit 114 takes the sphere pixels whose elevation angle θ lies within a predetermined angular range as the at least a portion of sphere pixels; the predetermined angular range may be greater than 30 degrees and less than 150 degrees, i.e. the sphere pixels with 30° < θ < 150° are taken as the at least a portion of sphere pixels.
Here, if the sphere pixels outside this portion were also converted into corresponding horizontal pixels and vertical pixels, they would become the severely distorted parts at the top and bottom of the horizontal and vertical longitude-latitude images, and these parts would reduce the detection accuracy for the moving target. Therefore, by converting only the sphere pixels with 30° < θ < 150°, the detection accuracy for the moving target can be improved. In addition, since the number of converted pixels is reduced, the processing speed of the system can also be improved.
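The elevation-angle filter of step S33 can be sketched as follows; only the 30-to-150-degree bounds come from the text, while the array name and sample angles are toy values chosen for illustration.

```python
import numpy as np

# Keep only sphere pixels with 30 deg < theta < 150 deg (step S33).
# 'thetas' holds toy elevation angles in radians.
thetas = np.array([0.1, 0.686, 1.57, 2.5, 3.0])
mask = (thetas > np.deg2rad(30)) & (thetas < np.deg2rad(150))
kept = thetas[mask]
print(int(mask.sum()))   # 3 pixels survive the filter
```

Only the angles strictly between roughly 0.524 and 2.618 radians are retained for conversion.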
Using equation relation 2 below, the longitude-latitude image conversion unit 114 converts the spherical coordinates (r_sphere, θ_a, φ_a) into the horizontal longitude-latitude coordinates (x_a_h, y_a_h) of the corresponding horizontal pixel a_h, and using equation relation 3 below, into the vertical longitude-latitude coordinates (x_a_v, y_a_v) of the corresponding vertical pixel a_v.
In this example, the spherical coordinates of the sphere pixel a_sphere obtained in step S32 are (326, 0.686, 0.805). According to equation relations 2 and 3 above, the horizontal longitude-latitude coordinates of the horizontal pixel a_h corresponding to a_sphere are calculated as (268, 288), and the vertical longitude-latitude coordinates of the vertical pixel a_v corresponding to a_sphere as (774, 288). The gray value of the corresponding horizontal pixel a_h and the gray value of the corresponding vertical pixel a_v are simply the gray value of the sphere pixel a_sphere, that is, the gray value of the a-th fisheye panoramic pixel. In other words, the gray value of each horizontal pixel and each vertical pixel is the gray value of the corresponding fisheye panoramic pixel.
Each sphere pixel is converted into a corresponding horizontal pixel and a corresponding vertical pixel in the manner described above, so as to constitute the horizontal longitude-latitude image and the vertical longitude-latitude image, respectively, as shown in Fig. 5 and Fig. 6. Fig. 5 is a schematic diagram of the horizontal longitude-latitude image according to an embodiment of the present invention; Fig. 6 is a schematic diagram of the vertical longitude-latitude image according to an embodiment of the present invention.
In step S34, the division unit 116 divides the horizontal longitude-latitude image and the vertical longitude-latitude image into multiple horizontal longitude-latitude regions and multiple vertical longitude-latitude regions, respectively; each horizontal longitude-latitude region has multiple horizontal pixels, and each vertical longitude-latitude region has multiple vertical pixels.
There are n × 2n horizontal longitude-latitude regions and n × 2n vertical longitude-latitude regions; each horizontal longitude-latitude region has m × m horizontal pixels and each vertical longitude-latitude region has m × m vertical pixels, where n and m are integers greater than 1.
In this example, m is, for instance, 16, so each horizontal longitude-latitude region has 16 × 16 horizontal pixels and each vertical longitude-latitude region has 16 × 16 vertical pixels. If the resolution of both the horizontal longitude-latitude image and the vertical longitude-latitude image is 2048 × 1024, then from 2n × 16 = 2048 and n × 16 = 1024 it follows that n = 64 in this example.
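The region division of step S34 with m = 16 and a 2048 × 1024 longitude-latitude image can be sketched as follows; the reshape-based block layout is an implementation choice for illustration, not the patent's, and the image contents are dummy data.

```python
import numpy as np

# Divide a 1024 x 2048 longitude-latitude image into n x 2n regions of
# m x m pixels each (step S34).
m = 16
height, width = 1024, 2048
n = height // m                                   # -> 64
img = np.zeros((height, width), dtype=np.uint8)   # placeholder image
regions = img.reshape(n, m, 2 * n, m)             # region (i, j) is regions[i, :, j, :]
print(n, regions.shape)                           # 64 (64, 16, 128, 16)
```

This yields 64 × 128 regions, matching the n = 64 derived in the text.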
Next, in step S35, for each horizontal longitude-latitude region, the computing unit 118 calculates the gray difference between time t1 and time t2 of each of the 16 × 16 horizontal pixels, obtaining 16 × 16 horizontal gray differences; it takes the maximum of the 16 × 16 horizontal gray differences as the horizontal maximum gray difference and calculates a horizontal threshold from the average of the 16 × 16 horizontal gray differences. Likewise, for each vertical longitude-latitude region, the computing unit 118 calculates the gray difference between time t1 and time t2 of each of the 16 × 16 vertical pixels, obtaining 16 × 16 vertical gray differences; it takes the maximum of the 16 × 16 vertical gray differences as the vertical maximum gray difference and calculates a vertical threshold from the average of the 16 × 16 vertical gray differences.
As described above, the gray value of each horizontal pixel is the gray value of the corresponding fisheye panoramic pixel; therefore, the gray difference of each horizontal pixel between time t1 and time t2 is the difference between the gray values of the corresponding fisheye panoramic pixel at time t1 and at time t2. Likewise, the gray value of each vertical pixel is the gray value of the corresponding fisheye panoramic pixel, so the gray difference of each vertical pixel between time t1 and time t2 is the difference between the gray values of the corresponding fisheye panoramic pixel at time t1 and at time t2.
For example, for one horizontal longitude-latitude region, the 16 × 16 horizontal gray differences are calculated as above, and the maximum of them is, for example, 125; i.e. the horizontal maximum gray difference of this horizontal longitude-latitude region is 125. Likewise, for one vertical longitude-latitude region, the 16 × 16 vertical gray differences are calculated, and the maximum is, for example, 125; i.e. the vertical maximum gray difference of this vertical longitude-latitude region is 125. The computing unit 118 obtains the horizontal threshold and the vertical threshold according to the following equation relation:
C1 = T + λ × A1
C2 = T + λ × A2
where C1 is the horizontal threshold, A1 is the average of the 16 × 16 horizontal gray differences, and T and λ are predetermined coefficients. In this example, T is, for example, 50 and λ is, for example, 2; the average A1 of the 16 × 16 horizontal gray differences is, for example, 7.8, so C1 is calculated as 65.6.
Likewise, C2 is the vertical threshold and A2 is the average of the 16 × 16 vertical gray differences. With T = 50, λ = 2 and A2 = 7.8, C2 is likewise calculated as 65.6.
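The threshold arithmetic with the example numbers can be checked as follows; the formula C = T + λ × A is reconstructed from this worked example (50 + 2 × 7.8 = 65.6), and the function name is an illustrative choice.

```python
# Dynamic threshold (reconstructed from the worked example):
# C = T + lam * A, where A is a region's mean gray difference.
def dynamic_threshold(mean_diff, T=50, lam=2.0):
    return T + lam * mean_diff

c1 = dynamic_threshold(7.8)   # horizontal threshold in the example
print(round(c1, 1))           # 65.6
```

Because A enters the threshold, brighter or noisier regions automatically require a larger maximum gray difference before being marked.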
In this way, the same calculation is carried out for each horizontal longitude-latitude region in the horizontal longitude-latitude image and each vertical longitude-latitude region in the vertical longitude-latitude image.
In step S36, for each horizontal longitude-latitude region, the comparing and marking unit 120 compares the horizontal maximum gray difference with the horizontal threshold; if the horizontal maximum gray difference is greater than the horizontal threshold, it marks the corresponding horizontal longitude-latitude region with the first symbol, otherwise with a second symbol different from the first symbol. For each vertical longitude-latitude region, the comparing and marking unit 120 compares the vertical maximum gray difference with the vertical threshold; if the vertical maximum gray difference is greater than the vertical threshold, it marks the corresponding vertical longitude-latitude region with the first symbol, otherwise with the second symbol.
For example, for the above horizontal longitude-latitude region, the comparison-and-marking unit 120 compares its horizontal maximum gray-scale difference value with the horizontal threshold; here the horizontal maximum gray-scale difference value is 125 and the horizontal threshold C1 is 65.6, i.e. the horizontal maximum gray-scale difference value is greater than the horizontal threshold, so the horizontal longitude-latitude region is marked with the first symbol. In this example, the first symbol is, for example, "1", and the second symbol is, for example, "0".
Likewise, for the above vertical longitude-latitude region, the comparison-and-marking unit 120 compares its vertical maximum gray-scale difference value with the vertical threshold; here the vertical maximum gray-scale difference value is 125 and the vertical threshold C2 is 65.6, i.e. the vertical maximum gray-scale difference value is greater than the vertical threshold, so the vertical longitude-latitude region is marked with the first symbol "1".
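The comparison in step S36 can be sketched as follows (hypothetical helper name; the symbols "1" and "0" follow the example in the text):

```python
def mark_region(max_diff, threshold):
    """Mark one longitude-latitude region with the first symbol "1" when its
    maximum gray-scale difference value exceeds the threshold, otherwise with
    the second symbol "0"."""
    return "1" if max_diff > threshold else "0"
```

With the worked numbers above, `mark_region(125, 65.6)` returns `"1"`.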
Then, in step S37, the sphere-pixel marking unit 122 marks each sphere pixel in the spherical image with either the first symbol "1" or the second symbol "0", according to the symbols with which each horizontal longitude-latitude region and each vertical longitude-latitude region have been marked.
Specifically, the sphere-pixel marking unit 122 marks each sphere pixel in the spherical image whose coordinate corresponds to a horizontal pixel in the horizontal longitude-latitude image with the symbol of the horizontal longitude-latitude region containing that horizontal pixel; marks each sphere pixel whose coordinate corresponds to a vertical pixel in the vertical longitude-latitude image with the symbol of the vertical longitude-latitude region containing that vertical pixel; and marks each sphere pixel whose coordinate corresponds both to a horizontal pixel in the horizontal longitude-latitude image and to a vertical pixel in the vertical longitude-latitude image with the symbol of the horizontal longitude-latitude region containing the horizontal pixel. In this way, each sphere pixel is marked with either the first symbol or the second symbol.
In this example, taking sphere pixel a_sphere as an example, the coordinate (268, 288) of horizontal pixel a_horizontal in the horizontal longitude-latitude image corresponds to sphere pixel a_sphere, and the coordinate (744, 288) of vertical pixel a_vertical in the vertical longitude-latitude image also corresponds to sphere pixel a_sphere; that is, sphere pixel a_sphere corresponds to both horizontal pixel a_horizontal and vertical pixel a_vertical. The sphere-pixel marking unit 122 therefore marks sphere pixel a_sphere with the symbol of the horizontal longitude-latitude region containing horizontal pixel a_horizontal. For example, if the horizontal longitude-latitude region containing horizontal pixel a_horizontal is marked with the first symbol "1", the sphere-pixel marking unit 122 marks sphere pixel a_sphere as "1".
In addition, if another sphere pixel a′_sphere corresponds, for example, only to horizontal pixel a′_horizontal in the horizontal longitude-latitude image and to no vertical pixel in the vertical longitude-latitude image, and the horizontal longitude-latitude region containing a′_horizontal is marked, for example, with the second symbol "0", then the sphere-pixel marking unit 122 marks sphere pixel a′_sphere as "0". If another sphere pixel a″_sphere corresponds, for example, only to vertical pixel a″_vertical in the vertical longitude-latitude image and to no horizontal pixel in the horizontal longitude-latitude image, and the vertical longitude-latitude region containing a″_vertical is marked, for example, with the first symbol "1", then the sphere-pixel marking unit 122 marks sphere pixel a″_sphere as "1".
In the manner described above, all sphere pixels of the spherical image can be marked.
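The marking rules above, under which the horizontal region's symbol takes precedence whenever the sphere pixel maps into the horizontal longitude-latitude image, can be sketched as follows (hypothetical helper name):

```python
def label_sphere_pixel(horizontal_symbol=None, vertical_symbol=None):
    """Label one sphere pixel: if it corresponds to a horizontal pixel
    (whether or not it also corresponds to a vertical pixel), use the symbol
    of that horizontal pixel's region; otherwise fall back to the symbol of
    the vertical pixel's region. None means no corresponding pixel exists."""
    return horizontal_symbol if horizontal_symbol is not None else vertical_symbol
```

For sphere pixel a_sphere above, which corresponds to both a horizontal and a vertical pixel, the horizontal region's symbol is used.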
In step S38, the moving-target determination unit 124 determines the moving target from the sphere pixels marked with symbol "1" in the spherical image.
The moving-target determination unit 124 connects the sphere pixels marked with symbol "1" in the spherical image to form a plurality of marked regions, and determines the marked region with the largest area to be the moving target.
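The grouping step can be sketched as a simple connected-component pass. This is a minimal sketch with a hypothetical function name; the text does not specify the connectivity, so 4-connectivity is assumed here.

```python
import numpy as np

def largest_marked_region(mask):
    """Given a binary mask of sphere pixels marked "1", group connected
    pixels (4-connectivity, via flood fill) and return the mask of the
    largest connected region, taken as the moving target."""
    mask = np.asarray(mask, dtype=bool)
    labels = np.zeros(mask.shape, dtype=int)
    current, best_label, best_size = 0, 0, 0
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                # Start a new region and flood-fill it.
                current += 1
                stack, size = [(i, j)], 0
                labels[i, j] = current
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = current
                            stack.append((ny, nx))
                if size > best_size:
                    best_label, best_size = current, size
    if best_size == 0:
        return np.zeros(mask.shape, dtype=bool)
    return labels == best_label
```

In practice a library routine such as `scipy.ndimage.label` would typically replace the hand-written flood fill.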
After the moving target has been determined, the user can adjust the playback angle of the panoramic image so that the moving target to be tracked is shown in the middle of the display device 12, which makes it convenient for the user to observe it.
The present invention improves on the traditional latitude-longitude unwrapping algorithm: by performing latitude-longitude unwrapping simultaneously from both the horizontal and the vertical viewpoint, it improves the detection accuracy for moving objects at the top and bottom of the image. By setting the thresholds dynamically from the gray values of the image pixels, it also reduces false detections and missed detections.
Although particular embodiments of the present invention have been described, these embodiments are presented by way of example only and are not intended to limit the scope of the invention. Indeed, the novel methods described herein may be embodied in various other forms; furthermore, various omissions, substitutions, and changes may be made to the methods and systems described herein without departing from the spirit of the invention. The appended claims and their equivalents are intended to cover such forms and modifications as fall within the scope and spirit of the invention.

Claims (17)

1. A determination method for determining a moving target within a target range, characterized in that the determination method comprises the following steps:
a) shooting the target range with a panoramic camera to obtain a fisheye panoramic image of the target range and the gray values of each fisheye panoramic pixel in the fisheye panoramic image at time t1 and at time t2;
b) converting the coordinates of each fisheye panoramic pixel in the fisheye panoramic image into spherical coordinates, as corresponding sphere pixels, to form a spherical image;
c) converting the spherical coordinates of at least a portion of the sphere pixels in the spherical image into horizontal longitude-latitude coordinates and vertical longitude-latitude coordinates respectively, as corresponding horizontal pixels and corresponding vertical pixels, to form a horizontal longitude-latitude image and a vertical longitude-latitude image respectively;
d) dividing the horizontal longitude-latitude image and the vertical longitude-latitude image into a plurality of horizontal longitude-latitude regions and a plurality of vertical longitude-latitude regions respectively, each horizontal longitude-latitude region of the plurality of horizontal longitude-latitude regions having a plurality of horizontal pixels, and each vertical longitude-latitude region of the plurality of vertical longitude-latitude regions having a plurality of vertical pixels;
e) for each horizontal longitude-latitude region, calculating the gray-scale difference value between time t1 and time t2 for each horizontal pixel of the plurality of horizontal pixels to obtain a plurality of horizontal gray-scale difference values, taking the maximum of the plurality of horizontal gray-scale difference values as the horizontal maximum gray-scale difference value, and calculating a horizontal threshold from the average of the plurality of horizontal gray-scale difference values, and
for each vertical longitude-latitude region, calculating the gray-scale difference value between time t1 and time t2 for each vertical pixel of the plurality of vertical pixels to obtain a plurality of vertical gray-scale difference values, taking the maximum of the plurality of vertical gray-scale difference values as the vertical maximum gray-scale difference value, and calculating a vertical threshold from the average of the plurality of vertical gray-scale difference values;
f) for each horizontal longitude-latitude region, comparing the horizontal maximum gray-scale difference value with the horizontal threshold, and if the horizontal maximum gray-scale difference value is greater than the horizontal threshold, marking the corresponding horizontal longitude-latitude region with a first symbol, otherwise with a second symbol different from the first symbol, and
for each vertical longitude-latitude region, comparing the vertical maximum gray-scale difference value with the vertical threshold, and if the vertical maximum gray-scale difference value is greater than the vertical threshold, marking the corresponding vertical longitude-latitude region with the first symbol, otherwise with the second symbol;
g) marking each sphere pixel in the spherical image with either the first symbol or the second symbol according to the symbols with which each horizontal longitude-latitude region and each vertical longitude-latitude region are marked;
h) determining the sphere pixels marked with the first symbol in the spherical image to be the moving target.
2. The determination method as claimed in claim 1, characterized in that the panoramic camera has two fisheye lenses, so that two partial fisheye panoramic images are formed in the fisheye panoramic image, the two partial fisheye panoramic images corresponding respectively to two hemispherical images in the spherical image.
3. The determination method as claimed in claim 2, characterized in that in step b), the coordinate of the a-th fisheye panoramic pixel in the fisheye panoramic image is (x_a, y_a), and the coordinate (x_a, y_a) is converted to obtain the spherical coordinate (r_sphere, θ_a, φ_a) of the corresponding sphere pixel a_sphere,
wherein 1 ≤ a ≤ b, b is the number of fisheye panoramic pixels, a and b are integers, r_sphere is the spherical radius of the spherical image, determined by the pixel resolution of the fisheye panoramic image, θ_a is the elevation angle of sphere pixel a_sphere, and φ_a is the azimuth angle of sphere pixel a_sphere.
4. The determination method as claimed in claim 3, characterized in that in step c), the sphere pixels whose elevation angle θ lies within a predetermined angular range are used as the at least a portion of the sphere pixels.
5. The determination method as claimed in claim 4, characterized in that in step c), the horizontal longitude-latitude coordinate (x_a_horizontal, y_a_horizontal) of the corresponding horizontal pixel a_horizontal is obtained by conversion through formula relation 1 below,
and the vertical longitude-latitude coordinate (x_a_vertical, y_a_vertical) of the corresponding vertical pixel a_vertical is obtained by conversion through formula relation 2 below,
6. The determination method as claimed in claim 5, characterized in that in step d), the plurality of horizontal longitude-latitude regions number n × 2n, the plurality of vertical longitude-latitude regions number n × 2n, the plurality of horizontal pixels number m × m, the plurality of vertical pixels number m × m, and n and m are integers greater than 1.
7. The determination method as claimed in claim 6, characterized in that in step e), the horizontal threshold and the vertical threshold are obtained according to formula relation 3 below,
wherein C1 is the horizontal threshold, C2 is the vertical threshold, A1 is the average of the m × m horizontal gray-scale difference values, A2 is the average of the m × m vertical gray-scale difference values, and T and λ are predetermined coefficients.
8. The determination method as claimed in any one of claims 1-7, characterized in that in step g), the sphere pixels in the spherical image corresponding to the coordinates of horizontal pixels in the horizontal longitude-latitude image are each marked with the symbol of the horizontal longitude-latitude region containing the horizontal pixel; the sphere pixels in the spherical image corresponding to the coordinates of vertical pixels in the vertical longitude-latitude image are each marked with the symbol of the vertical longitude-latitude region containing the vertical pixel; and the sphere pixels in the spherical image corresponding both to the coordinates of horizontal pixels in the horizontal longitude-latitude image and to the coordinates of vertical pixels in the vertical longitude-latitude image are each marked with the symbol of the horizontal longitude-latitude region containing the horizontal pixel, so that each sphere pixel is marked with either the first symbol or the second symbol.
9. A determining device for determining a moving target within a target range, characterized in that the determining device comprises:
an acquisition unit which, with the target range shot by a panoramic camera, obtains a fisheye panoramic image of the target range and the gray values of each fisheye panoramic pixel in the fisheye panoramic image at time t1 and at time t2;
a spherical-image conversion unit which converts the coordinates of each fisheye panoramic pixel in the fisheye panoramic image into spherical coordinates, as corresponding sphere pixels, to form a spherical image;
a longitude-latitude-image conversion unit which converts the spherical coordinates of at least a portion of the sphere pixels in the spherical image into horizontal longitude-latitude coordinates and vertical longitude-latitude coordinates respectively, as corresponding horizontal pixels and corresponding vertical pixels, to form a horizontal longitude-latitude image and a vertical longitude-latitude image respectively;
a division unit which divides the horizontal longitude-latitude image and the vertical longitude-latitude image into a plurality of horizontal longitude-latitude regions and a plurality of vertical longitude-latitude regions respectively, each horizontal longitude-latitude region of the plurality of horizontal longitude-latitude regions having a plurality of horizontal pixels, and each vertical longitude-latitude region of the plurality of vertical longitude-latitude regions having a plurality of vertical pixels;
a computing unit which, for each horizontal longitude-latitude region, calculates the gray-scale difference value between time t1 and time t2 for each horizontal pixel of the plurality of horizontal pixels to obtain a plurality of horizontal gray-scale difference values, takes the maximum of the plurality of horizontal gray-scale difference values as the horizontal maximum gray-scale difference value, and calculates a horizontal threshold from the average of the plurality of horizontal gray-scale difference values, and
which, for each vertical longitude-latitude region, calculates the gray-scale difference value between time t1 and time t2 for each vertical pixel of the plurality of vertical pixels to obtain a plurality of vertical gray-scale difference values, takes the maximum of the plurality of vertical gray-scale difference values as the vertical maximum gray-scale difference value, and calculates a vertical threshold from the average of the plurality of vertical gray-scale difference values;
a comparison-and-marking unit which, for each horizontal longitude-latitude region, compares the horizontal maximum gray-scale difference value with the horizontal threshold and, if the horizontal maximum gray-scale difference value is greater than the horizontal threshold, marks the corresponding horizontal longitude-latitude region with a first symbol, otherwise with a second symbol different from the first symbol, and
which, for each vertical longitude-latitude region, compares the vertical maximum gray-scale difference value with the vertical threshold and, if the vertical maximum gray-scale difference value is greater than the vertical threshold, marks the corresponding vertical longitude-latitude region with the first symbol, otherwise with the second symbol;
a sphere-pixel marking unit which marks each sphere pixel in the spherical image with either the first symbol or the second symbol according to the symbols with which each horizontal longitude-latitude region and each vertical longitude-latitude region are marked; and
a moving-target determination unit which determines the sphere pixels marked with the first symbol in the spherical image to be the moving target.
10. The determining device as claimed in claim 9, characterized in that the panoramic camera has two fisheye lenses, so that two partial fisheye panoramic images are formed in the fisheye panoramic image, the two partial fisheye panoramic images corresponding respectively to two hemispherical images in the spherical image.
11. The determining device as claimed in claim 10, characterized in that the coordinate of the a-th fisheye panoramic pixel in the fisheye panoramic image is (x_a, y_a), and the spherical-image conversion unit converts the coordinate (x_a, y_a) to obtain the spherical coordinate (r_sphere, θ_a, φ_a) of the corresponding sphere pixel a_sphere,
wherein 1 ≤ a ≤ b, b is the number of fisheye panoramic pixels, a and b are integers, r_sphere is the spherical radius of the spherical image, determined by the pixel resolution of the fisheye panoramic image, θ_a is the elevation angle of sphere pixel a_sphere, and φ_a is the azimuth angle of sphere pixel a_sphere.
12. The determining device as claimed in claim 11, characterized in that the longitude-latitude-image conversion unit uses the sphere pixels whose elevation angle θ lies within a predetermined angular range as the at least a portion of the sphere pixels.
13. The determining device as claimed in claim 12, characterized in that the longitude-latitude-image conversion unit obtains the horizontal longitude-latitude coordinate (x_a_horizontal, y_a_horizontal) of the corresponding horizontal pixel a_horizontal by conversion through formula relation 1 below,
and obtains the vertical longitude-latitude coordinate (x_a_vertical, y_a_vertical) of the corresponding vertical pixel a_vertical by conversion through formula relation 2 below,
14. The determining device as claimed in claim 13, characterized in that the plurality of horizontal longitude-latitude regions number n × 2n, the plurality of vertical longitude-latitude regions number n × 2n, the plurality of horizontal pixels number m × m, the plurality of vertical pixels number m × m, and n and m are integers greater than 1.
15. The determining device as claimed in claim 14, characterized in that the computing unit obtains the horizontal threshold and the vertical threshold according to formula relation 3 below,
wherein C1 is the horizontal threshold, C2 is the vertical threshold, A1 is the average of the m × m horizontal gray-scale difference values, A2 is the average of the m × m vertical gray-scale difference values, and T and λ are predetermined coefficients.
16. The determining device as claimed in any one of claims 9-15, characterized in that the sphere-pixel marking unit marks the sphere pixels in the spherical image corresponding to the coordinates of horizontal pixels in the horizontal longitude-latitude image each with the symbol of the horizontal longitude-latitude region containing the horizontal pixel; marks the sphere pixels in the spherical image corresponding to the coordinates of vertical pixels in the vertical longitude-latitude image each with the symbol of the vertical longitude-latitude region containing the vertical pixel; and marks the sphere pixels in the spherical image corresponding both to the coordinates of horizontal pixels in the horizontal longitude-latitude image and to the coordinates of vertical pixels in the vertical longitude-latitude image each with the symbol of the horizontal longitude-latitude region containing the horizontal pixel, so that each sphere pixel is marked with either the first symbol or the second symbol.
17. A system for determining a moving target within a target range, characterized in that the system comprises:
a panoramic camera which shoots the target range;
the determining device as claimed in any one of claims 9-16, which determines the moving target from the fisheye panoramic image of the target range shot by the panoramic camera; and
a display device which displays the moving target determined by the determining device.
CN201610130964.7A 2016-03-08 2016-03-08 Method, device and system for determining a moving object in an object range Active CN107169918B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610130964.7A CN107169918B (en) 2016-03-08 2016-03-08 Method, device and system for determining a moving object in an object range


Publications (2)

Publication Number Publication Date
CN107169918A true CN107169918A (en) 2017-09-15
CN107169918B CN107169918B (en) 2020-08-04

Family

ID=59848670


Country Status (1)

Country Link
CN (1) CN107169918B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220194435A1 (en) * 2020-12-21 2022-06-23 Move-X Autonomous Driving Technology Co., Ltd. Unmanned logistics vehicle, transaction system and method


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2530187C (en) * 2003-07-03 2010-12-07 Physical Optics Corporation Panoramic video system with real-time distortion-free imaging
CN101938605A (en) * 2009-06-30 2011-01-05 爱国者全景(北京)网络科技发展有限公司 Method for generating panoramic video
US20150312498A1 (en) * 2014-04-28 2015-10-29 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
CN105049673A (en) * 2014-04-28 2015-11-11 佳能株式会社 Image processing apparatus, and image processing method
CN105184809A (en) * 2014-05-26 2015-12-23 富士通株式会社 Moving object detection method and moving object detection device
CN105335748A (en) * 2014-08-07 2016-02-17 株式会社理光 Image feature extraction method and system
CN104835117A (en) * 2015-05-11 2015-08-12 合肥工业大学 Spherical panorama generating method based on overlapping way


Also Published As

Publication number Publication date
CN107169918B (en) 2020-08-04

Similar Documents

Publication Publication Date Title
US10332237B2 (en) Equatorial stitching of hemispherical images in a spherical image capture system
TWI397317B (en) Method for providing output image in either cylindrical mode or perspective mode
CN107749268B (en) Screen detection method and equipment
CN111750820B (en) Image positioning method and system
US8005264B2 (en) Method of automatically detecting and tracking successive frames in a region of interesting by an electronic imaging device
US9940717B2 (en) Method and system of geometric camera self-calibration quality assessment
CN106357976A (en) Omni-directional panoramic image generating method and device
CN110211043A (en) A kind of method for registering based on grid optimization for Panorama Mosaic
CN106530313B (en) A kind of real-time detection method for sea sky based on region segmentation
CN108230397A (en) Multi-lens camera is demarcated and bearing calibration and device, equipment, program and medium
CN106033614B (en) A kind of mobile camera motion object detection method under strong parallax
US20130070094A1 (en) Automatic registration of multi-projector dome images
CN110268712A (en) Method and apparatus for handling image attributes figure
WO2021017589A1 (en) Image fusion method based on gradient domain mapping
US11146727B2 (en) Method and device for generating a panoramic image
US20140009570A1 (en) Systems and methods for capture and display of flex-focus panoramas
CN107009962B (en) A kind of panorama observation method based on gesture recognition
CN107038758A (en) A kind of augmented reality three-dimensional registration method based on ORB operators
CN106127115A (en) A kind of mixing sensation target localization method based on panorama and conventional visual
JP6585668B2 (en) Object detection device
CN107169918A (en) For determining the mobile mesh calibration method in target zone, device and system
US20190005675A1 (en) Methods and Apparatus for Tracking A Light Source In An Environment Surrounding A Device
JP2008005110A5 (en)
CN104937608B (en) Road area detection
Li et al. A hybrid pose tracking approach for handheld augmented reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant