CN115631451A - Queuing distance limiting prompting method and device, storage medium and electronic equipment - Google Patents

Info

Publication number
CN115631451A
CN115631451A (application number CN202211276927.9A)
Authority
CN
China
Prior art keywords
image
pixel point
filtered
distance
queuing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211276927.9A
Other languages
Chinese (zh)
Inventor
闫传为
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bank of China Ltd
Original Assignee
Bank of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bank of China Ltd filed Critical Bank of China Ltd
Priority to CN202211276927.9A
Publication of CN115631451A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/90 - Determination of colour characteristics
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/97 - Determining parameters from multiple pictures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/36 - Applying a local operator, i.e. means to operate on image points situated in the vicinity of a given point; Non-linear local filtering operations, e.g. median filtering
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 - Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V10/7515 - Shifting the patterns to accommodate for positional errors
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 - Status alarms
    • G08B21/182 - Level alarms, e.g. alarms responsive to variables exceeding a threshold
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 - Status alarms
    • G08B21/24 - Reminder alarms, e.g. anti-loss alarms

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Emergency Management (AREA)
  • Business, Economics & Management (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Nonlinear Science (AREA)
  • Image Processing (AREA)

Abstract

The application provides a queuing distance limiting prompting method and device, a storage medium and an electronic device, which can be applied to the mobile internet field or the finance field. The scheme calculates the queuing distance of a queue and, when the distance meets a distance early warning rule, promptly reminds users to keep their distance, thereby preventing the information of the person withdrawing money from being leaked.

Description

Queuing distance limiting prompting method and device, storage medium and electronic equipment
Technical Field
The present application relates to the field of computer applications, and in particular, to a queuing distance-limiting prompting method and apparatus, a storage medium, and an electronic device.
Background
At present, withdrawing money at a self-service teller machine is the choice of most people, but an overly short queuing distance during withdrawal carries the risk that the withdrawer's information is leaked. It is therefore necessary to provide a technical scheme for calculating the queuing distance of a queue, so that when the distance meets a distance early warning rule, users are promptly reminded to keep their distance and leakage of the withdrawer's information is avoided.
Disclosure of Invention
The application provides a queuing distance-limiting prompting method and device, a storage medium and electronic equipment, and aims to realize the calculation of the queuing distance of a queue, so that when the distance meets a distance early warning rule, the user is promptly reminded to keep their distance, thereby avoiding leakage of the withdrawer's information.
In order to achieve the purpose, the invention provides the following technical scheme:
a queuing distance-limiting prompting method comprises the following steps:
acquiring a left image obtained by shooting the left side of a queuing team and a right image obtained by shooting the right side of the queuing team;
performing matching cost calculation on the left image and the right image to obtain a first image to be filtered;
performing guided filtering processing on the first image to be filtered to obtain a first filtered image;
calculating an initial disparity value of the first filtered image and calculating a hamming distance between the left image and the right image;
performing cost recombination on the initial parallax value of the first filtering image and the Hamming distance to obtain a second image to be filtered;
performing the guided filtering processing on the second image to be filtered to obtain a second filtered image;
filling holes in the second filtered image, and performing weighted median filtering on the second filtered image after the holes are filled to obtain a target image;
calculating the queuing distance of the queuing team through the target image;
if the queuing distance of the queuing team is determined to meet the distance early warning rule, outputting prompt information; the prompt message is used for prompting the user to keep the queuing distance.
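The claimed steps can be read as one processing pipeline. The sketch below wires them together on toy 2-D lists; every helper is a deliberately trivial stand-in for the corresponding step, and all function names are illustrative assumptions rather than anything given in the disclosure:

```python
# Minimal pipeline sketch; each helper is a simplified stand-in for the
# corresponding claimed step, and all names are illustrative assumptions.

def matching_cost(left, right):
    # stand-in: per-pixel absolute difference as the "first image to be filtered"
    return [[abs(a - b) for a, b in zip(lr, rr)] for lr, rr in zip(left, right)]

def guided_filter(img):
    # stand-in: the real step smooths the image while preserving edges
    return [row[:] for row in img]

def cost_recombination(disparity, hamming):
    # stand-in: the real step fuses the initial disparity with Hamming distances
    return [[d + h for d, h in zip(dr, hr)] for dr, hr in zip(disparity, hamming)]

def fill_holes_then_weighted_median(img):
    # stand-in for hole filling followed by weighted median filtering
    return [row[:] for row in img]

def pipeline(left, right):
    first = guided_filter(matching_cost(left, right))               # steps 2-3
    disparity = first                                               # step 4a (stand-in)
    hamming = matching_cost(left, right)                            # step 4b (stand-in)
    second = guided_filter(cost_recombination(disparity, hamming))  # steps 5-6
    return fill_holes_then_weighted_median(second)                  # step 7: target image

target = pipeline([[10, 12], [14, 16]], [[9, 10], [13, 12]])
print(target)  # prints [[2, 4], [2, 8]]
```

From the target image, the remaining steps (distance calculation and the early warning rule) decide whether to output the prompt information.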
Optionally, in the method, the performing matching cost calculation on the left image and the right image to obtain a first image to be filtered includes:
respectively acquiring color information of each pixel point in the left image and the right image based on an RGB color space;
respectively acquiring color information of each pixel point in the left image and the right image based on a CIELAB color space;
calculating the matching cost of each pixel point in the first image based on the color information based on the RGB color space of each pixel point in the left side image and the right side image and the color information based on the CIELAB color space of each pixel point in the left side image and the right side image; the first image is the left image or the right image;
respectively acquiring gradient information of each pixel point in the left image and the right image;
calculating the matching cost of each pixel point in the first image based on the gradient information according to the gradient information of each pixel point in the left image and the right image;
calculating the fusion cost of each pixel point in the first image based on the matching cost of each pixel point in the first image based on color information and the matching cost of each pixel point in the first image based on gradient information;
aiming at each pixel point in the first image, taking the fusion cost of the pixel point as the pixel value of the pixel point;
and constructing a first image to be filtered based on the pixel value of each pixel point in the first image.
The method mentioned above, optionally, the calculating an initial disparity value of the first filtered image includes:
calculating an initial parallax value of each pixel point in the first filtered image through a parallax value calculation formula based on the pixel value and the position information of each pixel point in the first filtered image;
the calculating a hamming distance between the left image and the right image comprises:
performing stereo matching census transformation on the left image and the right image;
and calculating the Hamming distance between each pixel point in the left image and the corresponding pixel point in the right image based on the left image and the right image transformed by census.
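As a concrete illustration of the census transform and Hamming distance computation described above, the sketch below applies a 3x3 census transform to two tiny grayscale images and computes the per-pixel Hamming distance; the window size and bit ordering are assumptions for illustration, not details fixed by the patent:

```python
def census_3x3(img):
    """3x3 census transform: each interior pixel becomes an 8-bit string,
    one bit per neighbour (1 if the neighbour is smaller than the centre)."""
    h, w = len(img), len(img[0])
    out = [["" for _ in range(w)] for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = img[y][x]
            bits = ""
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    if dy == 0 and dx == 0:
                        continue
                    bits += "1" if img[y + dy][x + dx] < c else "0"
            out[y][x] = bits
    return out

def hamming(a, b):
    """Per-pixel Hamming distance between two census-transformed images."""
    return [[sum(x != y for x, y in zip(sa, sb))
             for sa, sb in zip(ra, rb)] for ra, rb in zip(a, b)]

left = [[5, 5, 5], [5, 9, 5], [5, 5, 5]]
right = [[5, 5, 5], [5, 1, 5], [5, 5, 5]]
d = hamming(census_3x3(left), census_3x3(right))
print(d[1][1])  # prints 8: all 8 census bits of the centre pixel differ
```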
Optionally, in the method, the performing cost recombination on the initial disparity value of the first filtered image and the hamming distance to obtain a second image to be filtered includes:
calculating the fusion cost of each pixel point in the first filtering image based on the initial parallax value of each pixel point in the first filtering image and the Hamming distance between each pixel point in the left image and the corresponding pixel point in the right image;
aiming at each pixel point in the first filtering image, taking the fusion cost of the pixel point as the pixel value of the pixel point;
and constructing a second image to be filtered based on the pixel value of each pixel point in the first filtered image.
In the foregoing method, optionally, the performing guided filtering processing on the first image to be filtered to obtain a first filtered image includes:
constructing a traditional guide filtering energy function based on the first image to be filtered and a preset guide image, and constructing a target guide filtering energy function based on the traditional guide filtering energy function and an edge holding term; the target guide filtering energy function comprises linear coefficients;
solving the target guide filtering energy function to obtain a solution value of a linear coefficient;
and obtaining a first filtering image based on the solving value of the linear coefficient and the guide image.
Optionally, the above method, where the hole filling is performed on the second filtered image, includes:
calculating a target disparity value of each pixel point in the second filtered image based on the pixel value of each pixel point in the second filtered image;
identifying an error pixel point in the second filtered image based on the target disparity value of each pixel point in the second filtered image;
and filling holes in the identified error pixel points.
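A minimal sketch of the hole-filling step: error pixel points (marked here with disparity 0) are filled from the nearest valid value in the same row. This neighbour-propagation strategy is a common choice assumed for illustration; the patent does not specify the filling rule at this point:

```python
def fill_holes(disparity, invalid=0):
    """Replace invalid (error) disparities with the nearest valid value
    to the left, falling back to the nearest valid value to the right."""
    out = []
    for row in disparity:
        filled = row[:]
        # left-to-right pass: propagate the last valid value
        last = None
        for i, v in enumerate(filled):
            if v != invalid:
                last = v
            elif last is not None:
                filled[i] = last
        # right-to-left pass to fill any leading holes
        last = None
        for i in range(len(filled) - 1, -1, -1):
            if filled[i] != invalid:
                last = filled[i]
            elif last is not None:
                filled[i] = last
        out.append(filled)
    return out

print(fill_holes([[0, 3, 0, 0, 7]]))  # prints [[3, 3, 3, 3, 7]]
```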
Optionally, the above method, wherein calculating a queuing distance of the queuing team through the target image includes:
aiming at each pixel point in the target image, calculating the queuing distance of the pixel point through a distance calculation formula based on the pixel value of the pixel point;
the distance calculation formula is as follows:
Figure BDA0003895708020000041
wherein Z is the queuing distance of the pixel point, b is the preset baseline length, f is the focal length of the camera device, and d is the pixel value of the pixel point;
the queuing distance of each pixel point is formed into the queuing distance of the queuing team;
the determining that the queuing distance of the queuing team meets the distance early warning rule comprises:
aiming at each pixel point in the target image, if the queuing distance of the pixel point is greater than a preset distance threshold, determining the pixel point as a target pixel point;
and if the ratio of the number of target pixel points to the total number of pixel points in the target image is greater than a preset ratio threshold, determining that the queuing distance of the queue meets the distance early warning rule.
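The distance calculation formula and the early warning rule above can be illustrated numerically. The baseline, focal length and thresholds below are made-up example values:

```python
def queuing_distances(target, b, f):
    """Z = b * f / d for each pixel value d (disparity) in the target image."""
    return [[b * f / d for d in row] for row in target]

def meets_warning_rule(target, b, f, dist_threshold, ratio_threshold):
    """A pixel whose queuing distance exceeds dist_threshold is a target
    pixel; the rule is met when the ratio of target pixels to all pixels
    exceeds ratio_threshold."""
    flat = [z for row in queuing_distances(target, b, f) for z in row]
    ratio = sum(1 for z in flat if z > dist_threshold) / len(flat)
    return ratio > ratio_threshold

# made-up example: baseline 0.125 m, focal length 800 px
target = [[4.0, 8.0], [6.0, 2.0]]
print(meets_warning_rule(target, b=0.125, f=800.0,
                         dist_threshold=15.0, ratio_threshold=0.5))  # prints True
```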
A queuing distance limiting prompting device comprises:
an acquisition unit configured to acquire a left image obtained by imaging a left side of a queuing group and a right image obtained by imaging a right side of the queuing group;
the cost calculation unit is used for performing matching cost calculation on the left image and the right image to obtain a first image to be filtered;
the first filtering unit is used for conducting guiding filtering processing on the first image to be filtered to obtain a first filtered image;
a first calculation unit for calculating an initial disparity value of the first filtered image and calculating a hamming distance between the left image and the right image;
the cost recombination unit is used for performing cost recombination on the initial parallax value of the first filtering image and the Hamming distance to obtain a second image to be filtered;
the second filtering unit is used for conducting the guided filtering processing on the second image to be filtered to obtain a second filtered image;
the hole filling unit is used for performing hole filling on the second filtered image, and performing weighted median filtering on the hole-filled second filtered image to obtain a target image;
the second calculation unit is used for calculating the queuing distance of the queuing team through the target image;
the prompting unit is used for outputting prompting information if the queuing distance of the queuing team is determined to meet the distance early warning rule; the prompt message is used for prompting the user to keep the queuing distance.
A storage medium storing a set of instructions, wherein the set of instructions, when executed by a processor, implement the queuing distance-limiting prompting method described above.
An electronic device, comprising:
a memory for storing at least one set of instructions;
and the processor is used for executing the instruction set stored in the memory and realizing the queuing distance-limiting prompting method by executing the instruction set.
Compared with the prior art, the method has the following advantages:
the application provides a queuing distance-limiting prompting method and device, a storage medium and electronic equipment, wherein a first image to be filtered is obtained by performing matched cost calculation on a left image and a right image, the first image to be filtered is subjected to guided filtering processing to obtain a first filtered image, an initial parallax value of the first filtered image is calculated, a Hamming distance between the left image and the right image is calculated, cost recombination is performed on the initial parallax value and the Hamming distance of the first filtered image to obtain a second image to be filtered, the guided filtering processing is performed on the second image to be filtered to obtain a second filtered image, cavity filling is performed on the second filtered image, weighted median filtering is performed on the second filtered image after the cavity filling to obtain a target image, the queuing distance of a queue is calculated through the target image, and if the queuing distance of the queue is determined to meet a distance early warning rule, prompting information is output; the prompt message is used for prompting the user to keep the queuing distance. Therefore, according to the scheme, the queuing distance calculation of the queuing team is achieved, so that when the distance meets the distance early warning rule, a user is prompted to keep the distance in time, information leakage of a money drawing person is avoided, and the distance calculation precision is improved by conducting secondary guide filtering processing on the image.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, it is obvious that the drawings in the following description are only embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
Fig. 1 is a flowchart of a method of a queuing distance-limiting prompting method provided in the present application;
fig. 2 is a flowchart of another method of a queuing distance limit prompting method provided in the present application;
fig. 3 is an exemplary diagram of a queuing distance limit prompting method provided in the present application;
fig. 4 is a flowchart of another method of a queuing distance limit prompting method provided in the present application;
fig. 5 is a diagram illustrating another example of a queuing distance limit prompting method provided in the present application;
fig. 6 is a flowchart of another method of a queuing distance-limiting prompting method provided in the present application;
fig. 7 is a diagram illustrating another example of a queuing distance limit prompting method provided in the present application;
fig. 8 is a diagram illustrating another example of a queuing distance-limiting prompting method provided in the present application;
FIG. 9 is a flowchart of another method of a queuing distance limit prompting method provided by the present application;
FIG. 10 is a diagram illustrating another example of a queuing distance limit prompting method provided in the present application;
fig. 11 is a diagram illustrating a queuing distance limit prompting method according to another embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of a queuing distance limit prompting device provided in the present application;
fig. 13 is a schematic structural diagram of an electronic device provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the disclosure of the present application are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence of the functions performed by the devices, modules or units.
It is noted that references to "a" or "an" in the disclosure are intended to be illustrative rather than limiting, and those skilled in the art will understand that they should be read as "one or more" unless the context clearly dictates otherwise.
The application is operational with numerous general purpose or special purpose computing device environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet-type devices, multi-processor apparatus, distributed computing environments that include any of the above devices or equipment, and the like.
The queuing distance-limiting prompting method and device, the storage medium and the electronic equipment can be used in the field of mobile internet or the field of finance. The above description is only an example, and does not limit the application fields of the queuing distance limit prompting method and apparatus, the storage medium, and the electronic device provided in the present application.
In the present embodiment, for convenience of understanding, the terms related to the present application are described as follows:
the guide filtering means to perform smoothing processing on the obtained image, that is, to remove unimportant noise information in the image, and to better retain the edge information of the image.
The dual color space differs from the traditional feature extraction method based on pixels in the RGB space: features are extracted from both the RGB and CIELAB color spaces, converting three-dimensional features into six-dimensional features.
Referring to fig. 1, a flow chart of a method of a queuing distance-limiting prompting method is shown in fig. 1, and specifically includes:
s101, acquiring a left image obtained by shooting the left side of the queue and a right image obtained by shooting the right side of the queue.
In the embodiment, a left image obtained by imaging the left side of the queue by the imaging device and a right image obtained by imaging the right side of the queue by the imaging device are obtained.
Wherein the camera device may be a camera.
Note that the left image and the right image are images based on an RGB color space.
It should be noted that, for each pixel point in the left image, the right image has a corresponding pixel point, that is, the pixel point in the left image corresponds to the pixel point in the right image one to one.
And S102, performing matching cost calculation on the left image and the right image to obtain a first image to be filtered.
In this embodiment, matching cost calculation is performed on the left image and the right image, and the first image to be filtered is constructed based on the result of the matching cost calculation.
Referring to fig. 2, the process of calculating the matching cost of the left image and the right image to obtain the first image to be filtered specifically includes the following steps:
s201, color information of each pixel point in the left image and the right image based on the RGB color space is obtained respectively.
In this embodiment, color information of each pixel point in the left image and the right image based on the RGB color space is obtained respectively, that is, color information of each pixel point in the left image based on the RGB color space is obtained, and color information of each pixel point in the right image based on the RGB color space is obtained.
Wherein, the RGB-based color information of each pixel point is the color information of each channel of the pixel point in the RGB color space, including the color information I_R of channel R, the color information I_G of channel G, and the color information I_B of channel B.
S202, color information of each pixel point in the left image and the right image based on a CIELAB color space is obtained respectively.
In this embodiment, color information of each pixel point in the left image and the right image based on the CIELAB color space is obtained, that is, the left image is converted from the RGB color space to the CIELAB color space, and the right image is converted from the RGB color space to the CIELAB color space.
In this embodiment, after the color space conversion is completed, color information of each pixel point in the left image based on the CIELAB color space is obtained, and color information of each pixel point in the right image based on the CIELAB color space is obtained.
Wherein, the CIELAB-based color information of each pixel point includes the color information I_L of channel L, the color information I_A of channel A, and the color information I_b of channel B.
S203, based on the color information of each pixel point in the left image and the right image based on the RGB color space and the color information of each pixel point in the left image and the right image based on the CIELAB color space, calculating the matching cost of each pixel point in the first image based on the color information.
In the embodiment, based on color information of each pixel point in the left image based on the RGB color space, color information of each pixel point in the right image based on the RGB color space, color information of each pixel point in the left image based on the CIELAB color space, and color information of each pixel point in the right image based on the CIELAB color space, the matching cost of each pixel point in the first image based on the color information is calculated through a first matching cost calculation formula; wherein the first image is a left image or a right image.
Wherein, the first matching cost calculation formula is:

C_color(p, d) = min[ (1/6) · ||C_left(p) - C_right(p - d)||_1 , T_AD ]

wherein C_color(p, d) represents the matching cost of the pixel point p based on color information, C_left(p) represents the color feature (i.e., color information) of the pixel point p in the left image, C_right(p - d) represents the color feature of the corresponding pixel point in the right image, d represents the disparity between the pixel point p in the left image and the right image, and T_AD represents the color truncation threshold.
It should be noted that the color feature of each pixel point can be represented as C = (I_R, I_G, I_B, I_L, I_A, I_b).
In the method provided by the embodiment of the application, by converting the color spaces of the left and right images from the RGB space to the CIELAB color space and extracting the color information of the L, A and B channels of the CIELAB color space, the color feature (i.e., color information) of each pixel point is extended from three dimensions to six dimensions. This enriches the color information of the images, alleviates the problem of low matching accuracy in weak-texture regions caused by the similar color-change trends of RGB images, and fully retains the color information of the original images (i.e., the left image and the right image).
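The six-dimensional colour feature C = (I_R, I_G, I_B, I_L, I_A, I_b) and the colour matching cost can be sketched as follows; the feature values are made-up numbers standing in for a real sRGB-to-CIELAB conversion:

```python
def color_cost(c_left, c_right, t_ad):
    """C_color = min((1/6) * L1 distance between six-dimensional colour
    features (R, G, B, L, A, B*), T_AD): a truncated absolute-difference cost."""
    l1 = sum(abs(a - b) for a, b in zip(c_left, c_right))
    return min(l1 / 6.0, t_ad)

# six-dimensional features C = (I_R, I_G, I_B, I_L, I_A, I_b);
# the numbers below are made up for illustration
c_left = (120, 80, 60, 55.0, 12.0, 20.0)
c_right = (110, 78, 61, 52.0, 11.0, 18.0)
print(color_cost(c_left, c_right, t_ad=10.0))
```

The truncation threshold T_AD caps the cost for grossly mismatched pixels so that occlusions do not dominate the cost volume.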
And S204, respectively obtaining the gradient information of each pixel point in the left image and the right image.
In this embodiment, the gradient information of each pixel point in the left image and the right image is obtained respectively, that is, the gradient information of each pixel point in the left image is obtained, and the gradient information of each pixel point in the right image is obtained. Specifically, the left image and the right image are respectively subjected to graying processing, the gradient information of each pixel point in the left image is extracted based on the grayed left image, and the gradient information of each pixel point in the right image is extracted based on the grayed right image.
S205, based on the gradient information of each pixel point in the left image and the right image, calculating the matching cost of each pixel point in the first image based on the gradient information.
In this embodiment, based on the gradient information of each pixel point in the left image and the gradient information of each pixel point in the right image, the matching cost of each pixel point in the first image based on the gradient information is calculated through the second matching cost calculation formula.
Wherein, the second matching cost calculation formula is:

C_grad(p, d) = min[ |∇I_left(p) - ∇I_right(p - d)| , T_grad ]

wherein C_grad(p, d) represents the matching cost of the pixel point p based on gradient information, ∇I_left(p) represents the gradient information of the pixel point p in the left image, ∇I_right(p - d) represents the gradient information of the corresponding pixel point in the right image, and T_grad represents the gradient truncation threshold.
S206, calculating the fusion cost of each pixel point in the first image according to the matching cost of each pixel point in the first image based on the color information and the matching cost of each pixel point in the first image based on the gradient information.
In this embodiment, the fusion cost of each pixel point in the first image is calculated through a first cost fusion calculation formula according to the matching cost of each pixel point in the first image based on the color information and the matching cost of each pixel point in the first image based on the gradient information.
Wherein, the first cost fusion calculation formula is:

g(p, d) = α · C_color(p, d) + (1 - α) · C_grad(p, d)

wherein g(p, d) represents the fusion cost of the pixel point p, and α is a weight used to control the relative contribution of color information and gradient information.
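A small numeric sketch of the gradient matching cost with truncation and of the cost fusion; the weight α and all values are made-up examples:

```python
def grad_cost(g_left, g_right, t_grad):
    """C_grad = min(|grad_left - grad_right|, T_grad): truncated gradient cost."""
    return min(abs(g_left - g_right), t_grad)

def fusion_cost(c_color, c_grad, alpha):
    """g(p, d) = alpha * C_color + (1 - alpha) * C_grad."""
    return alpha * c_color + (1 - alpha) * c_grad

c_color = 3.0
c_grad = grad_cost(14.0, 9.0, t_grad=4.0)       # |14 - 9| = 5, truncated to 4.0
print(fusion_cost(c_color, c_grad, alpha=0.6))  # 0.6*3.0 + 0.4*4.0 ≈ 3.4
```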
And S207, aiming at each pixel point in the first image, taking the fusion cost of the pixel point as the pixel value of the pixel point.
In this embodiment, for each pixel point in the first image, the fusion cost of the pixel point is used as the pixel value of the pixel point.
S208, constructing a first image to be filtered based on the pixel value of each pixel point in the first image.
In this embodiment, the first image to be filtered is constructed based on the pixel value of each pixel point in the first image, that is, the pixel value of each pixel point in the first image to be filtered is the fusion cost of the pixel point.
In the method described above, the color information and gradient information of the pixel points are extracted based on the dual color space and used to calculate the matching cost, which improves the matching accuracy in weak-texture regions.
For example, fig. 3 is a comparison diagram showing the effect of the matching cost calculation of the present application against other existing matching cost calculation methods: panel (a) of fig. 3 is the result of matching cost calculation based on color information in the RGB color space; panel (b) is the result based on gradient information in the RGB color space; panel (c) is the result based on both color information and gradient information in the RGB color space; and panel (d) is the result of the matching cost calculation of the present application, that is, based on color information and gradient information in both the RGB and CIELAB color spaces. As can be seen from fig. 3, the textures in the result obtained by the present application are richer.
S103, conducting guiding filtering processing on the first image to be filtered to obtain a first filtered image.
In this embodiment, the first image to be filtered is subjected to the guided filtering processing by introducing the edge-preserving term, so as to obtain the first filtered image.
Referring to fig. 4, the process of performing guided filtering processing on the first image to be filtered to obtain the first filtered image specifically includes the following steps:
s401, constructing a traditional guide filtering energy function based on a first image to be filtered and a preset guide image, and constructing a target guide filtering energy function based on the traditional guide filtering energy function and an edge holding term.
In this embodiment, a conventional guided filtering energy function is constructed based on the first image to be filtered and the preset guide image.
Wherein, the traditional guide filtering energy function is:
E(a_k, b_k) = \sum_{i \in w_k} \left[ (a_k I_i + b_k - g_i)^2 + \varepsilon a_k^2 \right]
wherein g is the first image to be filtered, I is the guide image, \varepsilon is a regularization parameter, and a_k and b_k are linear coefficients. Optionally, \varepsilon is greater than 0. It should be noted that the regularization parameter \varepsilon penalizes the linear coefficient a_k: the larger the value of \varepsilon, the stronger the penalty and the smoother the filtered image; the smaller the value of \varepsilon, the weaker the penalty and the more completely the edges of the filtered image are preserved.
In this embodiment, a target guide filtering energy function is constructed based on a conventional guide filtering energy function and an edge preservation term.
Wherein the edge preservation term is \Gamma_G(k), which can be expressed as:
\Delta(k, q) = \sqrt{(h_k - h_q)^2 + (s_k - s_q)^2 + (v_k - v_q)^2}
\Gamma_G(k) = \frac{1}{|w_k|} \sum_{q \in w_k} \Delta(k, q)
wherein w_k is a window centered on pixel point k, |w_k| is the number of pixel points in the window centered on pixel point k, q denotes the remaining pixel points in the window, and h, s and v are the three components of the HSV color space.
In the present embodiment, \Gamma_G(k) is defined according to the average color difference: the larger \Gamma_G(k) is, the richer the color and texture of the image are. Compared with the traditional variance-based regularization parameter \varepsilon in the RGB color space, the weighting strategy based on the HSV space can obtain higher matching precision. In the same scene, the color changes of the R, G and B channels of the image are similar, so the discrimination is not high and the information of the image cannot be effectively extracted, while the HSV color space extracts image information from the three aspects of hue, saturation and brightness and can therefore extract it more effectively. The H component mainly reflects the color change of the image and can effectively distinguish the foreground and background of the image; the S component highlights the image boundary and color changes; and the V component mainly reflects the brightness change in the image. The problem of repeated calculation in the RGB space is thus well avoided, and the matching precision is improved.
Exemplarily, referring to fig. 5, fig. 5 illustrates respective component images of the Tsukuba image in RGB space and HSV space, including an R component image, a G component image, a B component image, an H component image, an S component image, and a V component image.
Wherein the target guided filter energy function is:
E(a_k, b_k) = \sum_{i \in w_k} \left[ (a_k I_i + b_k - g_i)^2 + \frac{\varepsilon}{\Gamma_G(k)} a_k^2 \right]
It should be noted that the target guided filtering energy function includes the linear coefficients a_k and b_k.
S402, solving the target guide filtering energy function to obtain a solved value of the linear coefficient.
In this embodiment, a least square method is used to solve the target-guided filtering energy function, so as to obtain a solution value of the linear coefficient.
Wherein, the solved values of the linear coefficients are:
a_k = \frac{\frac{1}{|w_k|} \sum_{i \in w_k} I_i g_i - \mu_k \bar{g}_k}{\sigma_k^2 + \varepsilon / \Gamma_G(k)}
b_k = \bar{g}_k - a_k \mu_k
wherein \mu_k and \sigma_k^2 are the mean and variance of I in the window w_k, and \bar{g}_k is the mean of g in the window w_k.
And S403, obtaining a first filtering image based on the linear coefficient solving value and the guide image.
In this embodiment, based on the solved values of the linear coefficients and the guide image, the first filtered image is obtained through guided filtering. The first filtered image can be denoted q_i:
q_i = \bar{a}_i I_i + \bar{b}_i
wherein \bar{a}_i and \bar{b}_i are the means of a_k and b_k over all windows containing pixel point i.
In this embodiment, for a texture-rich window w_k containing edges, \Gamma_G(k) is large, so \varepsilon / \Gamma_G(k) is small and a_k is large; the filter is therefore sensitive to edge information and can preserve edges well. For a weakly textured, flat window w_k, \Gamma_G(k) is small, so \varepsilon / \Gamma_G(k) is large and a_k is small; the filter is therefore insensitive to edges and can smooth non-edge regions and remove noise well. Compared with the traditional guided filter, the improved guided filter can adapt to the filtering of each region of the image and obtain a better effect.
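Steps S401 to S403 can be sketched in NumPy as follows. This is an illustrative sketch, not the patent's implementation: the naive box-mean helper, the choice of the V channel as the scalar guide I, and the small constant added to \Gamma_G(k) to avoid division by zero are all assumptions.

```python
import numpy as np

def window_mean(a, r):
    """Naive (2r+1)x(2r+1) box mean with edge clamping; clarity over speed."""
    h, w = a.shape
    out = np.empty((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            out[i, j] = a[max(0, i - r):i + r + 1, max(0, j - r):j + r + 1].mean()
    return out

def edge_preserving_guided_filter(g, I_hsv, eps=0.01, r=1):
    """Sketch of the improved guided filter: Gamma_G(k) is the average HSV
    color difference inside each window, and it scales the penalty term
    eps / Gamma_G(k) in the linear-coefficient solution."""
    I = I_hsv[..., 2]                    # assumption: V channel as scalar guide
    h, w = g.shape
    gamma = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            win = I_hsv[max(0, i - r):i + r + 1, max(0, j - r):j + r + 1, :]
            diff = win - I_hsv[i, j]     # HSV difference to the window center
            gamma[i, j] = np.sqrt((diff ** 2).sum(axis=2)).mean() + 1e-6

    mu = window_mean(I, r)               # mean of guide in each window
    g_bar = window_mean(g, r)            # mean of input in each window
    corr = window_mean(I * g, r)
    var = window_mean(I * I, r) - mu ** 2
    a = (corr - mu * g_bar) / (var + eps / gamma)   # edge-adaptive penalty
    b = g_bar - a * mu
    return window_mean(a, r) * I + window_mean(b, r)
```

On a flat region the average color difference is near zero, the penalty blows up and a_k collapses to 0, so the output is simply the window mean of g, exactly the smoothing behavior described above.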
And S104, calculating an initial parallax value of the first filtered image, and calculating a Hamming distance between the left image and the right image.
In this embodiment, the initial parallax value of the first filtered image is calculated, that is, the initial parallax value of each pixel in the first filtered image is calculated, and specifically, the initial parallax value of each pixel in the first filtered image is calculated through a first parallax value calculation formula based on the pixel value and the position information of each pixel in the first filtered image.
Wherein, the first parallax value calculation formula is:
C_{new}(x, y, d) = |d - D_{(x,y)}|
wherein C_{new}(x, y, d) is the initial parallax value of the pixel point, x and y are the coordinates of the pixel point, d \in [0, d_{max}], d_{max} is the maximum parallax (i.e., the distance between a pixel point in the left image and the corresponding pixel point in the right image), and D_{(x,y)} is the pixel value of the pixel point in the first filtered image.
In this embodiment, a hamming distance between the left image and the right image is calculated, that is, a hamming distance between each pixel point in the left image and a corresponding pixel point in the right image is calculated, and specifically, stereo matching census transformation is performed on the left image and the right image; and calculating the Hamming distance between each pixel point in the left image and the corresponding pixel point in the right image based on the left image and the right image transformed by census.
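The census transform and Hamming distance of step S104 can be sketched as follows. This is an illustrative sketch under stated assumptions: the 3x3 census window, the strict `<` comparison with the center pixel, and the wrap-around border handling via `np.roll` are choices made here for brevity, not details given in the specification.

```python
import numpy as np

def census(img, r=1):
    """Census transform: for each pixel, one bit per neighbour in a
    (2r+1)x(2r+1) window, set when the neighbour is darker than the
    center.  Borders wrap around for brevity."""
    bits = []
    for di in range(-r, r + 1):
        for dj in range(-r, r + 1):
            if di == 0 and dj == 0:
                continue
            shifted = np.roll(np.roll(img, di, axis=0), dj, axis=1)
            bits.append((shifted < img).astype(np.uint8))
    return np.stack(bits, axis=2)        # h x w x (window size - 1) bit planes

def hamming(census_left, census_right, d=0):
    """Hamming distance between each left pixel and the right pixel at
    disparity d (right census image shifted horizontally by d)."""
    shifted = np.roll(census_right, d, axis=1)
    return (census_left != shifted).sum(axis=2)
```

Because census encodes only relative orderings inside the window, a global brightness shift leaves the bit strings unchanged, which is the illumination insensitivity exploited by the cost recombination below.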
And S105, performing cost recombination on the initial parallax value and the Hamming distance of the first filtering image to obtain a second image to be filtered.
In this embodiment, cost recombination is performed on the initial disparity value of the first filtered image and the calculated hamming distance, so that a second image to be filtered is obtained.
Referring to fig. 6, the process of performing cost recombination on the initial disparity value and the hamming distance of the first filtered image to obtain the second image to be filtered specifically includes the following steps:
s601, calculating the fusion cost of each pixel point in the first filtering image based on the initial parallax value of each pixel value in the first filtering image and the Hamming distance between each pixel point in the left image and the corresponding pixel point in the right image.
In this embodiment, based on the initial parallax value of each pixel point in the first filtered image and the hamming distance between each pixel point in the left image and the corresponding pixel point in the right image, the fusion cost of each pixel point in the first filtered image is calculated through the second cost fusion calculation formula.
Wherein, the second cost fusion calculation formula is:
C = C_{new} \times \exp(-Hamming_d)
wherein C is the fusion cost of the pixel point, and Hamming_d is the Hamming distance at disparity d.
According to the method provided by the embodiment of the application, the characteristic that census transformation is insensitive to illumination and noise is effectively utilized, the matching precision under the conditions of unbalanced illumination and noise is improved, and meanwhile, the image edge is further reserved.
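The cost recombination of step S601 is a direct application of the two formulas above; a minimal sketch, assuming the initial-cost image `D`, the candidate disparity `d`, and the per-pixel Hamming-distance map `ham` are already available:

```python
import numpy as np

def recombined_cost(D, d, ham):
    """Cost recombination: initial cost C_new = |d - D| modulated by
    exp(-Hamming distance), so pixels whose census strings disagree
    strongly are suppressed."""
    c_new = np.abs(d - D)           # first parallax value calculation formula
    return c_new * np.exp(-ham)     # second cost fusion calculation formula
```

A large Hamming distance drives the exponential toward zero, which is how the census term damps unreliable matches under illumination imbalance and noise.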
S602, aiming at each pixel point in the first filtering image, taking the fusion cost of the pixel point as the pixel value of the pixel point.
In this embodiment, for each pixel point in the first filtered image, the fusion cost of the pixel point is used as the pixel value of the pixel point.
S603, constructing a second image to be filtered based on the pixel value of each pixel point in the first filtered image.
In this embodiment, the second image to be filtered is constructed based on the pixel value of each pixel point in the first filtered image.
In order to verify the effectiveness of the second image to be filtered, the inventor conducts a simulation experiment, and effect graphs obtained by applying the method and other methods are shown in fig. 7, wherein the first column is an effect graph obtained by applying standard parallax, the second column is an effect graph obtained by applying an adaptive window, the third column is an effect graph obtained by applying Cost Filter, and the fourth column is an effect graph obtained by applying the method.
In order to make the effect map more obvious, detail enlargement processing is performed on part of the effect map in fig. 7, as shown in fig. 8. In conjunction with fig. 7 and 8, it can be seen that the method of the present application is significantly superior to other methods in both image edge retention and image smoothness.
And S106, conducting guiding filtering processing on the second image to be filtered to obtain a second filtered image.
In this embodiment, the guided filtering process is performed on the second image to be filtered to obtain a second filtered image, and for a specific implementation process, reference is made to step S103, which is not described herein again.
And S107, filling holes in the second filtered image, and performing weighted median filtering on the second filtered image after the holes are filled to obtain a target image.
In this embodiment, the hole filling is performed on the second filtered image, specifically, referring to fig. 9, the method includes the following steps:
s901, calculating a target parallax value of each pixel point in the second filtered image based on the pixel value of each pixel point in the second filtered image.
In this embodiment, based on the pixel value of each pixel point in the second filtered image, the target parallax value of each pixel point in the second filtered image is calculated through the second parallax calculation formula.
Wherein the second parallax calculation formula is:
d_L = \arg\min_d C
wherein d_L is the target disparity value of the pixel point.
S902, identifying error pixel points in the second filtered image based on the target parallax value of each pixel point in the second filtered image.
In this embodiment, the error pixel in the second filtered image is identified based on the target parallax value of each pixel in the second filtered image, specifically, for each pixel in the second filtered image, whether the target parallax value of the pixel is greater than a preset threshold is determined, if so, the pixel is determined to be an error point, and if not, the pixel is not determined to be an error point.
Referring to fig. 10, error points identified by applying the queuing distance-limiting prompting method of the present application and error points identified by applying the Cost Filter and the adaptive window are shown in fig. 10, where the black points are error points.
and S903, filling holes in the identified error pixel points.
In this embodiment, the hole filling is performed on all the identified error pixel points.
In this embodiment, after the hole filling of the error pixel point in the second filtered image is completed, the weighted median filtering is performed on the second filtered image after the hole filling, so as to obtain a target image.
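Steps S901 to S903 can be sketched as follows. Note the hedges: the excerpt states only that pixels whose target disparity exceeds a preset threshold are error points and that the holes are then filled, so the fill rule used here (copying the nearest non-error disparity in the same row) is an assumption standing in for whatever strategy the full specification prescribes.

```python
import numpy as np

def fill_holes(disp, threshold):
    """S902: mark pixels whose target disparity exceeds the preset
    threshold as error points.  S903: fill each error point; the
    nearest-valid-neighbour-in-row rule is an illustrative assumption."""
    filled = disp.astype(float).copy()
    error = disp > threshold
    for i, j in zip(*np.nonzero(error)):
        row_valid = np.nonzero(~error[i])[0]      # valid columns in this row
        if row_valid.size:
            nearest = row_valid[np.abs(row_valid - j).argmin()]
            filled[i, j] = disp[i, nearest]
    return filled, error
```

The weighted median filtering that follows the hole filling would then be applied to `filled` to produce the target image.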
And S108, calculating the queuing distance of the queuing team through the target image.
In this embodiment, a queuing distance of the queue is calculated through the target image, and specifically, for each pixel point in the target image, the queuing distance of the pixel point is calculated through a distance calculation formula based on a pixel value of the pixel point;
the distance calculation formula is as follows:
Z = \frac{b \times f}{d}
wherein Z is the queuing distance of the pixel point, b is the preset baseline length, f is the focal length of the camera device, and d is the pixel value (i.e., the disparity) of the pixel point;
in this embodiment, the queuing distance of each pixel point is used to form the queuing distance of the queuing team.
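The per-pixel distance calculation of step S108 is standard binocular triangulation; a minimal sketch, where masking zero disparities to infinity is an assumption (the excerpt does not say how d = 0 is handled):

```python
import numpy as np

def queue_distances(disp, baseline, focal):
    """Step S108 sketch: Z = b * f / d applied per pixel of the target
    image; zero-disparity pixels are mapped to infinity by assumption."""
    d = disp.astype(float)
    z = np.full_like(d, np.inf)
    np.divide(baseline * focal, d, out=z, where=d > 0)
    return z
```

For example, with a 0.1 m baseline and an 800-pixel focal length, a disparity of 2 corresponds to a 40 m distance and a disparity of 4 to 20 m, i.e. larger disparities mean nearer people in the queue.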
S109, if the queuing distance of the queuing team is determined to meet the distance early warning rule, outputting prompt information; the prompt message is used for prompting the user to keep the queuing distance.
In the embodiment, whether the queuing distance of the queuing group meets the distance early warning rule or not is judged, and if the queuing distance of the queuing group meets the distance early warning rule is determined, prompt information is output; and the prompt information is used for prompting the user to keep the queuing distance, and if the queuing distance of the queuing group is determined not to meet the distance early warning rule, no processing is executed.
In this embodiment, the process of determining that the queuing distance of the queuing team satisfies the distance warning rule includes the following steps:
aiming at each pixel point in the target image, if the queuing distance of the pixel point is greater than a preset distance threshold, determining the pixel point as a target pixel point;
and if the ratio between the number of the target pixel points and the number of the pixel points included in the target image is greater than a preset ratio threshold, determining that the queuing distance of the queuing queue meets the distance early warning rule.
In this embodiment, the preset ratio threshold is preferably 80%.
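The early warning decision of step S109 reduces to a counting rule; a minimal sketch, using the 80% ratio threshold this embodiment prefers as the default:

```python
def should_warn(distances, dist_threshold, ratio_threshold=0.8):
    """Warn when the fraction of target-image pixels whose queuing
    distance exceeds the preset distance threshold is strictly greater
    than the preset ratio threshold (80% in this embodiment)."""
    flat = [z for row in distances for z in row]
    targets = sum(1 for z in flat if z > dist_threshold)
    return targets / len(flat) > ratio_threshold
```

When `should_warn` returns True, the device outputs the prompt information reminding users to keep the queuing distance; otherwise no processing is executed.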
The queuing distance limiting prompting method provided by the embodiment of the application realizes the calculation of the queuing distance of the queuing team, so that when the distance meets the distance early warning rule, a user is prompted to keep the distance in time, and further the information leakage of a teller is avoided. And the pixel characteristics are improved to six dimensions by performing matching cost calculation in the RGB color space and the CIELAB color space, so that the defect that the pixel characteristics of the traditional algorithm are insufficient in a weak texture area is overcome. And an edge holding item is added in the guide filtering, so that the image edge is better kept, the cost recombination is completed through census transformation and parallax after the cost calculation, and then secondary guide filtering is performed, so that the precision is greatly improved.
Referring to fig. 11, a specific implementation process of the queuing distance-limiting prompting method provided by the present application is illustrated as follows:
step S1, inputting an image pair shot by a left camera and a right camera;
and S2, firstly, cost calculation is carried out on the input image pair, the color information of the image is extracted in the RGB color space and the CIELAB color space, then the gradient information of the image is extracted, and finally the final matching cost is obtained through weighting fusion.
S3, conducting cost aggregation on the matching cost obtained in the step S1 by using guided filtering, obtaining an initial parallax value from the aggregated result through WTA, conducting census transformation on the input image pair to obtain a Hamming distance, conducting weighted fusion on the obtained Hamming distance and the initial parallax value to complete cost recombination, and finally conducting secondary guided filtering on the matching cost after the cost recombination;
and S4, completing parallax selection on the matching cost obtained in the step S3 through WTA (winner-takes-all), wherein the WTA formula can be expressed as d_L = \arg\min_d C, consistent with the second parallax calculation formula above.
and S5, firstly carrying out left-right consistency detection on the initial disparity map obtained in the step S4, then carrying out cavity filling on the detected error points, and finally carrying out weighted median filtering processing on the filled disparity map to obtain a final disparity map.
And S6, obtaining the final distance of the parallax obtained in the step S5 through a formula, comparing the final distance with a set threshold value, and if the final distance exceeds the threshold value, prompting to keep the distance through voice.
According to the queuing distance-limiting prompting method, cost calculation is carried out in an RGB color space and a CIELAB color space, pixel characteristics are improved to six dimensions, and the defect that the pixel characteristics of a traditional algorithm in a weak texture area are insufficient is overcome. In the cost aggregation stage, guiding filtering of an introduced edge holding item is used, a parallax value is obtained through parallax calculation, hamming distances are fused to update an initial cost value, secondary guiding filtering is performed, initial parallax is obtained through WTA in the parallax selection stage, parallax optimization is performed on a parallax image to obtain a final parallax image, and the distance is obtained through final calculation. When the distance is smaller than the set threshold, the system prompts to keep the distance. And secondly, innovation is carried out on the process of stereo matching, and the accuracy of the algorithm is further improved.
It should be noted that while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous.
It should be understood that the various steps recited in the method embodiments disclosed herein may be performed in a different order and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the disclosure is not limited in this respect.
Corresponding to the method described in fig. 1, an embodiment of the present application further provides a queuing distance limiting prompting device, which is used for specifically implementing the method in fig. 1, and a schematic structural diagram of the queuing distance limiting prompting device is shown in fig. 12, and specifically includes:
an acquisition unit 1201 for acquiring a left image obtained by capturing the left side of the queue and a right image obtained by capturing the right side of the queue;
a cost calculation unit 1202, configured to perform matching cost calculation on the left image and the right image to obtain a first image to be filtered;
a first filtering unit 1203, configured to perform guided filtering processing on the first image to be filtered to obtain a first filtered image;
a first calculating unit 1204 for calculating an initial disparity value of the first filtered image and calculating a hamming distance between the left image and the right image;
a cost recombining unit 1205, configured to perform cost recombination on the initial disparity value of the first filtered image and the hamming distance, to obtain a second image to be filtered;
a second filtering unit 1206, configured to perform the guided filtering processing on the second image to be filtered to obtain a second filtered image;
a hole filling unit 1207, configured to perform hole filling on the second filtered image, and perform weighted median filtering on the second filtered image after the hole filling to obtain a target image;
a second calculating unit 1208, configured to calculate a queuing distance of the queuing team according to the target image;
a prompt unit 1209, configured to output a prompt message if it is determined that the queuing distance of the queuing team satisfies the distance early warning rule; the prompt message is used for prompting the user to keep the queuing distance.
The queuing distance limiting prompting device provided by the embodiment of the application realizes the calculation of the queuing distance of a queuing team, so that a user is prompted to keep the distance in time when the distance meets the distance early warning rule, and further the information leakage of a teller is avoided. And the pixel characteristics are improved to six dimensions by performing matching cost calculation in the RGB color space and the CIELAB color space, and the defect that the pixel characteristics of the traditional algorithm are insufficient in the weak texture area is overcome. And an edge holding item is added in the guide filtering, so that the image edge is better kept, the cost recombination is completed through census transformation and parallax after the cost calculation, and then secondary guide filtering is performed, so that the precision is greatly improved.
In an embodiment of the present application, based on the foregoing scheme, the cost calculating unit 1202 is specifically configured to:
respectively acquiring color information of each pixel point in the left image and the right image based on an RGB color space;
respectively acquiring color information of each pixel point in the left image and the right image based on a CIELAB color space;
calculating the matching cost of each pixel point in the first image based on the color information based on the RGB color space of each pixel point in the left side image and the right side image and the color information based on the CIELAB color space of each pixel point in the left side image and the right side image; the first image is the left image or the right image;
respectively acquiring gradient information of each pixel point in the left image and the right image;
calculating the matching cost of each pixel point in the first image based on the gradient information according to the gradient information of each pixel point in the left image and the right image;
calculating the fusion cost of each pixel point in the first image based on the matching cost of each pixel point in the first image based on color information and the matching cost of each pixel point in the first image based on gradient information;
aiming at each pixel point in the first image, taking the fusion cost of the pixel point as the pixel value of the pixel point;
and constructing a first image to be filtered based on the pixel value of each pixel point in the first image.
In an embodiment of the present application, based on the foregoing solution, when the first calculating unit 1204 calculates the initial disparity value of the first filtered image, it is specifically configured to:
calculating an initial parallax value of each pixel point in the first filtering image through a parallax value calculation formula based on the pixel value and the position information of each pixel point in the first filtering image;
the first calculating unit 1204, when calculating the hamming distance between the left image and the right image, is specifically configured to:
performing stereo matching census transformation on the left image and the right image;
and calculating the Hamming distance between each pixel point in the left image and the corresponding pixel point in the right image based on the left image and the right image transformed by census.
In an embodiment of the present application, based on the foregoing scheme, the cost restructuring unit 1205 is specifically configured to:
calculating the fusion cost of each pixel point in the first filtering image based on the initial parallax value of each pixel point in the first filtering image and the Hamming distance between each pixel point in the left image and the corresponding pixel point in the right image;
aiming at each pixel point in the first filtering image, taking the fusion cost of the pixel point as the pixel value of the pixel point;
and constructing a second image to be filtered based on the pixel value of each pixel point in the first filtered image.
In an embodiment of the present application, based on the foregoing scheme, the first filtering unit 1203 is specifically configured to:
constructing a traditional guide filtering energy function based on the first image to be filtered and a preset guide image, and constructing a target guide filtering energy function based on the traditional guide filtering energy function and an edge holding term; the target guide filtering energy function comprises linear coefficients;
solving the target guide filtering energy function to obtain a solution value of a linear coefficient;
and obtaining a first filtering image based on the solving value of the linear coefficient and the guide image.
In an embodiment of the present application, based on the foregoing scheme, the hole filling unit 1207 is specifically configured to:
calculating a target disparity value of each pixel point in the second filtered image based on the pixel value of each pixel point in the second filtered image;
identifying an error pixel point in the second filtered image based on the target disparity value of each pixel point in the second filtered image;
and filling holes in the identified error pixel points.
In an embodiment of the present application, based on the foregoing scheme, the second calculating unit 1208 is specifically configured to:
aiming at each pixel point in the target image, calculating the queuing distance of the pixel point through a distance calculation formula based on the pixel value of the pixel point;
the distance calculation formula is as follows:
Z = \frac{b \times f}{d}
wherein Z is the queuing distance of the pixel point, b is the preset base length, f is the focal length of the camera device, and d is the pixel value of the pixel point;
the queuing distance of each pixel point is formed into the queuing distance of the queuing team;
the prompting unit 1209 is specifically configured to:
aiming at each pixel point in the target image, if the queuing distance of the pixel point is greater than a preset distance threshold, determining the pixel point as a target pixel point;
and if the ratio between the number of the target pixel points and the number of the pixel points included in the target image is greater than a preset ratio threshold, determining that the queuing distance of the queuing queue meets the distance early warning rule.
The embodiment of the present application further provides a storage medium in which an instruction set is stored, and when the instruction set is executed, the queuing distance limiting prompting method disclosed in any of the above embodiments is performed.
An electronic device is further provided in the embodiments of the present application, and a schematic structural diagram of the electronic device is shown in fig. 13, and specifically includes a memory 1301 for storing at least one group of instruction sets; a processor 1302, configured to execute the instruction set stored in the memory, and implement the queuing distance-limiting prompting method disclosed in any of the above embodiments by executing the instruction set.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
While several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
The foregoing description is only exemplary of the preferred embodiments disclosed herein and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure herein is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure. For example, the above features and (but not limited to) technical features having similar functions disclosed in the present disclosure are mutually replaced to form the technical solution.

Claims (10)

1. A queuing distance limit prompting method is characterized by comprising the following steps:
acquiring a left image obtained by shooting the left side of a queuing team and a right image obtained by shooting the right side of the queuing team;
performing matching cost calculation on the left image and the right image to obtain a first image to be filtered;
performing guided filtering processing on the first image to be filtered to obtain a first filtered image;
calculating an initial disparity value of the first filtered image and calculating a hamming distance between the left image and the right image;
performing cost recombination on the initial parallax value of the first filtering image and the Hamming distance to obtain a second image to be filtered;
performing the guided filtering processing on the second image to be filtered to obtain a second filtered image;
filling holes in the second filtered image, and performing weighted median filtering on the second filtered image after the holes are filled to obtain a target image;
calculating the queuing distance of the queuing team through the target image;
if the queuing distance of the queuing team is determined to meet the distance early warning rule, outputting prompt information; the prompt message is used for prompting the user to keep the queuing distance.
2. The method according to claim 1, wherein the performing the matching cost calculation on the left image and the right image to obtain a first image to be filtered includes:
respectively acquiring color information of each pixel point in the left image and the right image based on an RGB color space;
respectively acquiring color information of each pixel point in the left image and the right image based on a CIELAB color space;
calculating the matching cost of each pixel point in the first image based on the color information based on the RGB color space of each pixel point in the left side image and the right side image and the color information based on the CIELAB color space of each pixel point in the left side image and the right side image; the first image is the left image or the right image;
respectively obtaining gradient information of each pixel point in the left image and the right image;
calculating a gradient-based matching cost of each pixel point in the first image according to the gradient information of each pixel point in the left image and the right image;
calculating a fusion cost of each pixel point in the first image based on the color-based matching cost and the gradient-based matching cost of each pixel point in the first image;
for each pixel point in the first image, taking the fusion cost of the pixel point as the pixel value of the pixel point;
and constructing a first image to be filtered based on the pixel value of each pixel point in the first image.
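The cost construction of claim 2 can be sketched as follows for a pair of already-aligned single-disparity image slices. The mean-absolute-difference measure, the equal RGB/CIELAB weights, the use of horizontal intensity gradients, and the fusion weight alpha are all illustrative assumptions; the claim does not fix them.

```python
import numpy as np

def color_cost(left_rgb, right_rgb, left_lab, right_lab):
    # Mean absolute difference over RGB channels, combined with the same
    # measure over CIELAB channels (equal weights are an assumption).
    c_rgb = np.mean(np.abs(left_rgb - right_rgb), axis=-1)
    c_lab = np.mean(np.abs(left_lab - right_lab), axis=-1)
    return 0.5 * c_rgb + 0.5 * c_lab

def gradient_cost(left_gray, right_gray):
    # Absolute difference of horizontal intensity gradients.
    return np.abs(np.gradient(left_gray, axis=1) - np.gradient(right_gray, axis=1))

def fused_cost(c_color, c_grad, alpha=0.5):
    # The fusion cost becomes the pixel value of the first image to be filtered.
    return alpha * c_color + (1 - alpha) * c_grad
```

Stacking `fused_cost` over every candidate disparity yields the cost volume that the subsequent guided filtering smooths.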
3. The method of claim 2, wherein said computing an initial disparity value for the first filtered image comprises:
calculating an initial disparity value of each pixel point in the first filtered image through a disparity calculation formula based on the pixel value and position information of each pixel point in the first filtered image;
the calculating a hamming distance between the left image and the right image comprises:
performing a census transform for stereo matching on the left image and the right image;
and calculating the Hamming distance between each pixel point in the census-transformed left image and the corresponding pixel point in the census-transformed right image.
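A minimal sketch of the census transform and per-pixel Hamming distance of claim 3, assuming single-channel images and a small comparison window (the window size and edge padding are assumptions):

```python
import numpy as np

def census_transform(img, window=5):
    # Encode each pixel as a bit string: one bit per neighbour in the
    # window, set when the neighbour is darker than the window centre.
    h, w = img.shape
    r = window // 2
    padded = np.pad(img, r, mode="edge")
    code = np.zeros((h, w), dtype=np.uint64)
    for dy in range(window):
        for dx in range(window):
            if dy == r and dx == r:
                continue  # skip the centre pixel itself
            neighbour = padded[dy:dy + h, dx:dx + w]
            code = (code << np.uint64(1)) | (neighbour < img).astype(np.uint64)
    return code

def hamming_distance(code_left, code_right):
    # Per-pixel Hamming distance: popcount of the XOR of the two codes.
    x = np.bitwise_xor(code_left, code_right)
    dist = np.zeros(x.shape, dtype=np.uint8)
    while np.any(x):
        dist += (x & np.uint64(1)).astype(np.uint8)
        x = x >> np.uint64(1)
    return dist
```

Because the census code depends only on relative intensity orderings, the Hamming distance is robust to the radiometric differences between the left and right cameras.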
4. The method according to claim 3, wherein the performing cost recombination on the initial disparity value of the first filtered image and the Hamming distance to obtain a second image to be filtered comprises:
calculating a fusion cost of each pixel point in the first filtered image based on the initial disparity value of each pixel point in the first filtered image and the Hamming distance between each pixel point in the left image and the corresponding pixel point in the right image;
for each pixel point in the first filtered image, taking the fusion cost of the pixel point as the pixel value of the pixel point;
and constructing a second image to be filtered based on the pixel value of each pixel point in the first filtered image.
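The recombination in claim 4 might look like the following sketch; the linear weighting `lam` and the normalisation of the Hamming distance are assumptions, since the claim only states that the two terms are fused:

```python
import numpy as np

def recombine_cost(init_disp_cost, hamming, lam=0.5):
    # Fuse the initial disparity-based cost with the census Hamming
    # distance; the fused value becomes the pixel value of the second
    # image to be filtered.
    h = hamming.astype(float)
    h_norm = h / h.max() if h.max() > 0 else h
    return lam * init_disp_cost + (1 - lam) * h_norm
```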
5. The method according to claim 1, wherein the performing a guided filtering process on the first image to be filtered to obtain a first filtered image comprises:
constructing a conventional guided-filtering energy function based on the first image to be filtered and a preset guide image, and constructing a target guided-filtering energy function based on the conventional guided-filtering energy function and an edge-preserving term, wherein the target guided-filtering energy function comprises linear coefficients;
solving the target guided-filtering energy function to obtain solved values of the linear coefficients;
and obtaining the first filtered image based on the solved values of the linear coefficients and the guide image.
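Claim 5 augments the conventional guided-filtering energy with an edge-preserving term. The sketch below implements only the conventional closed-form solution for the linear coefficients (the classic guided filter of He et al.); the edge-preserving term of the claim is omitted, and the window radius and regularisation eps are assumptions:

```python
import numpy as np

def box_mean(img, r):
    # Window mean over a (2r+1) x (2r+1) box via an integral image.
    k = 2 * r + 1
    p = np.pad(img, r, mode="edge")
    c = np.cumsum(np.cumsum(p, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))
    s = c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]
    return s / (k * k)

def guided_filter(guide, src, r=4, eps=1e-3):
    # Conventional guided filter: per-window linear coefficients a, b
    # minimise sum((a*I + b - p)^2) + eps*a^2, then the output is
    # q = mean(a) * I + mean(b).
    mean_I = box_mean(guide, r)
    mean_p = box_mean(src, r)
    var_I = box_mean(guide * guide, r) - mean_I ** 2
    cov_Ip = box_mean(guide * src, r) - mean_I * mean_p
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    return box_mean(a, r) * guide + box_mean(b, r)
```

Because the output is linear in the guide image within each window, edges of the guide are transferred to the filtered cost image, which is why guided filtering is a popular cost-aggregation step in stereo matching.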
6. The method of claim 1, wherein the filling holes in the second filtered image comprises:
calculating a target disparity value of each pixel point in the second filtered image based on the pixel value of each pixel point in the second filtered image;
identifying error pixel points in the second filtered image based on the target disparity value of each pixel point in the second filtered image;
and filling holes at the identified error pixel points.
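Claim 6 does not fix the filling strategy. One common heuristic, sketched below, replaces each error pixel with the nearest valid disparity in the same row; the left-first, right-fallback order is an assumption:

```python
import numpy as np

def fill_holes(disp, invalid):
    # Replace each error pixel with its nearest valid disparity to the
    # left in the same row, falling back to the nearest valid one to the
    # right (the fallback direction is an assumption).
    out = disp.astype(float)
    out[invalid] = np.nan
    for row in out:
        valid_idx = np.flatnonzero(~np.isnan(row))
        if valid_idx.size == 0:
            continue  # no valid pixel in this row: leave it untouched
        nan_idx = np.flatnonzero(np.isnan(row))
        left = np.searchsorted(valid_idx, nan_idx) - 1
        fill = valid_idx[np.clip(left, 0, valid_idx.size - 1)]
        row[nan_idx] = row[fill]
    return out
```

The subsequent weighted median filtering of claim 1 then smooths the streak artefacts this row-wise fill can introduce.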
7. The method of claim 1, wherein said calculating a queuing distance for said queuing team from said target image comprises:
for each pixel point in the target image, calculating the queuing distance of the pixel point through a distance calculation formula based on the pixel value of the pixel point;
the distance calculation formula is as follows:
Z = (b × f) / d
wherein Z is the queuing distance of the pixel point, b is the preset baseline length, f is the focal length of the camera device, and d is the pixel value of the pixel point;
combining the queuing distances of the pixel points into the queuing distance of the queuing queue;
the determining that the queuing distance of the queuing queue meets the distance early-warning rule comprises:
for each pixel point in the target image, if the queuing distance of the pixel point is greater than a preset distance threshold, determining the pixel point as a target pixel point;
and if the ratio between the number of the target pixel points and the number of pixel points included in the target image is greater than a preset ratio threshold, determining that the queuing distance of the queuing queue meets the distance early-warning rule.
8. A queuing distance limiting prompting device, comprising:
an acquisition unit, configured to acquire a left image obtained by capturing the left side of a queuing queue and a right image obtained by capturing the right side of the queuing queue;
a cost calculation unit, configured to perform matching cost calculation on the left image and the right image to obtain a first image to be filtered;
a first filtering unit, configured to perform guided filtering processing on the first image to be filtered to obtain a first filtered image;
a first calculation unit, configured to calculate an initial disparity value of the first filtered image and calculate a Hamming distance between the left image and the right image;
a cost recombination unit, configured to perform cost recombination on the initial disparity value of the first filtered image and the Hamming distance to obtain a second image to be filtered;
a second filtering unit, configured to perform the guided filtering processing on the second image to be filtered to obtain a second filtered image;
a hole filling unit, configured to fill holes in the second filtered image and perform weighted median filtering on the second filtered image after the holes are filled to obtain a target image;
a second calculation unit, configured to calculate the queuing distance of the queuing queue through the target image;
and a prompting unit, configured to output prompt information if it is determined that the queuing distance of the queuing queue meets the distance early-warning rule, wherein the prompt information prompts users to keep the queuing distance.
9. A storage medium storing a set of instructions, wherein the set of instructions, when executed by a processor, implements the queuing distance limiting prompting method of any one of claims 1-7.
10. An electronic device, comprising:
a memory for storing at least one set of instructions;
a processor for executing the set of instructions stored in the memory, the set of instructions being executed to implement the queuing distance limiting prompting method of any one of claims 1-7.
CN202211276927.9A 2022-10-18 2022-10-18 Queuing distance limiting prompting method and device, storage medium and electronic equipment Pending CN115631451A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211276927.9A CN115631451A (en) 2022-10-18 2022-10-18 Queuing distance limiting prompting method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN115631451A true CN115631451A (en) 2023-01-20

Family

ID=84907451

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211276927.9A Pending CN115631451A (en) 2022-10-18 2022-10-18 Queuing distance limiting prompting method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN115631451A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116542938A (en) * 2023-05-09 2023-08-04 深圳聚源视芯科技有限公司 Binocular vision-based parallax post-processing system and method
CN116542938B (en) * 2023-05-09 2024-04-09 深圳聚源视芯科技有限公司 Binocular vision-based parallax post-processing system and method

Similar Documents

Publication Publication Date Title
CN111741211B (en) Image display method and apparatus
CN107301402B (en) Method, device, medium and equipment for determining key frame of real scene
CN108446694B (en) Target detection method and device
CN108389224B (en) Image processing method and device, electronic equipment and storage medium
US20140079319A1 (en) Methods for enhancing images and apparatuses using the same
Nalpantidis et al. Biologically and psychophysically inspired adaptive support weights algorithm for stereo correspondence
CN105243371A (en) Human face beauty degree detection method and system and shooting terminal
US20060110052A1 (en) Image signal processing
CN111754396B (en) Face image processing method, device, computer equipment and storage medium
US20130170736A1 (en) Disparity estimation depth generation method
JP2016500975A (en) Generation of depth maps from planar images based on combined depth cues
CN110675385B (en) Image processing method, device, computer equipment and storage medium
CN104574358B (en) From the method and apparatus for focusing heap image progress scene cut
Liu et al. Image de-hazing from the perspective of noise filtering
CN105809651A (en) Image saliency detection method based on edge non-similarity comparison
CN111340077B (en) Attention mechanism-based disparity map acquisition method and device
US10582179B2 (en) Method and apparatus for processing binocular disparity image
CN110111347B (en) Image sign extraction method, device and storage medium
Ling et al. Image quality assessment for free viewpoint video based on mid-level contours feature
CN116681636B (en) Light infrared and visible light image fusion method based on convolutional neural network
CN111127309A (en) Portrait style transfer model training method, portrait style transfer method and device
CN114998320B (en) Method, system, electronic device and storage medium for visual saliency detection
CN115578284A (en) Multi-scene image enhancement method and system
CN115631451A (en) Queuing distance limiting prompting method and device, storage medium and electronic equipment
CN113592018A (en) Infrared light and visible light image fusion method based on residual dense network and gradient loss

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination