CN117541995A - Scene monitoring method and device, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN117541995A
CN117541995A (Application CN202311606006.9A)
Authority
CN
China
Prior art keywords
image
scene
difference
pixel
pixel point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311606006.9A
Other languages
Chinese (zh)
Inventor
林国森
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ruiyun Qizhi Qingdao Technology Co ltd
Original Assignee
Ruiyun Qizhi Qingdao Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ruiyun Qizhi Qingdao Technology Co ltd filed Critical Ruiyun Qizhi Qingdao Technology Co ltd
Priority to CN202311606006.9A
Publication of CN117541995A
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/54 Extraction of image or video features relating to texture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the field of monitoring and discloses a scene monitoring method and device, an electronic device, and a computer readable storage medium. The scene monitoring method comprises the following steps: acquiring a current scene image, a historical scene image, and a reference scene image of an operation scene; acquiring, according to a preset difference algorithm, current difference features between the current scene image and the reference scene image and historical difference features between the historical scene image and the reference scene image; and determining whether the operation scene has changed according to the current difference features and the historical difference features. Compared with the prior art, the scene monitoring method and device, electronic device, and computer readable storage medium can improve the accuracy of monitoring results while reducing the demand on hardware resources.

Description

Scene monitoring method and device, electronic equipment and computer readable storage medium
Technical Field
The present invention relates to the field of monitoring, and in particular, to a scene monitoring method, apparatus, electronic device, and computer readable storage medium.
Background
Monitoring equipment is widely used in daily production and life, and an important application is monitoring whether a fixed area changes. For example, in the fire-fighting field, fire-fighting regulations stipulate that no unit or individual may occupy or block a fire escape, and monitoring equipment can be used to watch the fixed scene of the fire escape, that is, to monitor whether the fire escape changes. Likewise, in the field of unmanned vending, it is necessary to monitor whether goods have been sold, that is, whether the vending area has changed. However, to monitor whether a fixed area changes with sufficient accuracy, monitoring equipment in the prior art generally relies on a deep learning algorithm. Deep learning algorithms are computationally expensive and place high demands on hardware resources, while methods that do not use deep learning are strongly affected by illumination disturbance and therefore yield lower monitoring accuracy.
Disclosure of Invention
The invention aims to provide a scene monitoring method, a scene monitoring device, electronic equipment and a computer readable storage medium, which can reduce the requirement on hardware resources and improve the accuracy of monitoring results.
In a first aspect, an embodiment of the present invention provides a scene monitoring method, including: acquiring a current scene image, a historical scene image and a reference scene image of an operation scene; respectively acquiring current difference characteristics of the current scene image and the reference scene image and historical difference characteristics of the historical scene image and the reference scene image according to a preset difference algorithm; determining whether the operation scene changes according to the current difference characteristic and the historical difference characteristic; the preset difference algorithm comprises the following steps: acquiring an operation texture feature image and an operation color feature image of an operation scene image, wherein the operation scene image is the current scene image or the historical scene image; acquiring a reference texture feature image and a reference color feature image of the reference scene image; obtaining texture feature differences of the operation texture feature image and the reference texture feature image, and obtaining color feature differences of the operation color feature image and the reference color feature image; and obtaining the difference characteristics of the operation scene image and the reference scene image according to the texture characteristic difference and the color characteristic difference.
Compared with the prior art, the scene monitoring method provided by the embodiment of the invention acquires the current difference features between the current scene image and the reference scene image and the historical difference features between the historical scene image and the reference scene image, and determines whether the operation scene has changed according to these two sets of difference features. Because the difference features comprise both texture feature differences and color feature differences, the method is less affected by changes of illumination intensity during monitoring, which improves the accuracy of the monitoring result. In addition, the method does not require a deep learning algorithm, thereby reducing the demand on hardware resources.
In an alternative embodiment, the acquiring an operational texture feature image of the operational scene image includes: acquiring an operation gray image of the operation scene image; dividing the operation gray image into a plurality of pixel areas, wherein each pixel area comprises a plurality of pixel points; for each pixel area, calculating the pixel mean value of all pixel points in the pixel area, and determining the texture characteristic value of each pixel point in the pixel area according to the pixel mean value; and constructing the operation texture feature image according to the texture feature values of all pixel points in the operation gray level image.
In an optional embodiment, the determining the texture feature value of each pixel point in the pixel area according to the pixel mean value includes: determining the texture feature value of first-category pixel points as a first preset feature value, and determining the texture feature value of second-category pixel points as a second preset feature value, wherein the first-category pixel points are pixel points whose pixel values are larger than the pixel mean value, and the second-category pixel points are pixel points whose pixel values are smaller than or equal to the pixel mean value.
In an alternative embodiment, acquiring an operational color feature image of an operational scene image includes: acquiring an HSV space image of the operation scene image; for each pixel point in the HSV space image, determining an operation color of the pixel point according to a H, S, V component of the pixel point, and determining a color characteristic value of the pixel point according to the operation color; and constructing the operation color feature image according to the color feature values of all pixel points in the HSV space image.
In an alternative embodiment, the acquiring the texture feature difference of the operation texture feature image and the reference texture feature image includes: for each operation pixel point in the operation texture feature image, determining a pixel point texture feature difference of the operation pixel point according to a difference value of a texture feature value of the operation pixel point and a texture feature value of a target reference pixel point, wherein the target reference pixel point is the pixel point at the same position as the operation pixel point in the reference texture feature image; the texture feature differences comprise pixel point texture feature differences corresponding to all operation pixel points in the operation texture feature image.
In an alternative embodiment, said obtaining a color feature difference of said operational color feature image and said reference color feature image comprises: for each operation pixel point in the operation color feature image, determining a pixel point color feature difference of the operation pixel point according to a difference value of a color feature value of the operation pixel point and a color feature value of a target reference pixel point, wherein the target reference pixel point is the pixel point with the same position as the operation pixel point in the reference color feature image; the color characteristic differences comprise pixel point color characteristic differences corresponding to all operation pixel points in the operation color characteristic image.
In an alternative embodiment, the obtaining the difference feature of the operation scene image and the reference scene image according to the texture feature difference and the color feature difference includes: for each operation pixel point in the operation color feature image, determining a pixel point difference feature of the operation pixel point according to the sum value of the pixel point texture feature difference and the pixel point color feature difference of the operation pixel point; the difference features comprise pixel point difference features corresponding to all operation pixel points in the operation color feature image.
In an optional embodiment, the determining whether the operation scene changes according to the current difference feature and the historical difference feature includes: determining current difference pixel points of the current scene image and the reference scene image according to the current difference features, and determining historical difference pixel points of the historical scene image and the reference scene image according to the historical difference features; determining a difference pixel area according to the current difference pixel points and the historical difference pixel points; and determining whether the operation scene changes according to the difference pixel area. Determining the difference pixel area from both the current difference pixel points and the historical difference pixel points, rather than comparing a single image, improves the accuracy of the comparison result and thereby the accuracy of the overall monitoring result.
In a second aspect, an embodiment of the present invention provides a scene monitoring device, including: the image acquisition module is used for acquiring a current scene image, a historical scene image and a reference scene image of the operation scene; the difference feature acquisition module is used for respectively acquiring current difference features of the current scene image and the reference scene image and historical difference features of the historical scene image and the reference scene image according to a preset difference algorithm; the change determining module is used for determining whether the operation scene changes according to the current difference characteristic and the historical difference characteristic; the preset difference algorithm comprises the following steps: acquiring an operation texture feature image and an operation color feature image of an operation scene image, wherein the operation scene image is the current scene image or the historical scene image; acquiring a reference texture feature image and a reference color feature image of the reference scene image; obtaining texture feature differences of the operation texture feature image and the reference texture feature image, and obtaining color feature differences of the operation color feature image and the reference color feature image; and obtaining the difference characteristics of the operation scene image and the reference scene image according to the texture characteristic difference and the color characteristic difference.
In a third aspect, an embodiment of the present invention provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a scene monitoring method as described above.
In a fourth aspect, an embodiment of the present invention provides a computer readable storage medium storing a computer program, where the computer program is executed by a processor to implement the foregoing scene monitoring method.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a scene monitoring method according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a preset difference algorithm in a scene monitoring method according to an embodiment of the present disclosure;
fig. 3 is a flowchart illustrating a method for acquiring an operational texture feature image in a scene monitoring method according to an embodiment of the present disclosure;
fig. 4 is a flowchart illustrating a process of acquiring an operational color feature image in a scene monitoring method according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a scene monitoring device according to a second embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to a third embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
It should be noted that the features of the embodiments of the present invention may be combined with each other without conflict.
An embodiment of the present invention provides a scene monitoring method, as shown in fig. 1, including the following steps:
step S101: and acquiring a current scene image, a historical scene image and a reference scene image of the operation scene.
Step S102: and respectively acquiring the current difference characteristics of the current scene image and the reference scene image and the historical difference characteristics of the historical scene image and the reference scene image according to a preset difference algorithm.
Step S103: and determining whether the operation scene changes according to the current difference characteristic and the historical difference characteristic.
Compared with the prior art, the scene monitoring method provided by the embodiment of the invention acquires the current difference features between the current scene image and the reference scene image and the historical difference features between the historical scene image and the reference scene image, and determines whether the operation scene has changed according to these two sets of difference features. Because the difference features comprise both texture feature differences and color feature differences, the method is less affected by changes of illumination intensity during monitoring, which improves the accuracy of the monitoring result. In addition, the method does not require a deep learning algorithm, thereby reducing the demand on hardware resources.
Specifically, in step S101, the current scene image may be an image obtained by photographing the operation scene at the current moment, the historical scene image may be an image obtained by photographing the operation scene at the previous photographing moment, and the reference scene image may be an image obtained by photographing the operation scene for the first time. For example, an imaging device such as a camera may be arranged to photograph an operation scene such as a fire escape at preset fixed intervals, taking the first captured image as the reference scene image, the image captured at the current moment as the current scene image, and the image captured at the previous photographing moment as the historical scene image. Alternatively, a video may be recorded, taking the first frame as the reference scene image, the latest frame as the current scene image, and the previous frame as the historical scene image. The arrangement can be set flexibly according to actual needs.
In addition, in some embodiments of the present invention, the current scene image, the historical scene image, and the reference scene image may be images of the operation scene captured at different moments from the same shooting angle and the same shooting distance. In other words, pixels at the same pixel position in the three images correspond to each other and to the same actual position in the operation scene.
Specifically, as shown in fig. 2, in step S102, the preset difference algorithm includes:
step S201: and acquiring an operation texture feature image and an operation color feature image of the operation scene image, and acquiring a reference texture feature image and a reference color feature image of the reference scene image.
In this step, the operational scene image may be a current scene image or a historical scene image. When the operation scene image is the current scene image, the operation texture feature image and the operation color feature image are texture feature images and color feature images corresponding to the current scene image; when the operation scene image is a historical scene image, the operation texture feature image and the operation color feature image are texture feature images and color feature images corresponding to the historical scene image.
In some embodiments of the present invention, as shown in fig. 3, acquiring an operational texture feature image of an operational scene image may specifically include:
step S301: and acquiring an operation gray image of the operation scene image.
In this step, the operation scene image may be subjected to gradation conversion, and the converted gradation image may be used as the operation gradation image.
Step S302: the operation gray image is divided into a plurality of pixel areas, and each pixel area comprises a plurality of pixel points.
In this step, the size of the pixel region may be set according to the size of the operation gray image; for example, it may be set to a square region with a side length of 4, 5, 6, 7, 8, etc., pixel points.
Step S303: and for each pixel area, calculating the pixel mean value of all the pixel points in the pixel area, and determining the texture characteristic value of each pixel point in the pixel area according to the pixel mean value.
In this step, taking a square area with a side length of 5 pixel points as an example, the square area includes 25 pixel points. Calculating the pixel mean value of all pixel points in the pixel area means calculating the average of the 25 gray values of those 25 pixel points; this average is the pixel mean value of the pixel area.
In some embodiments of the present invention, determining the texture feature value of each pixel point in the pixel area according to the pixel mean value may specifically be: determining the texture feature value of first-category pixel points as a first preset feature value, and determining the texture feature value of second-category pixel points as a second preset feature value, wherein the first-category pixel points are pixel points whose pixel values are larger than the pixel mean value, and the second-category pixel points are pixel points whose pixel values are smaller than or equal to the pixel mean value. For example, the gray value of each pixel point may be compared with the pixel mean value in turn: the texture feature value of a pixel point whose gray value is greater than the pixel mean value is set to the first preset feature value, for example 255, and the texture feature value of a pixel point whose gray value is less than or equal to the pixel mean value is set to the second preset feature value, for example 0.
Step S304: and constructing an operation texture feature image according to the texture feature values of all pixel points in the operation gray image.
In this step, the texture feature value corresponding to each pixel point may be used as the gray value of the pixel point to form an operation texture feature image corresponding to the operation scene image.
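As an illustrative sketch only (not the patent's reference implementation), steps S301 to S304 above can be expressed as follows; the non-overlapping square blocks of side 5 and the feature values 255 and 0 follow the examples given in the text, and the function name is an assumption:

```python
import numpy as np

def texture_feature_image(gray: np.ndarray, block: int = 5) -> np.ndarray:
    """Binarize each non-overlapping square block against its own mean.

    Pixels whose gray value exceeds the block mean receive the first
    preset feature value (255); the rest receive the second (0).
    """
    h, w = gray.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(0, h, block):
        for x in range(0, w, block):
            region = gray[y:y + block, x:x + block]
            # Compare every pixel in the block with the block's mean
            out[y:y + block, x:x + block] = np.where(
                region > region.mean(), 255, 0)
    return out
```

A uniform block produces all zeros, since no pixel exceeds the block mean; this is what makes the feature insensitive to uniform illumination changes.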
In some embodiments of the present invention, as shown in fig. 4, acquiring an operational color feature image of an operational scene image may specifically include:
step S401: and acquiring an HSV space image of the operation scene image.
In this step, the operation scene image may be converted into an HSV space image. HSV is a color space composed of three parameters: hue (H), saturation (S), and brightness (V). Taking an operation scene image in RGB format as an example, converting it into an HSV space image may specifically include:
s1: the maximum and minimum values of three components (R, G, B) of the operational scene image are calculated.
S2: and normalizing the three components of the operation scene image.
S3: calculate hue (H): if the maximum value is B, h= (4+2g—r)/6; if the maximum value is R, h= (6+2b-G)/6; if the maximum value is G, h= (2+4 r-B)/6.
S4: calculate saturation (S): s=1-3 min (R, G, B)/(r+g+b).
S5: calculate brightness (V): v=max (R, G, B).
S6: and combining the calculation results according to the sequence of H, S, V to obtain the HSV space image.
Step S402: for each pixel point in the HSV space image, determining the operation color of the pixel point according to the H, S, V component of the pixel point, and determining the color characteristic value of the pixel point according to the operation color.
In this step, the operation color of a pixel point can be determined from its H, S, and V components by table lookup. The specific lookup table is shown in Table 1 below; the H, S, and V components of each pixel point are substituted into Table 1 to determine the operation color of the pixel point. In some embodiments of the present invention, each operation color may be numbered, and the number corresponding to the operation color of each pixel point is used as the color feature value of that pixel point. For example, for a pixel point whose H, S, and V components are 125, 145, and 133, the corresponding operation color is purple; since the number corresponding to purple is 9, the color feature value of that pixel point is 9.
Step S403: and constructing an operation color characteristic image according to the color characteristic values of all the pixel points in the HSV space image.
In this step, the color feature value corresponding to each pixel point may be used as the gray value of the pixel point to form an operation color feature image corresponding to the operation scene image.
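The lookup of steps S402 and S403 can be sketched as below. Table 1 is not reproduced in this text, so the hue breakpoints and color numbers here are purely illustrative placeholders, not the patent's actual table; only the purple-to-9 mapping is taken from the example above:

```python
import numpy as np

# Placeholder bins: (hue lower bound, hue upper bound, color number).
# These ranges are hypothetical stand-ins for the absent Table 1.
HUE_BINS = [(0.00, 0.08, 1),   # e.g. red    -> 1 (illustrative)
            (0.08, 0.20, 2),   # e.g. yellow -> 2 (illustrative)
            (0.20, 0.45, 3),   # e.g. green  -> 3 (illustrative)
            (0.45, 0.70, 4),   # e.g. blue   -> 4 (illustrative)
            (0.70, 1.01, 9)]   # purple      -> 9 (number from the text)

def color_feature_image(hsv: np.ndarray) -> np.ndarray:
    """Map each pixel's hue to an operation-color number (the color
    feature value), producing the operation color feature image."""
    h = hsv[..., 0]
    out = np.zeros(h.shape, dtype=np.uint8)
    for lo, hi, num in HUE_BINS:
        out[(h >= lo) & (h < hi)] = num
    return out
```

A full implementation would condition on S and V as well (e.g. to separate black, white, and gray), as Table 1 presumably does.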
Step S202: and obtaining the texture feature difference of the operation texture feature image and the reference texture feature image, and obtaining the color feature difference of the operation color feature image and the reference color feature image.
In this step, for each operation pixel point in the operation texture feature image, the pixel point texture feature difference of the operation pixel point is determined according to the difference between the texture feature value of the operation pixel point and the texture feature value of a target reference pixel point, wherein the target reference pixel point is the pixel point at the same position in the reference texture feature image; the texture feature difference comprises the pixel point texture feature differences corresponding to all operation pixel points in the operation texture feature image. Similarly, for each operation pixel point in the operation color feature image, the pixel point color feature difference of the operation pixel point is determined according to the difference between the color feature value of the operation pixel point and the color feature value of a target reference pixel point, wherein the target reference pixel point is the pixel point at the same position in the reference color feature image; the color feature difference comprises the pixel point color feature differences corresponding to all operation pixel points in the operation color feature image.
In some embodiments of the present invention, for example, the single-channel operational texture feature image and the single-channel operational color feature image may be synthesized into a dual-channel operational feature image according to the correspondence between pixel points. Each pixel point in the dual-channel feature image comprises two channel values, wherein one channel value is a color feature value, and the other channel value is a texture feature value. The same method can also synthesize a reference texture feature image and a reference color feature image to form a dual-channel reference feature image. And respectively calculating the difference value of the channel values of the operation feature image and the reference feature image on the corresponding channel to obtain the pixel point texture feature difference and the pixel point color feature difference.
Further, in some embodiments of the present invention, normalization processing may be performed on the pixel texture feature differences and the pixel color feature differences. For example, when the difference of the pixel point texture features is not 0, the difference of the pixel point texture features is set to 255, and when the difference of the pixel point texture features is 0, the difference of the pixel point texture features is set to 0; when the pixel color characteristic difference is not 0, the pixel color characteristic difference is set to 255, and when the pixel color characteristic difference is 0, the pixel color characteristic difference is set to 0.
Step S203: and obtaining the difference characteristics of the operation scene image and the reference scene image according to the texture characteristic differences and the color characteristic differences.
In this step, for each operation pixel point in the operation color feature image, the pixel point difference feature of that operation pixel point may be determined according to the sum of its pixel point texture feature difference and its pixel point color feature difference; the difference features comprise the pixel point difference features corresponding to all operation pixel points in the operation color feature image.
In some embodiments of the present invention, taking the dual-channel operation feature image and reference feature image constructed in step S202 as an example, the differences of the two channel values of each pixel point may be summed and normalized: when the sum is non-zero, the pixel point difference feature is set to 255; when the sum is zero, it is set to 0.
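The dual-channel construction and summation described in steps S202 and S203 can be sketched as follows. The single-channel difference images below are hypothetical, already 0/255-normalized values.

```python
import numpy as np

# hypothetical 0/255-normalized texture and color difference images
tex_diff = np.array([[0, 255], [0, 0]], dtype=np.uint8)
col_diff = np.array([[0, 0], [255, 0]], dtype=np.uint8)

# stack the two single-channel images into one dual-channel image,
# widening the dtype so 255 + 255 does not overflow
two_chan = np.dstack([tex_diff, col_diff]).astype(np.uint16)

# sum across the channel axis, then normalize:
# 255 where the sum is non-zero, 0 otherwise
summed = two_chan.sum(axis=2)
diff_feature = np.where(summed != 0, 255, 0).astype(np.uint8)
```

A pixel is thus flagged as different if it differs from the reference in either the texture channel or the color channel.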
In step S103, a current difference pixel point of the current scene image and the reference scene image may be determined according to the current difference feature, and a history difference pixel point of the history scene image and the reference scene image may be determined according to the history difference feature; determining a difference pixel area according to the current difference pixel point and the historical difference pixel point; and determining whether the operation scene changes according to the difference pixel areas.
Specifically, the current difference features and the historical difference features may be combined by an AND operation, that is, the current difference feature value and the historical difference feature value at each corresponding pixel position are ANDed. A pixel position whose current difference feature value is non-zero is a current difference pixel point of the current scene image relative to the reference scene image, and a pixel position whose historical difference feature value is non-zero is a historical difference pixel point of the historical scene image relative to the reference scene image. Pixel positions whose ANDed result is non-zero are the difference pixel points, and the region formed by these difference pixel points is the difference pixel region.
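The AND combination above can be sketched with NumPy boolean operations. The two arrays below are hypothetical 0/255 difference-feature images.

```python
import numpy as np

# hypothetical current and historical difference-feature images (0 or 255)
current_diff = np.array([[255, 255], [0, 255]], dtype=np.uint8)
history_diff = np.array([[255, 0], [0, 255]], dtype=np.uint8)

# a pixel belongs to the difference region only if it differs from the
# reference in BOTH the current and the historical scene image
diff_region = np.where((current_diff != 0) & (history_diff != 0), 255, 0).astype(np.uint8)

# coordinates (row, col) of the difference pixel points
diff_pixels = np.argwhere(diff_region != 0)
```

Keeping only pixels flagged in both images suppresses transient changes that appear in a single frame.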
Determining whether the operation scene changes according to the difference pixel region may, for example, involve performing contour recognition on the difference pixel region, then calculating the minimum rectangular region enclosing it, and determining whether the operation scene changes according to the area, size and the like of that rectangular region.
The contour recognition on the difference pixel region, followed by calculation of the minimum rectangular region enclosing it, may specifically proceed as follows. The image is raster scanned, i.e. each pixel is examined row by row from left to right and top to bottom. When a pixel point with a non-zero value is encountered, it is judged whether it is the starting point of a contour. If so, it is determined whether it is an outer contour or a hole contour, and it is assigned a number accordingly. Starting from that starting point, edge tracking is performed according to a fixed direction rule to find all pixel points belonging to the same contour, and their gray values are modified to positive or negative marker numbers. From the results of the edge tracking, topological relationships between contours, such as parent-child relationships and surrounding relationships, are established. After all contours have been extracted, a rectangular region is obtained using the boundingRect interface of OpenCV. Its principle is as follows: first, among all points in the input point set or contour, find the minimum and maximum x- and y-coordinates, denoted x_min, x_max, y_min and y_max. Next, based on these four coordinate values, the upper-left and lower-right corners of the minimum bounding upright rectangle are determined as (x_min, y_min) and (x_max, y_max). Finally, from these two corner coordinates, the width and height of the minimum bounding upright rectangle are calculated as w = x_max - x_min and h = y_max - y_min, respectively.
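The min/max-coordinate principle described above can be sketched directly in NumPy, using a hypothetical binary mask in place of a real difference pixel region. The formulas follow the document (w = x_max - x_min, h = y_max - y_min); note that OpenCV's own cv2.findContours plus cv2.boundingRect would typically be used in practice, and cv2.boundingRect conventionally reports a width one pixel larger, since it treats x + w as an exclusive bound.

```python
import numpy as np

# hypothetical binary difference region (255 inside, 0 outside);
# in practice this mask would come from the AND step above
mask = np.zeros((8, 8), dtype=np.uint8)
mask[2:5, 3:7] = 255

# coordinates of all non-zero pixels: rows are y, columns are x
ys, xs = np.nonzero(mask)
x_min, x_max = xs.min(), xs.max()
y_min, y_max = ys.min(), ys.max()

# minimum axis-aligned bounding rectangle, per the document's formulas
top_left = (int(x_min), int(y_min))
bottom_right = (int(x_max), int(y_max))
w, h = int(x_max - x_min), int(y_max - y_min)
```

The resulting width, height and area of the rectangle can then be compared against thresholds to decide whether the operation scene has changed.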
In a second aspect, a second embodiment of the present invention provides a scene monitoring device, as shown in fig. 5, including: an image acquisition module 501, configured to acquire a current scene image, a historical scene image and a reference scene image of the operation scene; a difference feature acquisition module 502, configured to acquire, according to a preset difference algorithm, current difference features of the current scene image and the reference scene image, and historical difference features of the historical scene image and the reference scene image, respectively; and a change determining module 503, configured to determine whether the operation scene changes according to the current difference features and the historical difference features.
It is clear that the scene monitoring device provided in the second embodiment of the present invention is a device embodiment corresponding to the scene monitoring method provided in the foregoing embodiment. Therefore, the scene monitoring device provided in the second embodiment of the present invention has the same technical effects as the foregoing embodiment, and reference may be made to the specific description of the foregoing embodiment for details.
A third embodiment of the present invention relates to an electronic device, as shown in fig. 6, including: at least one processor 601; and a memory 602 communicatively coupled to the at least one processor 601; the memory 602 stores instructions executable by the at least one processor 601, and the instructions are executed by the at least one processor 601 so that the at least one processor 601 can perform the scene monitoring method in the above embodiments.
Where the memory and the processor are connected by a bus, the bus may comprise any number of interconnected buses and bridges, the buses connecting the various circuits of the one or more processors and the memory together. The bus may also connect various other circuits such as peripherals, voltage regulators, and power management circuits, which are well known in the art, and therefore, will not be described any further herein. The bus interface provides an interface between the bus and the transceiver. The transceiver may be one element or may be a plurality of elements, such as a plurality of receivers and transmitters, providing a means for communicating with various other apparatus over a transmission medium. The data processed by the processor is transmitted over the wireless medium via the antenna, which further receives the data and transmits the data to the processor.
The processor is responsible for managing the bus and general processing and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management, and other control functions. And memory may be used to store data used by the processor in performing operations.
The fourth embodiment of the invention relates to a computer-readable storage medium storing a computer program. The computer program implements the above-described method embodiments when executed by a processor.
That is, it will be understood by those skilled in the art that all or part of the steps in implementing the methods of the embodiments described above may be implemented by a program stored in a storage medium, where the program includes several instructions for causing a device (which may be a single-chip microcomputer, a chip or the like) or a processor to perform all or part of the steps of the methods of the embodiments of the invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The present invention is not limited to the above embodiments, and any changes or substitutions that can be easily understood by those skilled in the art within the technical scope of the present invention are intended to be included in the scope of the present invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (11)

1. A scene monitoring method, comprising:
acquiring a current scene image, a historical scene image and a reference scene image of an operation scene;
respectively acquiring current difference characteristics of the current scene image and the reference scene image and historical difference characteristics of the historical scene image and the reference scene image according to a preset difference algorithm;
determining whether the operation scene changes according to the current difference characteristic and the historical difference characteristic;
the preset difference algorithm comprises the following steps:
acquiring an operation texture feature image and an operation color feature image of an operation scene image, wherein the operation scene image is the current scene image or the historical scene image;
acquiring a reference texture feature image and a reference color feature image of the reference scene image;
obtaining texture feature differences of the operation texture feature image and the reference texture feature image, and obtaining color feature differences of the operation color feature image and the reference color feature image;
and obtaining the difference characteristics of the operation scene image and the reference scene image according to the texture characteristic difference and the color characteristic difference.
2. The scene monitoring method according to claim 1, wherein the acquiring an operational texture feature image of an operational scene image comprises:
acquiring an operation gray image of the operation scene image;
dividing the operation gray image into a plurality of pixel areas, wherein each pixel area comprises a plurality of pixel points;
for each pixel area, calculating the pixel mean value of all pixel points in the pixel area, and determining the texture characteristic value of each pixel point in the pixel area according to the pixel mean value;
and constructing the operation texture feature image according to the texture feature values of all pixel points in the operation gray level image.
3. The scene monitoring method according to claim 2, wherein the determining the texture feature value of each pixel point in the pixel area according to the pixel mean value includes:
determining the texture feature value of first-class pixel points as a first preset feature value, and determining the texture feature value of second-class pixel points as a second preset feature value, wherein the first-class pixel points are pixel points whose pixel values are greater than the pixel mean value, and the second-class pixel points are pixel points whose pixel values are less than or equal to the pixel mean value.
4. The scene monitoring method of claim 1, wherein obtaining an operational color feature image of an operational scene image comprises:
acquiring an HSV space image of the operation scene image;
for each pixel point in the HSV space image, determining an operation color of the pixel point according to the H, S and V components of the pixel point, and determining a color feature value of the pixel point according to the operation color;
and constructing the operation color feature image according to the color feature values of all pixel points in the HSV space image.
5. The scene monitoring method according to claim 1, wherein the obtaining texture feature differences of the operation texture feature image and the reference texture feature image comprises:
for each operation pixel point in the operation texture feature image, determining a pixel point texture feature difference of the operation pixel point according to a difference value of a texture feature value of the operation pixel point and a texture feature value of a target reference pixel point, wherein the target reference pixel point is the pixel point with the same position as the operation pixel point in the reference texture feature image;
the texture feature differences comprise pixel point texture feature differences corresponding to all operation pixel points in the operation texture feature image.
6. The scene monitoring method according to claim 5, wherein said obtaining a color feature difference between said operational color feature image and said reference color feature image comprises:
for each operation pixel point in the operation color feature image, determining a pixel point color feature difference of the operation pixel point according to a difference value of a color feature value of the operation pixel point and a color feature value of a target reference pixel point, wherein the target reference pixel point is the pixel point with the same position as the operation pixel point in the reference color feature image;
the color characteristic differences comprise pixel point color characteristic differences corresponding to all operation pixel points in the operation color characteristic image.
7. The scene monitoring method according to claim 6, wherein the obtaining the difference features of the operation scene image and the reference scene image from the texture feature differences and the color feature differences comprises:
for each operation pixel point in the operation color feature image, determining a pixel point difference feature of the operation pixel point according to the sum value of the pixel point texture feature difference and the pixel point color feature difference of the operation pixel point;
the difference features comprise pixel point difference features corresponding to all operation pixel points in the operation color feature image.
8. The scene monitoring method according to claim 1, wherein the determining whether the operation scene is changed according to the current difference feature and the history difference feature comprises:
determining current difference pixel points of the current scene image and the reference scene image according to the current difference characteristics, and determining historical difference pixel points of the historical scene image and the reference scene image according to the historical difference characteristics;
determining a difference pixel area according to the current difference pixel point and the historical difference pixel point;
and determining whether the operation scene changes according to the difference pixel area.
9. A scene monitoring device, comprising:
the image acquisition module is used for acquiring a current scene image, a historical scene image and a reference scene image of the operation scene;
the difference feature acquisition module is used for respectively acquiring current difference features of the current scene image and the reference scene image and historical difference features of the historical scene image and the reference scene image according to a preset difference algorithm;
the change determining module is used for determining whether the operation scene changes according to the current difference characteristic and the historical difference characteristic;
the preset difference algorithm comprises the following steps:
acquiring an operation texture feature image and an operation color feature image of an operation scene image, wherein the operation scene image is the current scene image or the historical scene image;
acquiring a reference texture feature image and a reference color feature image of the reference scene image;
obtaining texture feature differences of the operation texture feature image and the reference texture feature image, and obtaining color feature differences of the operation color feature image and the reference color feature image;
and obtaining the difference characteristics of the operation scene image and the reference scene image according to the texture characteristic difference and the color characteristic difference.
10. An electronic device, comprising:
at least one processor; and a memory communicatively coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the scene monitoring method of any one of claims 1 to 8.
11. A computer readable storage medium storing a computer program, wherein the computer program is executed by a processor to implement the scene monitoring method of any one of claims 1 to 8.
CN202311606006.9A 2023-11-28 2023-11-28 Scene monitoring method and device, electronic equipment and computer readable storage medium Pending CN117541995A (en)
