WO2022111275A1 - Data processing method, apparatus and electronic device - Google Patents

Data processing method, apparatus and electronic device

Info

Publication number
WO2022111275A1
Authority
WO
WIPO (PCT)
Prior art keywords
area
target
vertex
monitoring
coordinate information
Prior art date
Application number
PCT/CN2021/129536
Other languages
English (en)
French (fr)
Inventor
熊辉
沈阳
俞少文
Original Assignee
杭州海康威视数字技术股份有限公司 (Hangzhou Hikvision Digital Technology Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 杭州海康威视数字技术股份有限公司
Priority to EP21896776.8A (EP4254314A4)
Publication of WO2022111275A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00: Geometric image transformations in the plane of the image
    • G06T 3/04: Context-preserving transformations, e.g. by using an importance map
    • G06T 3/047: Fisheye or wide-angle transformations
    • G06T 3/60: Rotation of whole images or parts thereof
    • G06T 7/00: Image analysis
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/25: Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30196: Human being; Person
    • G06T 2207/30232: Surveillance

Definitions

  • the present application relates to the field of monitoring technology, and in particular, to a data processing method, device and electronic device.
  • A heat map is produced by converting the values of the elements in the heat map matrix of a single surveillance camera (for example, a fisheye camera, which is used as the example below) into pixel values through a heat map synthesis library and overlaying them on the flat base map. The result is a flat base map covered with many colors, where the depth of each color indicates the magnitude of the heat map data value.
  • Heat map data refers to the personnel dwell-time matrix data (for example, a 260*260 matrix) or the personnel count matrix data (for example, a 260*260 matrix) collected by the fisheye camera.
  • Each element of the dwell-time matrix is the total time (in seconds) that all personnel spent at the corresponding physical plane point within a statistical time unit (such as one hour); each element of the personnel count matrix is the number of people counted at that physical plane point within the statistical time unit.
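  • As an illustration of the rendering step described above, the following is a minimal sketch that converts a heat map matrix into colored pixel values and overlays it on a base map image. It uses numpy and matplotlib as stand-ins for the heat map synthesis library, which is not named in this application; all function and variable names here are illustrative rather than part of the described method.

```python
import numpy as np
import matplotlib.pyplot as plt

def render_heatmap_on_basemap(base_map, heat, alpha=0.5):
    """Overlay a camera's heat map matrix (e.g. a 260*260 dwell-time or
    person-count matrix) on a flat base map image. Deeper colors mark
    larger heat values."""
    h, w = base_map.shape[:2]
    fig, ax = plt.subplots()
    ax.imshow(base_map)
    # Stretch the heat matrix over the base map and blend it on top.
    ax.imshow(heat, cmap="jet", alpha=alpha, extent=(0, w, h, 0))
    ax.set_axis_off()
    return fig

if __name__ == "__main__":
    base = np.full((1080, 1920, 3), 255, dtype=np.uint8)  # blank stand-in base map
    heat = np.random.rand(260, 260)                       # stand-in heat map data
    render_heatmap_on_basemap(base, heat).savefig("heatmap_overlay.png")
```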
  • As application scenarios expand, heat map data analysis for global deployments of multiple fisheye cameras has gradually become an active research direction.
  • For example, with the multi-fisheye-camera deployment pattern shown in FIG. 1B, when this deployment is applied in a dense cluster of shops, the heat map data collected by a single fisheye camera may include data from multiple shops, and a single shop may also be covered by the monitoring areas of multiple fisheye cameras. In this situation, analyzing heat map data for an individual shop becomes a difficult problem.
  • In view of this, the present application provides a data processing method, apparatus, and electronic device, so as to determine the heat map data of a custom area.
  • According to a first aspect, a data processing method is provided, comprising: determining at least one target overlapping area, wherein each of the at least one target overlapping area is an overlapping area between a custom area and one of at least one target monitoring area, and each of the at least one target monitoring area is a monitoring area that overlaps with the custom area; and determining the heat map data of the custom area according to the heat map data corresponding to the at least one target monitoring area and the at least one target overlapping area.
  • According to a second aspect, a data processing apparatus is provided, comprising: a first determining unit configured to determine at least one target overlapping area, wherein each of the at least one target overlapping area is an overlapping area between a custom area and one of at least one target monitoring area, and each of the at least one target monitoring area is a monitoring area that overlaps with the custom area; and a second determining unit configured to determine the heat map data of the custom area according to the heat map data corresponding to the at least one target monitoring area and the at least one target overlapping area.
  • According to a third aspect, an electronic device is provided, including a processor and a machine-readable storage medium, where the machine-readable storage medium stores machine-executable instructions executable by the processor, and the processor is configured to execute the machine-executable instructions to implement the data processing method of the first aspect.
  • According to a fourth aspect, a computer-readable storage medium is provided, in which a computer program is stored; when the computer program is executed by a processor, the data processing method of the first aspect is implemented.
  • According to a fifth aspect, a computer program is provided, which includes machine-executable instructions; when a processor executes the machine-executable instructions, the processor is caused to execute the data processing method of the first aspect.
  • In the data processing method of the embodiments of the present application, at least one target overlapping area is determined, and the heat map data of the custom area is determined according to the heat map data corresponding to the at least one target monitoring area and the at least one target overlapping area. This makes it possible to determine the heat map data of a single custom area in a multi-camera deployment scenario, and provides data support for heat map data analysis of the custom area.
  • FIG. 1A is a schematic diagram of a typical deployment style of a single fisheye camera;
  • FIG. 1B is a schematic diagram of a typical deployment style of multiple fisheye cameras;
  • FIG. 2 is a schematic flowchart of a data processing method provided by an embodiment of the present application;
  • FIG. 3 is a schematic diagram of an application scenario provided by an embodiment of the present application;
  • FIG. 4 is a schematic flowchart of determining a target monitoring area that overlaps with a custom area, provided by an embodiment of the present application;
  • FIG. 5 is a schematic flowchart of determining the second coordinate information of a target overlapping area in the image coordinate system of a target surveillance camera, provided by an embodiment of the present application;
  • FIG. 6A is a schematic diagram of a plane base map in a specific application scenario provided by an embodiment of the present application;
  • FIG. 6B is a schematic diagram of a monitoring area rotation provided by an embodiment of the present application;
  • FIG. 7 is a schematic structural diagram of a data processing apparatus provided by an embodiment of the present application;
  • FIG. 8 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application.
  • Flat base map: in a single surveillance camera heat map, the flat base map is generally a picture captured by the camera; in the global heat map, the flat base map needs to be configured separately in the platform system (such as the HCP (HikCentral Professional) system).
  • Global heat map: the values of the elements in the heat map matrices of multiple surveillance cameras are converted into pixel values through the heat map synthesis library and attached to the flat base map at the coordinates of the rectangular area where each surveillance camera is deployed. The result is a flat base map covered with many colors, where the depth of each color indicates the magnitude of the heat map data value.
  • Custom area: an area drawn on the flat base map of the global heat map. For example, when fisheye cameras are deployed in a large shopping mall to collect heat map data, the area covered by some cameras may include multiple stores, while each store cares only about the heat map data of its own area. The area corresponding to a store can then be drawn on the mall's flat base map, and that drawn area is the custom area.
  • the custom area may include, but is not limited to, a rectangular area, a circular area or a parallelogram area.
  • FIG. 2 is a schematic flowchart of a data processing method provided by an embodiment of the present application. As shown in FIG. 2, the data processing method may include the following steps S200 to S210.
  • Step S200: determine at least one target overlapping area, where each of the at least one target overlapping area is an overlapping area between a custom area and one of at least one target monitoring area, and each of the at least one target monitoring area is a monitoring area that overlaps with the custom area.
  • For ease of understanding and description, the custom area is described herein using a rectangular area as an example; the processing for custom areas of other regular shapes can be derived in the same way.
  • In the embodiments of the present application, in order to determine the heat map data of a single custom area in a multi-camera deployment scenario, the monitoring areas that overlap with the custom area (referred to herein as target monitoring areas) may be determined according to the distribution of the monitoring areas and the custom area, and the overlapping areas between the custom area and those monitoring areas (referred to herein as target overlapping areas) may be determined.
  • Step S210: determine the heat map data of the custom area according to the heat map data corresponding to the at least one target monitoring area and the at least one target overlapping area.
  • Having determined the at least one target overlapping area, the heat map data of the at least one target overlapping area may be determined according to the heat map data corresponding to the at least one target monitoring area and the at least one target overlapping area, and the heat map data of the custom area may then be determined from the heat map data of the at least one target overlapping area.
  • For example, the heat map data corresponding to a target monitoring area may include, but is not limited to, information such as the width and height of the heat map corresponding to that target monitoring area and/or the heat map data values in that heat map.
  • For example, the heat map data of a target overlapping area can be determined according to the distribution of that target overlapping area within its corresponding target monitoring area and the corresponding heat map data, and the heat map data of the custom area can then be determined.
  • In a possible embodiment, in step S200, determining the at least one target overlapping area may include: according to the first coordinate information of each monitoring camera's monitoring area in the plane base map coordinate system and the first coordinate information of the custom area, determining at least one monitoring area that overlaps with the custom area as the at least one target monitoring area; and according to the first coordinate information of the custom area and the first coordinate information of the at least one target monitoring area, determining the first coordinate information of the at least one target overlapping area in the plane base map coordinate system, where the first coordinate information of the at least one target overlapping area is used to characterize the at least one target overlapping area.
  • The plane base map coordinate system is a plane coordinate system established in the plane of the base map, with the horizontal direction as the x-axis and the vertical direction as the y-axis; the origin of the coordinate system may be any vertex of the plane base map or its center point.
  • For ease of description, the following takes as an example a plane base map coordinate system whose origin is the upper left vertex of the base map, whose positive x-axis points horizontally to the right, and whose positive y-axis points vertically downward.
  • The coordinate information of each surveillance camera's monitoring area in the plane base map coordinate system (referred to herein as first coordinate information) and the coordinate information of the custom area in the plane base map coordinate system (also referred to herein as first coordinate information) may each be configured in advance.
  • For example, in the scenario shown in FIG. 3, the first coordinate information of the monitoring areas of surveillance cameras 1 to 4 in the plane base map coordinate system and the first coordinate information of custom areas 1 to 5 in the plane base map coordinate system can be configured respectively.
  • The first coordinate information of a monitoring area and/or the custom area may include one or more of: the coordinates of two diagonal vertices in the plane base map coordinate system; the coordinates of any vertex or of the center point in the plane base map coordinate system together with the width and height; or the coordinates of any vertex together with the coordinates of the center point in the plane base map coordinate system.
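  • For illustration only, the sketch below normalizes the alternative representations listed above into the upper-left / lower-right vertex form that the rest of this description works with. The Rect class and its constructors are assumptions introduced here for the later sketches, not names from the application.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """An axis-aligned area in the plane base map coordinate system, stored as
    its upper-left (start) and lower-right (end) vertices."""
    start_x: float
    start_y: float
    end_x: float
    end_y: float

    @classmethod
    def from_diagonal(cls, p1, p2):
        # Two diagonally opposite vertices, given in any order.
        return cls(min(p1[0], p2[0]), min(p1[1], p2[1]),
                   max(p1[0], p2[0]), max(p1[1], p2[1]))

    @classmethod
    def from_top_left_and_size(cls, top_left, width, height):
        # A reference vertex (here the upper-left one) plus width and height.
        x, y = top_left
        return cls(x, y, x + width, y + height)

    @classmethod
    def from_center_and_size(cls, center, width, height):
        # Center point plus width and height.
        cx, cy = center
        return cls(cx - width / 2, cy - height / 2,
                   cx + width / 2, cy + height / 2)
```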
  • Based on the first coordinate information of each surveillance camera's monitoring area and the first coordinate information of the custom area, the monitoring areas that overlap with the custom area (referred to herein as target monitoring areas) may be determined.
  • It should be noted that when there are multiple custom areas, for example custom areas 1 to 5 in the scenario shown in FIG. 3, the heat map data of each custom area can be determined in the manner described in steps S200 to S210.
  • When at least one target monitoring area that overlaps with the custom area has been determined in the manner described in step S200, the first coordinate information of the at least one target overlapping area in the plane base map coordinate system can be determined according to the first coordinate information of the custom area and the first coordinate information of the at least one target monitoring area.
  • In a possible embodiment, in step S210, determining the heat map data of the custom area according to the heat map data corresponding to the at least one target monitoring area and the at least one target overlapping area may include: for each of the at least one target monitoring area and the target overlapping area between that target monitoring area and the custom area, determining the second coordinate information of that target overlapping area in the image coordinate system of the target surveillance camera according to the first coordinate information of that target overlapping area and the first coordinate information of that target monitoring area, where the target surveillance camera is the surveillance camera to which that target monitoring area belongs; and determining the heat map data of the custom area according to the heat map data corresponding to the at least one target monitoring area and the second coordinate information of the at least one target overlapping area.
  • Since the heat map of a surveillance camera is referenced to the camera's image coordinate system, in order to determine the heat map data of the custom area, the coordinates of the target overlapping area in the camera's image coordinate system need to be determined.
  • The image coordinate system of the surveillance camera may be a plane coordinate system with the upper left vertex of the camera image as the origin, the positive x-axis pointing horizontally to the right, and the positive y-axis pointing vertically downward.
  • When the first coordinate information of the target overlapping area has been determined, the coordinate information of the target overlapping area in the image coordinate system of the surveillance camera to which the target monitoring area belongs (referred to herein as the target surveillance camera) can be determined according to the first coordinate information of the target overlapping area and the first coordinate information of the target monitoring area; this coordinate information is referred to herein as the second coordinate information.
  • When the second coordinate information of the target overlapping area has been determined, the portion of the heat map corresponding to the target monitoring area that matches the second coordinate information of the target overlapping area can be taken, based on the heat map data corresponding to the target monitoring area and that second coordinate information, as the heat map data of the target overlapping area; the heat map data of the custom area is then determined according to the heat map data of the at least one target overlapping area.
  • It should be noted that once the heat map data of the custom area has been determined, analysis such as maximum heat value analysis and average heat value analysis can be performed on it; the specific implementation is not limited in the embodiments of the present application.
  • In a possible embodiment, as shown in FIG. 4, determining at least one monitoring area that overlaps with the custom area as the at least one target monitoring area, according to the first coordinate information of each camera's monitoring area and of the custom area in the plane base map coordinate system, may be implemented through the following steps S400 to S410.
  • Step S400: respectively determine the first coordinate information of the first vertex and the second vertex of each monitoring camera's monitoring area in the plane base map coordinate system, and the first coordinate information of the first vertex and the second vertex of the custom area.
  • Step S410: according to the first coordinate information of the first vertex and the second vertex of each monitoring camera's monitoring area, and the first coordinate information of the first vertex and the second vertex of the custom area, determine at least one monitoring area that overlaps with the custom area as the at least one target monitoring area.
  • For each monitoring area, whether it overlaps with the custom area can be determined from the distribution, in the plane base map coordinate system, of the monitoring area's vertex closest to the origin (referred to herein as its first vertex) and its vertex farthest from the origin (referred to herein as its second vertex), together with the custom area's vertex closest to the origin (its first vertex) and vertex farthest from the origin (its second vertex).
  • When the origin of the plane base map coordinate system is the upper left vertex of the base map, the first vertex of an area is its upper left vertex and the second vertex is its lower right vertex.
  • Accordingly, the target monitoring areas that overlap with the custom area can be determined based on the first coordinate information of the first and second vertices of each monitoring area and the first coordinate information of the first and second vertices of the custom area.
  • In one example, in step S410, determining at least one monitoring area that overlaps with the custom area as the at least one target monitoring area may include: for any monitoring area, when the abscissa of the target second vertex is greater than or equal to the abscissa of the target first vertex, and the ordinate of the target second vertex is greater than or equal to the ordinate of the target first vertex, determining that the monitoring area is a target monitoring area that overlaps with the custom area.
  • For any monitoring area, the first vertex that is farther from the origin of the plane base map coordinate system, among the monitoring area's first vertex and the custom area's first vertex, may be determined (referred to herein as the target first vertex); and the second vertex that is closer to the origin, among the monitoring area's second vertex and the custom area's second vertex, may be determined (referred to herein as the target second vertex). The first coordinate information of the target first vertex is then compared with the first coordinate information of the target second vertex.
  • If the abscissa of the target second vertex is greater than or equal to the abscissa of the target first vertex, and the ordinate of the target second vertex is greater than or equal to the ordinate of the target first vertex, it is determined that the monitoring area and the custom area overlap.
  • It should be noted that, in the embodiments of the present application, the case where the monitoring area and the custom area overlap includes the case where only their boundaries intersect.
  • It should be understood that the way of determining whether a monitoring area and the custom area overlap described above is only one specific example and does not limit the protection scope of the present application; other ways may also be used. For example, for any monitoring area, the intersection points between the lines on which the boundaries of the monitoring area lie and the lines on which the boundaries of the custom area lie can be determined, and it can be checked whether any intersection point lies both on a boundary of the monitoring area and on a boundary of the custom area; if so, the two areas overlap. If not, it can further be checked whether all four vertices of the monitoring area lie inside the custom area, or all four vertices of the custom area lie inside the monitoring area; if so, the two areas overlap; otherwise, they do not overlap.
  • For example, determining the first coordinate information of the at least one target overlapping area in the plane base map coordinate system, according to the first coordinate information of the custom area and of the at least one target monitoring area, may include: for each target overlapping area, taking the coordinates of the target first vertex and the target second vertex in the plane base map coordinate system as the first coordinate information of that target overlapping area, the first coordinate information being used to characterize the target overlapping area.
  • Here, the target first vertex and the target second vertex are two diagonally opposite vertices of the target overlapping area, so their coordinates in the plane base map coordinate system can represent that overlapping area.
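  • A minimal sketch of the overlap test and of taking the target first and second vertices as the overlap's first coordinate information, assuming the illustrative Rect helper sketched earlier (upper-left and lower-right vertices, origin at the upper left of the base map):

```python
def find_overlap(monitor: Rect, custom: Rect):
    """Return the target overlapping area between a monitoring area and the
    custom area as a Rect in the plane base map coordinate system, or None if
    the two areas do not overlap (boundary contact counts as overlap)."""
    # Target first vertex: whichever upper-left vertex is farther from the origin.
    max_start_x = max(monitor.start_x, custom.start_x)
    max_start_y = max(monitor.start_y, custom.start_y)
    # Target second vertex: whichever lower-right vertex is closer to the origin.
    min_end_x = min(monitor.end_x, custom.end_x)
    min_end_y = min(monitor.end_y, custom.end_y)
    # Overlap exists only if the target second vertex is not above or to the
    # left of the target first vertex on either axis.
    if min_end_x >= max_start_x and min_end_y >= max_start_y:
        return Rect(max_start_x, max_start_y, min_end_x, min_end_y)
    return None

# The target monitoring areas are exactly the monitoring areas for which
# find_overlap() returns a non-None result for the given custom area.
```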
  • In a possible embodiment, as shown in FIG. 5, for each of the at least one target monitoring area and the target overlapping area between that target monitoring area and the custom area, determining the second coordinate information of the target overlapping area in the image coordinate system of the target surveillance camera, according to the first coordinate information of the target overlapping area and the first coordinate information of the target monitoring area, may be implemented through the following steps S500 to S510.
  • Step S500: according to the first coordinate information of the target overlapping area and the first coordinate information of the target monitoring area, determine the third coordinate information of the target overlapping area in the target monitoring area coordinate system.
  • Since the heat map corresponding to a surveillance camera's monitoring area corresponds to that camera's image coordinate system, the origin of the image coordinate system (the upper left vertex) corresponds to element [0, 0] of the heat map matrix, and the lower right vertex of the camera image corresponds to the last element of the matrix, element [259, 259] for a 260*260 matrix. Therefore, to determine the heat map data corresponding to the target overlapping area, the coordinate information of the target overlapping area in the camera's image coordinate system needs to be determined.
  • Accordingly, when the first coordinate information of the target overlapping area has been determined, the coordinate information of the target overlapping area in the target monitoring area coordinate system (referred to herein as the third coordinate information) can be determined from the first coordinate information of the target overlapping area and the first coordinate information of the target monitoring area.
  • The target monitoring area coordinate system is a coordinate system whose origin is the target vertex of the target monitoring area, whose horizontal and vertical axes are respectively parallel to and in the same direction as those of the plane base map coordinate system, and whose target vertex is the vertex of the target monitoring area closest to the origin of the plane base map coordinate system.
  • When the origin of the plane base map coordinate system is the upper left vertex of the base map, the origin of the target monitoring area coordinate system is the upper left vertex of the target monitoring area, with the positive x-axis pointing horizontally to the right and the positive y-axis pointing vertically downward.
  • Because the axes of the plane base map coordinate system and the target monitoring area coordinate system point in the same directions, the third coordinate information can be obtained from the first coordinate information by translation. For example, if the origin of the target monitoring area coordinate system is shifted horizontally to the right by Δx and vertically downward by Δy relative to the origin of the plane base map coordinate system, then for the same position the abscissa in the third coordinate information equals the abscissa in the first coordinate information minus Δx, and the ordinate in the third coordinate information equals the ordinate in the first coordinate information minus Δy.
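  • A small sketch of this translation step, under the same assumptions as the earlier Rect helper; Δx and Δy are simply the base-map coordinates of the monitoring area's upper-left vertex:

```python
def to_area_coords(overlap: Rect, monitor: Rect) -> Rect:
    """Translate an overlap's first coordinate information (base map frame)
    into its third coordinate information (target monitoring area frame)."""
    dx, dy = monitor.start_x, monitor.start_y   # the origin shift Δx, Δy
    return Rect(overlap.start_x - dx, overlap.start_y - dy,
                overlap.end_x - dx, overlap.end_y - dy)
```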
  • Step S510: determine the second coordinate information of the target overlapping area according to the third coordinate information of the target overlapping area and the rotation angle of the target surveillance camera.
  • Considering that a surveillance camera may be rotated when actually deployed, the x and y axes of the camera's image coordinate system may not be aligned with those of the plane base map coordinate system (that is, the positive x-axis of the image coordinate system may differ from the positive x-axis of the plane base map coordinate system, and/or the positive y-axis of the image coordinate system may differ from the positive y-axis of the plane base map coordinate system).
  • As an example, the rotation angle is expressed as a clockwise rotation (a counterclockwise rotation of A° is equivalent to a clockwise rotation of 360° - A°), and the rotation angle is an integer multiple of 90°.
  • When the third coordinate information of the target overlapping area has been determined, the second coordinate information of the target overlapping area in the camera's image coordinate system can be determined according to the third coordinate information and the rotation angle of the target surveillance camera.
  • For example, according to the rotation angle of the target surveillance camera, the third coordinate information of the target overlapping area can be mapped to the second coordinate information through an axis rotation formula.
  • In a possible embodiment, determining the heat map data of the custom area according to the heat map data corresponding to the at least one target monitoring area and the at least one target overlapping area includes: determining the heat map data corresponding to the at least one target overlapping area according to the width-height ratio between the at least one target monitoring area and its heat map, and the at least one target overlapping area; and determining the heat map data of the custom area according to the heat map data corresponding to the at least one target overlapping area.
  • For example, when the target overlapping area between the custom area and a target monitoring area has been determined, the heat map data corresponding to that target overlapping area may be determined according to the width-height ratio between the target monitoring area and the heat map, and the target overlapping area.
  • When the second coordinate information of the target overlapping area has been determined, the heat map data corresponding to the target overlapping area may be determined according to the width-height ratio between the target monitoring area and the heat map, and the second coordinate information of the target overlapping area, for example by determining the number of heat map pixels corresponding to one coordinate unit within the target monitoring area.
  • For example, if the heat map is a 260*260 image and the width and height of the target monitoring area are w and h respectively, then the ratio of the width of the target monitoring area to the width of the heat map is w:260, so one abscissa unit corresponds to 260/w pixels; the ratio of the height of the target monitoring area to the height of the heat map is h:260, so one ordinate unit corresponds to 260/h pixels.
  • the heat map data corresponding to the target overlapping area may be determined according to the second coordinate information of the target overlapping area and the number of pixels corresponding to the coordinate unit in the target monitoring area.
  • Continuing the example above, the heat map matrix can be divided horizontally and vertically into blocks of 260/w pixels and 260/h pixels per unit length, and the block of the heat map matrix corresponding to the target overlapping area can be determined from the second coordinate information of the target overlapping area, thereby determining the heat map data corresponding to the target overlapping area.
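  • The sketch below extracts the heat map block corresponding to the target overlapping area from the heat matrix, following the 260/w and 260/h unit lengths described above. Rounding to whole matrix indices, and measuring the area width and height along the image axes, are implementation choices assumed here rather than details given in this description:

```python
import numpy as np

def extract_overlap_block(heat: np.ndarray, overlap_img: Rect,
                          area_w: float, area_h: float) -> np.ndarray:
    """Return the sub-matrix of the heat map covering the target overlapping
    area, given its second coordinate information (image coordinate frame).

    area_w, area_h: extent of the monitoring area along the image x and y
    axes (for 90 or 270 degree rotations these are the base-map height and
    width, respectively)."""
    rows, cols = heat.shape                  # e.g. 260 x 260
    px_per_x = cols / area_w                 # one abscissa unit -> cols/w pixels
    px_per_y = rows / area_h                 # one ordinate unit -> rows/h pixels
    c0 = int(round(overlap_img.start_x * px_per_x))
    c1 = int(round(overlap_img.end_x * px_per_x))
    r0 = int(round(overlap_img.start_y * px_per_y))
    r1 = int(round(overlap_img.end_y * px_per_y))
    return heat[r0:r1, c0:c1]
```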
  • When there are multiple target monitoring areas, the heat map data of the custom area may be determined from the heat map data corresponding to each of the target overlapping areas.
  • FIG. 6A is a schematic diagram of a plane base map in a specific application scenario provided by an embodiment of the present application.
  • As shown in FIG. 6A, four fisheye cameras are deployed in the application scenario (their monitoring areas correspond to monitoring areas 1 to 4 on the plane base map), and the monitoring areas of the four fisheye cameras together cover five stores (the corresponding custom areas are store areas 1 to 5 on the plane base map).
  • Based on the application scenario shown in FIG. 6A, the key coordinate information in the plane base map coordinate system (with the upper left corner of the base map as the origin, the positive x-axis extending to the right, and the positive y-axis extending downward) is first configured in the system, including the coordinate information of each fisheye camera's monitoring area, the rotation angle of the camera area (that is, the rotation angle of the camera's image coordinate system relative to the plane base map coordinate system), and the coordinate information of the custom areas (that is, the store areas); this is the first coordinate information described above.
  • As an example, the coordinate information of the fisheye camera monitoring areas and of the custom areas is represented by the coordinates of the upper left vertex and the lower right vertex.
  • the direction of the actually deployed fisheye camera area may deviate from the direction of the planar basemap, and the rotation angle of the fisheye camera area relative to the planar basemap needs to be adjusted at the software level.
  • the rotation angle may be an integer multiple of 90°.
  • The case where a monitoring area and the custom area overlap includes the case where their boundaries intersect.
  • Assuming that the upper left vertex of a rectangular area (a monitoring area or a custom area) has coordinates (start_x, start_y) and its lower right vertex has coordinates (end_x, end_y), the necessary condition for the monitoring area and the custom area to overlap is that the smaller of the two end_x values is greater than or equal to the larger of the two start_x values, and the smaller of the two end_y values is greater than or equal to the larger of the two start_y values.
  • Taking the determination of whether monitoring area 4 and store area 4 overlap as an example, suppose the upper left and lower right vertices of monitoring area 4 are f_point1(start_x1, start_y1) and f_point2(end_x1, end_y1), and those of store area 4 are s_point1(start_x2, start_y2) and s_point2(end_x2, end_y2). Let max_start_x = max(start_x1, start_x2), max_start_y = max(start_y1, start_y2), min_end_x = min(end_x1, end_x2), and min_end_y = min(end_y1, end_y2). If min_end_x ≥ max_start_x and min_end_y ≥ max_start_y, it is determined that monitoring area 4 and store area 4 overlap.
  • Still taking the overlap between monitoring area 4 and store area 4 as an example, the first coordinates of the upper left vertex and the lower right vertex of the target overlapping area are o_point1(max_start_x, max_start_y) and o_point2(min_end_x, min_end_y), respectively.
  • Since the origin of the monitoring area coordinate system corresponding to monitoring area 4 is the upper left vertex of monitoring area 4, whose coordinates are f_point1(start_x1, start_y1), and its axes point in the same directions as those of the plane base map coordinate system, the third coordinate information of the target overlapping area in this monitoring area coordinate system can be obtained by translation: the third coordinates of the upper left and lower right vertices of the target overlapping area are oo_point1(max_start_x - start_x1, max_start_y - start_y1) and oo_point2(min_end_x - start_x1, min_end_y - start_y1), respectively. An illustrative numeric sketch of these two steps is shown below.
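  • Using the illustrative helpers sketched earlier, the two steps above look as follows; the concrete coordinates are hypothetical values chosen only so that the resulting third coordinates match the (3, 2) and (4, 3) used in the next example:

```python
# Hypothetical base-map coordinates, not taken from the application.
monitor_4 = Rect.from_diagonal((6, 5), (10, 9))   # f_point1, f_point2
store_4 = Rect.from_diagonal((9, 7), (12, 8))     # s_point1, s_point2

overlap = find_overlap(monitor_4, store_4)        # o_point1/o_point2: Rect(9, 7, 10, 8)
third = to_area_coords(overlap, monitor_4)        # oo_point1/oo_point2: Rect(3, 2, 4, 3)
```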
  • Considering that the actual deployed orientation of the fisheye camera's monitoring area may differ from the orientation assumed above, the target monitoring area coordinate system may be rotated by the rotation angle to obtain the image coordinate system of the target surveillance camera.
  • When the fisheye camera's monitoring area is not rotated, element (0, 0) of the heat map matrix lies at the upper left vertex of the monitoring area, the other elements are laid out in order along the coordinate axes, and element (259, 259) lies at the lower right vertex of the monitoring area.
  • If the monitoring area is rotated by 90° (clockwise), element (0, 0) actually lies at the upper right vertex of the monitoring area after rotation, and element (259, 259) lies at the lower left vertex.
  • Rotating the fisheye camera's monitoring area is equivalent to rotating the monitoring area coordinate system by the same angle (rotating about the origin of the monitoring area coordinate system and then translating the origin); that is, a coordinate point in the monitoring area coordinate system is mapped through the axis rotation formula, and the mapped coordinates are then translated according to the shift of the origin to obtain the coordinates in the rotated coordinate system.
  • the specific principle and implementation are as follows:
  • Assuming the coordinate system is rotated by an angle θ, a point with coordinates (x, y) in the original coordinate system xOy has coordinates (x′, y′) in the new coordinate system x′Oy′ given by the axis rotation formula: x′ = x*cosθ + y*sinθ and y′ = y*cosθ - x*sinθ.
  • After coordinate conversion through the axis rotation formula, the converted coordinates can be translated according to the translation relationship between the origin of the rotated coordinate system and the origin before rotation, yielding the second coordinate information of the target overlapping area.
  • For example, as shown in FIG. 6B, suppose that in the overlapping area between monitoring area 4 and store area 4, the third coordinates of the vertex closest to the origin of monitoring area 4's coordinate system (the upper left vertex) and of the farthest vertex (the lower right vertex) are (3, 2) and (4, 3) respectively, and the rotation angle of monitoring area 4 is 90°; then after the axis rotation mapping, the obtained coordinates are (2, -3) and (3, -4) respectively.
  • Because the origin after rotation is the upper right vertex of monitoring area 4 while the origin before rotation is its upper left vertex, the origin is shifted horizontally to the right by the width of the monitoring area (4 in this example), and the horizontal direction of the rotated coordinate system is the y-axis. Therefore, starting from the coordinates obtained by the axis rotation mapping, 4 must be added to the y-coordinate values, giving final coordinates of (2, 1) and (3, 0).
  • In addition, because the coordinate system of monitoring area 4 has been rotated by 90°, the vertex of the target overlapping area closest to the origin changes from its upper left vertex to its upper right vertex, and the farthest vertex changes from its lower right vertex to its lower left vertex.
  • Accordingly, the second coordinates of the target overlapping area are (2, 0) (the coordinates of the upper right vertex) and (3, 1) (the coordinates of the lower left vertex), respectively.
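  • A sketch of this mapping for clockwise rotation angles that are integer multiples of 90°, combining the axis rotation formula with the origin translation described above. The function name and the way the area width and height are passed are assumptions; the assertion reproduces the worked example (the area height is not used for a 90° rotation, so an arbitrary value is passed):

```python
def rotate_to_image_coords(third: Rect, angle: int,
                           area_w: float, area_h: float) -> Rect:
    """Map third coordinate information (monitoring area frame) to second
    coordinate information (camera image frame) for a clockwise rotation
    angle that is an integer multiple of 90 degrees.

    Equivalent to applying the axis rotation formula x' = x*cos(t) + y*sin(t),
    y' = y*cos(t) - x*sin(t) about the old origin, then translating so that
    the rotated area's new top-left corner becomes the origin."""
    def map_point(x, y):
        a = angle % 360
        if a == 0:
            return x, y
        if a == 90:    # new origin: the old upper-right corner (area_w, 0)
            return y, area_w - x
        if a == 180:   # new origin: the old lower-right corner (area_w, area_h)
            return area_w - x, area_h - y
        if a == 270:   # new origin: the old lower-left corner (0, area_h)
            return area_h - y, x
        raise ValueError("rotation angle must be a multiple of 90 degrees")

    x1, y1 = map_point(third.start_x, third.start_y)
    x2, y2 = map_point(third.end_x, third.end_y)
    # After rotation the nearest and farthest vertices change, so re-normalize.
    return Rect(min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

# Worked example from the text: third coordinates (3, 2) and (4, 3), a 90 degree
# rotation, and a monitoring area width of 4 give second coordinates (2, 0) and (3, 1).
assert rotate_to_image_coords(Rect(3, 2, 4, 3), 90, 4, 3) == Rect(2, 0, 3, 1)
```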
  • Next, according to the width-height ratio between the fisheye camera's monitoring area and the heat map, the heat map data corresponding to the target overlapping area is taken from the heat map matrix; the obtained heat map data may then be analyzed, for example by determining the maximum heat value, the average heat value, and the like.
  • When multiple monitoring areas overlap with the store area, the heat map data corresponding to each target overlapping area can be determined separately.
  • The heat map data corresponding to each target overlapping area can be analyzed separately, or the heat map data corresponding to the store area can be determined from the heat map data of all the target overlapping areas and then analyzed, for example by determining the maximum heat value, the average heat value, and the like. A minimal end-to-end sketch combining the helper functions above is shown below.
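  • Putting the illustrative pieces together, the sketch below computes statistics such as the maximum and average heat value of a custom area from all overlapping blocks. It assumes each camera record carries its monitoring area rectangle, rotation angle, and 260*260 heat matrix; the aggregation choices (concatenating blocks, then taking max and mean) are one possible reading of the analysis described above:

```python
import numpy as np

def custom_area_heat_stats(custom: Rect, cameras) -> dict:
    """cameras: iterable of (monitor_rect, rotation_deg, heat_matrix) tuples.
    Returns the maximum and average heat value over all blocks that overlap
    the custom area."""
    blocks = []
    for monitor, rotation, heat in cameras:
        overlap = find_overlap(monitor, custom)        # first coordinates
        if overlap is None:
            continue                                   # not a target monitoring area
        third = to_area_coords(overlap, monitor)       # third coordinates
        area_w = monitor.end_x - monitor.start_x
        area_h = monitor.end_y - monitor.start_y
        second = rotate_to_image_coords(third, rotation, area_w, area_h)
        # For 90/270 degree rotations the area extent along the image axes swaps.
        img_w, img_h = (area_h, area_w) if rotation % 180 == 90 else (area_w, area_h)
        block = extract_overlap_block(heat, second, img_w, img_h)
        if block.size:
            blocks.append(block)
    if not blocks:
        return {"max": 0.0, "mean": 0.0}
    values = np.concatenate([b.ravel() for b in blocks])
    return {"max": float(values.max()), "mean": float(values.mean())}
```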
  • FIG. 7 is a schematic structural diagram of a data processing apparatus provided by an embodiment of the present application.
  • the data processing apparatus may include:
  • The first determining unit 710 is configured to determine at least one target overlapping area, where each of the at least one target overlapping area is an overlapping area between the custom area and one of the at least one target monitoring area, and each of the at least one target monitoring area is a monitoring area that overlaps with the custom area.
  • the second determining unit 720 is configured to determine the heat map data of the custom area according to the heat map data corresponding to the at least one target monitoring area and the at least one target overlapping area.
  • In a possible embodiment, the first determining unit 710 determines the at least one target overlapping area by:
  • determining, according to the first coordinate information of each monitoring camera's monitoring area in the plane base map coordinate system and the first coordinate information of the custom area, at least one monitoring area that overlaps with the custom area as the at least one target monitoring area; and
  • determining, according to the first coordinate information of the custom area and the first coordinate information of the at least one target monitoring area, the first coordinate information of the at least one target overlapping area in the plane base map coordinate system, where the first coordinate information of the at least one target overlapping area is used to characterize the at least one target overlapping area.
  • The second determining unit 720 determines the heat map data of the custom area according to the heat map data corresponding to the at least one target monitoring area and the at least one target overlapping area by:
  • for each of the at least one target monitoring area and the target overlapping area between that target monitoring area and the custom area, determining, according to the first coordinate information of that target overlapping area and the first coordinate information of that target monitoring area, the second coordinate information of that target overlapping area in the image coordinate system of the target surveillance camera to which the target monitoring area belongs; and
  • determining the heat map data of the custom area according to the heat map data corresponding to the at least one target monitoring area and the second coordinate information of the at least one target overlapping area.
  • In a possible embodiment, the second determining unit 720 determines the heat map data of the custom area according to the heat map data corresponding to the at least one target monitoring area and the second coordinate information of the at least one target overlapping area by: determining the heat map data corresponding to the at least one target overlapping area according to the width-height ratio between the at least one target monitoring area and the heat map and the second coordinate information of the at least one target overlapping area; and determining the heat map data of the custom area according to the heat map data corresponding to the at least one target overlapping area.
  • In a possible embodiment, the first determining unit 710 determines, according to the first coordinate information of each monitoring camera's monitoring area in the plane base map coordinate system and the first coordinate information of the custom area, at least one monitoring area that overlaps with the custom area as the at least one target monitoring area by:
  • respectively determining the first coordinate information of the first vertex and the second vertex of each monitoring camera's monitoring area in the plane base map coordinate system, and the first coordinate information of the first vertex and the second vertex of the custom area, where, for each monitoring camera's monitoring area, the first vertex and the second vertex of the monitoring area are respectively the vertices of that monitoring area closest to and farthest from the origin of the plane base map coordinate system, and the first vertex and the second vertex of the custom area are respectively the vertices of the custom area closest to and farthest from the origin of the plane base map coordinate system; and
  • determining, according to the first coordinate information of the first vertex and the second vertex of each monitoring camera's monitoring area and the first coordinate information of the first vertex and the second vertex of the custom area, at least one monitoring area that overlaps with the custom area as the at least one target monitoring area.
  • In a possible embodiment, the first determining unit 710 determines, according to the first coordinate information of the first vertex and the second vertex of each monitoring camera's monitoring area and the first coordinate information of the first vertex and the second vertex of the custom area, at least one monitoring area that overlaps with the custom area as the at least one target monitoring area by:
  • for any monitoring area, when the abscissa of the target second vertex is greater than or equal to the abscissa of the target first vertex and the ordinate of the target second vertex is greater than or equal to the ordinate of the target first vertex, determining that monitoring area as a target monitoring area that overlaps with the custom area, where the target first vertex is the first vertex, among the first vertex of the monitoring area and the first vertex of the custom area, that is farther from the origin of the plane base map coordinate system, and the target second vertex is the second vertex, among the second vertex of the monitoring area and the second vertex of the custom area, that is closer to the origin of the plane base map coordinate system.
  • In a possible embodiment, for each of the at least one target monitoring area and the target overlapping area between that target monitoring area and the custom area, the second determining unit 720 determines, according to the first coordinate information of the target overlapping area and the first coordinate information of the target monitoring area, the second coordinate information of the target overlapping area in the image coordinate system of the target surveillance camera to which the target monitoring area belongs by:
  • determining, according to the first coordinate information of the target overlapping area and the first coordinate information of the target monitoring area, the third coordinate information of the target overlapping area in the target monitoring area coordinate system, where the target monitoring area coordinate system takes the target vertex of the target monitoring area as its origin and its horizontal and vertical axes are respectively parallel to and in the same direction as those of the plane base map coordinate system, the target vertex being the vertex of the target monitoring area closest to the origin of the plane base map coordinate system; and
  • determining the second coordinate information of the target overlapping area according to the third coordinate information of the target overlapping area and the rotation angle of the target surveillance camera.
  • In a possible embodiment, the second determining unit 720 determines the heat map data of the custom area according to the heat map data corresponding to the at least one target monitoring area and the at least one target overlapping area by: determining the heat map data corresponding to the at least one target overlapping area according to the width-height ratio between the at least one target monitoring area and the heat map and the at least one target overlapping area; and determining the heat map data of the custom area according to the heat map data corresponding to the at least one target overlapping area.
  • Embodiments of the present application provide an electronic device, including a processor and a memory, where the memory stores machine-executable instructions that can be executed by the processor, and the processor is configured to execute the machine-executable instructions to implement the data processing method described above.
  • FIG. 8 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
  • The electronic device may include a processor 801 and a memory 803 storing machine-executable instructions.
  • Processor 801 and memory 803 may communicate via system bus 802 .
  • By reading and executing the machine-executable instructions in the memory 803 that correspond to the data processing logic, the processor 801 can execute the data processing method described above.
  • the memory 803 referred to herein can be any electronic, magnetic, optical, or other physical storage device that can contain or store information, such as executable instructions, data, and the like.
  • For example, the machine-readable storage medium can be: RAM (Random Access Memory), volatile memory, non-volatile memory, flash memory, a storage drive (such as a hard disk drive), a solid state drive, any type of storage disk (such as a compact disc or DVD), or a similar storage medium, or a combination thereof.
  • In some embodiments, a machine-readable storage medium is also provided, such as the memory 803 in FIG. 8, in which machine-executable instructions are stored; when the machine-executable instructions are executed by a processor, the data processing method described above is implemented.
  • the machine-readable storage medium may be ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
  • Embodiments of the present application also provide a computer program, where the computer program includes machine-executable instructions, and when the processor executes the machine-executable instructions, the processor 801 is caused to execute the data processing method described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Alarm Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present application provides a data processing method, apparatus, and electronic device. The method includes: determining at least one target overlapping area, where each of the at least one target overlapping area is an overlapping area between a custom area and one of at least one target monitoring area, and each of the at least one target monitoring area is a monitoring area that overlaps with the custom area; and determining the heat map data of the custom area according to the heat map corresponding to the at least one target monitoring area and the at least one target overlapping area. This method can provide data support for heat map data analysis of the custom area.

Description

数据处理方法、装置及电子设备
相关申请的交叉引用
本专利申请要求于2020年11月27日提交的、申请号为202011365066.2、发明名称为“数据处理方法、装置及电子设备”的中国专利申请的优先权,该申请以引用的方式并入本文中。
技术领域
本申请涉及监控技术领域,尤其涉及一种数据处理方法、装置及电子设备。
背景技术
热度图是指通过热度图合成库将单个监控相机(例如,鱼眼相机,下文中以鱼眼相机为例)的热度图矩阵中元素的值转换成像素值,并贴附在平面底图上,展示为平面底图上覆盖了很多颜色,颜色的深浅度标识了热度图数据值的大小。
热度图数据是指鱼眼相机统计的人员停留时间矩阵数据(例如,260*260的矩阵)或人员数量矩阵数据(例如,260*260的矩阵)。人员停留时间矩阵数据中每个元素值的含义是在统计时间单元(如一小时)内,在物理平面点位上的所有人员停留的时间和(单位:秒),人员数量矩阵数据中每个元素值的含义是在统计时间单元(如一小时)内,在物理平面点位上统计的人员数量。
传统的部署鱼眼相机统计热度图数据方案中,大部分是针对单个鱼眼相机统计的热度图矩阵数据进行分析,例如,针对小卖部场景的热度图数据应用,以图1A所示的单个鱼眼相机部署样式为例(图1A中的鱼眼相机区域已经被矫正为规则矩形区域,下同),在进行热度图数据分析时,只需要分析鱼眼相机区域(如图1A中的矩形ABCD区域)内的全部热度图矩阵数据。
然而,随着应用场景的扩展,多个鱼眼相机的全局部署热度图数据分析方案逐渐成为热门研究方向,例如,图1B所示的多个鱼眼相机部署样式,当这种部署方式应用于密集型商铺群环境下,可能会出现单个鱼眼相机统计的热度图数据中包括多个商铺的数据,单个商铺也可能会被多个鱼眼相机的监控区域所覆盖,此时,如何针对单个商铺进行热度图数据分析成为一个难题。
发明内容
有鉴于此,本申请提供一种数据处理方法、装置及电子设备,以实现自定义区域的热度图数据的确定。
根据本申请实施例的第一方面,提供一种数据处理方法,包括:确定至少一个目标重叠区域;其中,所述至少一个目标重叠区域中任一为自定义区域与至少一个目标监控区域中之一之间的重叠区域,所述至少一个目标监控区域中之一为与所述自定义区域存在重叠区域的监控区域;依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域,确定所述自定义区域的热度图数据。
根据本申请实施例的第二方面,提供一种数据处理装置,包括:第一确定单元,用于确定至少一个目标重叠区域;其中,所述至少一个目标重叠区域中任一为自定义区域与至少一个目标监控区域中之一之间的重叠区域,所述至少一个目标监控区域中之一为与所述自定义区域存在重叠区域的监控区域;第二确定单元,用于依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域,确定所述自定义区域 的热度图数据。
根据本申请实施例的第三方面,提供一种电子设备,包括处理器和机器可读存储介质,所述机器可读存储介质存储有能够被所述处理器执行的机器可执行指令,所述处理器用于执行机器可执行指令,以实现第一方面的数据处理方法。
根据本申请实施例的第四方面,提供一种计算机可读存储介质,计算机可读存储介质内存储有计算机程序,计算机程序被处理器执行时实现第一方面的数据处理方法。
根据本申请实施例的第五方面,提供一种计算机程序,该计算机程序包括机器可执行指令,并且当处理器执行该机器可执行指令时,促使处理器执行第一方面的数据处理方法。
在本申请实施例的数据处理方法中,确定至少一个目标重叠区域,并依据至少一个目标监控区域对应的热度图,以及所述至少一个目标重叠区域,确定自定义区域的热度图数据,实现了在多监控相机部署场景下,单个自定义区域的热度图数据的确定,为自定义区域的热度图数据分析提供了数据支持。
附图说明
图1A是一种典型的单个鱼眼相机部署样式示意图;
图1B是一种典型的多个鱼眼相机部署样式示意图;
图2是本申请实施例提供的一种数据处理方法的流程示意图;
图3是本申请实施例提供的一种应用场景的示意图;
图4是本申请实施例提供的一种确定与自定义区域存在重叠区域的目标监控区域的流程示意图;
图5是本申请实施例提供的一种确定目标监控相机的图像坐标系下目标重叠区域的第二坐标信息的流程示意图;
图6A是本申请实施例提供的一种具体应用场景下的平面底图的示意图;
图6B是本申请实施例提供的一种监控区域旋转的示意图;
图7是本申请实施例提供的一种数据处理装置的结构示意图;
图8是本申请实施例提供的一种电子设备的硬件结构示意图。
具体实施方式
这里将详细地对示例性实施例进行说明,其示例表示在附图中。下面的描述涉及附图时,除非另有表示,不同附图中的相同数字表示相同或相似的要素。以下示例性实施例中所描述的实施方式并不代表与本申请相一致的所有实施方式。相反,它们仅是与如所附权利要求书中所详述的、本申请的一些方面相一致的装置和方法的例子。
在本申请使用的术语是仅仅出于描述特定实施例的目的,而非旨在限制本申请。在本申请和所附权利要求书中所使用的单数形式的“一种”、“所述”和“该”也旨在包括多数形式,除非上下文清楚地表示其他含义。
为了使本领域技术人员更好地理解本申请实施例提供的技术方案,下面先对本申请实施例中涉及的部分技术术语进行解释说明。
平面底图:在单个监控相机热度图中,平面底图一般是相机抓拍的一张图片;在全 局热度图中,需要在平台系统中(如HCP(HikCentral Professional)系统)单独配置平面底图。
全局热度图:是指通过热度图合成库将多个监控相机的热度图矩阵中元素的值转换成像素值,并按照各监控相机所部署的矩形区域在平面底图上的坐标位置贴附在平面底图上,展示为平面底图上覆盖了很多颜色,颜色的深浅度标识了热度图数据值的大小。
自定义区域:是指在全局热度图的平面底图上绘制的区域。例如,在一个大型商场中部署鱼眼相机统计热度图数据,某些相机覆盖的区域包含了多家店铺,而每个店铺关注的是本店铺区域的热度图数据,那么可以在商场的平面底图中绘制本商铺对应的区域,该区域即为自定义区域。
示例性的,该自定义区域可以包括但不限于矩形区域、圆形区域或平行四边形区域。
为了使本申请实施例的上述目的、特征和优点能够更加明显易懂,下面结合附图对本申请实施例中技术方案作进一步详细的说明。
请参见图2,为本申请实施例提供的一种数据处理方法的流程示意图,如图2所示,该数据处理方法可以包括以下步骤S200至S210。
需要说明的是,本申请实施例中各步骤的序号大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。
步骤S200、确定至少一个目标重叠区域,至少一个目标重叠区域中任一为自定义区域与至少一个目标监控区域中之一之间的重叠区域,至少一个目标监控区域中之一为与该自定义区域存在重叠区域的监控区域。
示例性的,为了便于理解和说明,本文中以自定义区域为矩形区域为例进行说明,自定义区域为其他规则图形区域的处理同理可得。
本申请实施例中,为了实现多监控相机部署场景下,单个自定义区域的热度图数据的确定,可以依据监控区域与自定义区域的分布,确定与自定义区域存在重叠区域的监控区域(本文中称为目标监控区域),并确定自定义区域与监控区域的重叠区域(本文中称为目标重叠区域)。
步骤S210、依据至少一个目标监控区域对应的热度图数据,以及至少一个目标重叠区域,确定自定义区域的热度图数据。
本申请实施例中,在确定了上述至少一个目标重叠区域的情况下,可以依据至少一个目标监控区域对应的热度图数据,以及至少一个目标重叠区域,确定至少一个目标重叠区域的热度图数据,并依据至少一个目标重叠区域的热度图数据,确定自定义区域的热度图数据。
示例性的,目标监控区域对应的热度图数据可以包括但不限于目标监控区域对应的热度图的宽高等信息和/或目标监控区域对应的热度图中的热度图数据值。
示例性的,可以依据目标重叠区域在对应的目标监控区域中的分布,并依据该分布,以及目标重叠区域的热度图数据,确定目标重叠区域的热度图数据,进而,确定自定义区域的热度图数据。
可见,在图2所示方法流程中,通过确定与自定义区域存在重叠区域的至少一个目标监控区域,并确定自定义区域与至少一个目标监控区域的至少一个目标重叠区域,进而,依据至少一个目标监控区域对应的热度图,以及至少一个目标重叠区域,确定自定义区域的热度图数据,实现了在多监控相机部署场景下,单个自定义区域的热度图数据 的确定,为自定义区域的热度图数据分析提供了数据支持。
作为一种可能的实施例,步骤S200中,确定至少一个目标重叠区域,可以包括:依据平面底图坐标系下各监控相机的监控区域的第一坐标信息,以及自定义区域的第一坐标信息,将各自与自定义区域存在重叠区域的至少一个监控区域确定为至少一个目标监控区域;依据自定义区域的第一坐标信息以及至少一个目标监控区域的第一坐标信息,确定平面底图坐标系下至少一个目标重叠区域的第一坐标信息;至少一个目标重叠区域的第一坐标信息用于表征所述至少一个目标重叠区域。
示例性的,平面底图坐标系是指在平面底图所在平面内,水平方向为x轴,竖直方向为y轴建立的平面坐标系,该坐标系的原点可以包括平面底图的任一顶点或中心点之一。
为了便于描述和理解,下文中以平面底图坐标系的原点为平面底图的左上顶点,水平向右为x轴正向,竖直向下为y轴正向为例。
示例性的,可以分别预先配置监控相机的监控区域在平面底图坐标系下的坐标信息(本文中称为第一坐标信息)以及自定义区域在平面底图坐标系下的坐标信息(本文中称为第一坐标信息)。
举例来说,以图3所示场景为例,可以分别配置监控相机1~4的监控区域在平面底图坐标系下的第一坐标信息、以及自定义区域1~5在平面底图坐标系下的第一坐标信息。
示例性的,监控区域的第一坐标信息和/或自定义区域的第一坐标信息可以包括对角的两个顶点在平面底图坐标系下的坐标、任一顶点或中心点在平面底图坐标系下的坐标以及宽高,或任一顶点在平面底图坐标系下的坐标以及中心点在平面底图坐标系下的坐标中一者或多者。
示例性的,可以依据监控相机的监控区域的第一坐标信息以及自定义区域的第一坐标信息,确定与自定义区域存在重叠区域的监控区域(本文中称为目标监控区域)。
需要说明的是,在存在多个自定义区域的情况下,对于任一自定义区域,例如,图3所示场景中的自定义区域1~5中任一自定义区域,均可以按照步骤S200~步骤S210描述的方式确定热度图数据。
在按照步骤S200中描述的方式确定了与自定义区域存在重叠区域的至少一个目标监控区域的情况下,可以依据自定义区域的第一坐标信息以及至少一个目标监控区域的第一坐标信息,确定至少一个目标重叠区域在平面底图坐标系下的第一坐标信息。
作为一种可能的实施例,步骤S210中,依据至少一个目标监控区域对应的热度图数据,以及至少一个目标重叠区域,确定自定义区域的热度图数据,可以包括:针对至少一个目标监控区域中的每一个和该目标监控区域与自定义区域之间的目标重叠区域,依据该目标重叠区域的第一坐标信息,以及该目标监控区域的第一坐标信息,确定目标监控相机的图像坐标系下该目标重叠区域的第二坐标信息;目标监控相机为该目标监控区域归属的监控相机;依据至少一个目标监控区域对应的热度图数据,以及至少一个目标重叠区域的第二坐标信息,确定自定义区域的热度图数据。
示例性的,考虑到监控相机的热度图是以监控相机的图像坐标系为参照的,因此,为了确定自定义的热度图数据,需要确定目标重叠区域在监控相机的图像坐标系下的坐标。
示例性的,监控相机的图像坐标系可以为以监控相机的图像的左上顶点为原点,水平向右为x轴正向,竖直向下为y轴正向的平面坐标系。
在确定了目标重叠区域的第一坐标信息的情况下,可以依据目标重叠区域的第一坐标信息,以及目标监控区域的第一坐标信息,确定目标重叠区域在目标监控区域归属的监控相机(本文中称为目标监控相机)的图像坐标系下的坐标信息(本文中称为第二坐标信息)。
在确定了目标重叠区域的第二坐标信息的情况下,可以依据目标监控区域对应的热度图数据,以及目标重叠区域的第二坐标信息,将目标监控区域对应的热度图中对应目标重叠区域的第二坐标信息对应的区域,确定为目标重叠区域的热度图数据,进而,依据至少一个目标重叠区域的热度图数据,确定自定义区域的热度图数据。
需要说明的是,在确定了自定义区域的热度图数据的情况下,可以依据自定义区域的热度图数据进行相关分析,如最大热度值分析、平均热度值分析,其具体实现本申请实施例不做限定。
作为一种可能的实施例,如图4所示,上述依据平面底图坐标系下各监控相机的监控区域的第一坐标信息,以及自定义区域的第一坐标信息,将各自与自定义区域存在重叠区域的至少一个监控区域确定为至少一个目标监控区域,可以通过以下步骤S400至S410实现。
步骤S400、分别确定平面底图坐标系下各监控相机的监控区域的第一顶点和第二顶点的第一坐标信息,以及自定义区域的第一顶点和第二顶点的第一坐标信息。
步骤S410、依据各监控相机的监控区域的第一顶点和第二顶点的第一坐标信息,以及自定义区域的第一顶点和第二顶点的第一坐标信息,将各自与自定义区域存在重叠区域的至少一个监控区域确定为至少一个目标监控区域。
示例性的,可以依据该监控区域在平面底图坐标系中距离原点最近的顶点(本文中称为第一顶点)和距离原点最远的顶点(本文中称为第二顶点),以及自定义区域在平面底图坐标系中距离原点最近的顶点(本文中称为第一顶点)和距离原点最远的顶点(本文中称为第二顶点),在平面底图坐标系中的分布状态,确定监控区域与自定义区域是否存在重叠。
示例性的,在平面底图坐标系的原点为平面底图的左上顶点的情况下,上述第一顶点为左上顶点,第二顶点为右下顶点。
相应地,可以基于监控区域的第一顶点和第二顶点的第一坐标信息,以及,自定义区域的第一顶点和第二顶点的第一坐标信息,确定与自定义区域存在重叠区域的目标监控区域。
在一个示例中,步骤S410中,依据各监控相机的监控区域的第一顶点和第二顶点的第一坐标信息,以及自定义区域的第一顶点和第二顶点的第一坐标信息,将各自与自定义区域存在重叠区域的至少一个监控区域确定为至少一个目标监控区域,可以包括:对于任一监控区域,在目标第二顶点的横坐标大于等于目标第一顶点的横坐标,且目标第二顶点的纵坐标大于等于目标第一顶点的纵坐标的情况下,确定该监控区域为与自定义区域存在重叠区域的目标监控区域。
示例性的,对于任一监控区域,可以确定该监控区域的第一顶点和自定义区域的第一顶点中,与平面底图坐标系的原点距离更远的第一顶点(本文中称为目标第一顶点),以及,确定该监控区域的第二顶点和自定义区域的第二顶点中,与平面底图坐标系的原点距离更近的第二顶点(本文中称为目标第二顶点),并比较目标第一顶点的第一坐标信息和目标第二顶点的第二坐标信息。
在目标第二顶点的横坐标大于等于目标第一顶点的横坐标,且目标第二顶点的纵坐 标大于等于目标第一顶点的纵坐标的情况下,确定该监控区域与自定义区域存在重叠区域。
需要说明的是,在本申请实施例中,监控区域与自定义区域存在重叠区域包括边界相交的情况。
应该认识到,上述实施例中描述的确定监控区域与自定义区域是否存在重叠区域的实现方式仅仅是本申请实施例中的一种具体示例,而并不属于本申请保护范围的限定,即在本申请实施例中,也可以通过其他方式确定监控区域与自定义区域是否存在重叠区域,例如,对于任一监控区域,可以分别确定该监控区域的边界所在直线,与自定义区域的边界所在直线的交点,并判断是否存在任一交点同时处于该监控区域的边界上和自定义区域的边界上;若存在,则确定该监控区域与自定义区域存在重叠区域;若不存在,则进一步判断该监控区域的四个顶点是否均处于自定义区域内,或自定义区域的四个顶点是否均处于该监控区域内,若是,则确定监控区域与自定义区域存在重叠区域;否则,即前述判断的结果均为否,则确定该监控区域与自定义区域不存在重叠区域。
示例性的,依据自定义区域的第一坐标信息以及至少一个目标监控区域的第一坐标信息,确定平面底图坐标系下至少一个目标重叠区域的第一坐标信息,可以包括:针对每个目标重叠区域,将目标第二顶点与目标第一顶点在平面底图坐标系下的坐标作为平面底图坐标系下该目标重叠区域的第一坐标信息,该目标重叠区域的第一坐标信息用于表征该目标重叠区域。这里,目标第二顶点与目标第一顶点即为该目标重叠区域的对角的两个顶点,这两个顶点的在平面底图坐标系下的坐标可以表征该目标重叠区域。
作为一种可能的实施例,如图5所示,上述针对至少一个目标监控区域中的每一个和该目标监控区域与自定义区域之间的目标重叠区域,依据该目标重叠区域的第一坐标信息,以及该目标监控区域的第一坐标信息,确定目标监控相机的图像坐标系下该目标重叠区域的第二坐标信息,可以通过以下步骤S500至S510实现。
步骤S500、依据该目标重叠区域的第一坐标信息,以及该目标监控区域的第一坐标信息,确定目标监控区域坐标系下该目标重叠区域的第三坐标信息。
示例性的,由于监控相机的监控区域对应的热度图是与监控相机的图像坐标系对应的,即监控相机的图像坐标系的原点(左上顶点),为热度图矩阵中的元素[0,0],监控相机的图像的右下顶点为热度矩阵中的最后一个元素,以260*260矩阵为例,为元素[259,259],因此,为了确定目标重叠区域对应的热度图数据,需要确定目标重叠区域在监控相机的图像坐标系中的坐标信息。
相应地,在确定了目标重叠区域的第一坐标信息的情况下,可以依据目标重叠区域的第一坐标信息,以及目标监控区域的第一坐标信息,确定目标监控区域坐标系下目标重叠区域的坐标信息(本文中称为第三坐标信息)。
示例性的,目标监控区域坐标系为以目标监控区域的目标顶点为原点,横纵坐标轴分别与平面底图坐标系的横纵轴平行且同向的坐标系,目标顶点为目标监控区域中与平面底图坐标系的原点距离最近的顶点。
当平面底图坐标系的原点为左上顶点时,目标监控区域坐标系的原点为目标监控区域的左上顶点,水平向右为x轴正向,竖直向下为y轴正向。
由于平面底图坐标系(对应第一坐标信息)与目标监控区域坐标系(对应第三坐标信息)的横纵坐标轴同向,因此,对于第一坐标系统,可以通过平移的方式,得到对应的第三坐标信息。
例如,以上述示例中的水平向右为x轴正向,竖直向下为y轴正向为例,假设目标 监控区域坐标系的原点相对平面底图坐标系的原点水平向右平移△x,竖直向下平移△y,则对于同一位置,第三坐标信息中的横坐标值可以在第一坐标信息中的横坐标值的基础上减去△x得到,第三坐标信息中的纵坐标值可以在第一坐标信息中的纵坐标值的基础上减去△y得到。
步骤S510、依据该目标重叠区域的第三坐标信息,以及目标监控相机的旋转角度,确定该目标重叠区域的第二坐标信息。
示例性的,考虑到监控相机在实际部署中可能会存在旋转角度,从而,监控相机的图像坐标系的x-y轴与平面底图坐标系的x-y轴不会完全同向(即监控相机的图像坐标系的x轴正向与平面底图坐标系的x轴正向不同,或/和,监控相机的图像坐标系的y轴正向与平面底图坐标系的y轴正向不同)。
示例性的,以旋转角度为顺时针的旋转角度为例(逆时针旋转A°即为顺时针旋转360°-A°),且旋转角度为90°的整数倍。
在确定了目标重叠区域的第三坐标信息的情况下,可以依据目标重叠区域的第三坐标信息以及目标监控相机的旋转角度,确定监控相机的图像坐标系下目标重叠区域的第二坐标信息。
示例性的,可以依据目标监控相机的旋转角度,通过轴转公式,将目标重叠区域的第三坐标信息映射为第二坐标信息。
作为一种可能的实施例,上述依据至少一个目标监控区域对应的热度图数据,以及至少一个目标重叠区域,确定自定义区域的热度图数据,包括:依据至少一个目标监控区域与热度图的宽高比例,以及至少一个目标重叠区域,确定至少一个目标重叠区域对应的热度图数据;依据至少一个目标重叠区域对应的热度图数据,确定自定义区域的热度图数据。
示例性的,在确定了自定义区域与目标监控区域之间的目标重叠区域时,可以依据目标监控区域与热度图的宽高比例,以及目标重叠区域,确定目标重叠区域对应的热度图数据。
示例性的,在确定了目标重叠区域的第二坐标信息的情况下,可以依据目标监控区域与热度图的宽高比例,以及目标重叠区域的第二坐标信息,确定目标重叠区域对应的热度图数据,例如,确定目标监控区域内坐标单位对应的像素数量。
举例来说,假设热度图为260*260的图像,目标监控区域的宽和高分别为w和h,则目标监控区域的长与热度图的宽的比例为w:260,即一个横坐标单位对应260/w个像素;目标监控区域的宽与热度图的高的比例为h:260,即一个纵坐标单位对应260/h个像素。
可以依据目标重叠区域的第二坐标信息,以及目标监控区域内坐标单位对应的像素数量,确定目标重叠区域对应的热度图数据。
仍以上一示例为例,可以分别以260/w像素和260/h像素为单位长度,对热度图矩阵进行竖直方向和水平方向的等分,并依据目标重叠区域的第二坐标信息,确定目标重叠区域对应的热度图矩阵中的分块,进而,确定目标重叠区域对应的热度图数据。
示例性的,在存在多个目标监控区域的情况下时,可以分别依据各目标重叠区域对应的热度图数据,确定自定义区域的热度图数据。
为了使本领域技术人员更好地理解本申请实施例提供的技术方案,下面结合具体实例对本申请实施例提供的技术方案进行说明。
请参见图6A,为本申请实施例提供的一种具体应用场景下的平面底图的示意图,如图6A所示,在应用场景中部署有4个鱼眼相机(监控区域分别对应平面底图中监控区域1~4),该4个鱼眼相机的监控区域共覆盖了5个店铺(对应的自定义区域分别为平面底图中的店铺区域1~5)。
基于图6A所示应用场景,本申请实施例提供的数据处理流程如下:
a)在系统中配置平面底图坐标系(以平面底图左上角为原点,x轴正向向右延伸,y轴正向向下延伸)下的关键坐标信息,包括鱼眼相机的监控区域的坐标信息、相机区域的旋转角度(即监控相机的图像坐标系相对平面底图坐标系的旋转角度)以及自定义区域(即店铺区域)的坐标信息(上述第一坐标信息)。
以鱼眼相机的监控区域和自定义区域的坐标信息均通过左上顶点和右下顶点的坐标表征为例。
示例性的,实际部署的鱼眼相机区域方向可能和平面底图方向有偏差,需要在软件层面调整鱼眼相机区域相对平面底图的旋转角度。
示例性的,旋转角度可以为90°的整数倍。
b)、依据鱼眼相机的监控区域的左上顶点和右下顶点的第一坐标(在平面底图坐标系中的坐标),以及自定义区域的左上顶点和右下顶点的第一坐标,确定监控区域与自定义区域是否存在重叠区域。
示例性的,监控区域与自定义区域存在重叠区域包括监控区域与自定义区域边界相交。
示例性的,假设矩形区域(包括监控区域和自定义区域)的左上顶点的坐标为(start_x,start_y),右下顶点坐标为(end_x,end_y)。
监控区域和自定义区域存在重叠区域的必要条件是两个矩形区域中较小end_x值大于等于较大start_x值,并且两矩形坐标中较小end_y值大于等于较大start_y值。
以判断鱼眼相机4的监控区域(即图6A中的监控区域4)与店铺4对应的自定义区域(即图6A中的店铺区域4)是否存在重叠区域为例,假设监控区域4的左上顶点和右下顶点的第一坐标分别为f_point1(start_x1,start_y1)和f_point2(end_x1,end_y1),店铺区域4的左上顶点和右下顶点的第一坐标分别为s_point1(start_x2,start_y2),s_point2(end_x2,end_y2),则:
max_start_x=max(start_x1,start_x2)
max_start_y=max(start_y1,start_y2)
min_end_x=min(end_x1,end_x2)
min_end_y=min(end_y1,end_y2)
在满足如下条件的情况下:
min_end_x≥max_start_x,且min_end_y≥max_start_y确定监控区域4与店铺区域4存在重叠区域。
c)确定监控区域与店铺区域的重叠区域(即目标重叠区域)在平面底图坐标系下的坐标信息(即上述第一坐标信息)。
示例性的,仍以监控区域4与店铺区域4的重叠区域为例,该目标重叠区域左上顶点和右下顶点的第一坐标分别为:
o_point1(max_start_x,max_start_y)和o_point2(min_end_x,min_end_y)
d)、确定目标重叠区域在监控区域坐标系下的坐标信息(即上述第三坐信息)。
示例性的,仍以监控区域4与店铺区域4的重叠区域为例,由于监控区域4对应的监控区域坐标系的原点为监控区域4的左上顶点,其坐标为f_point1(start_x1,start_y1),而横纵坐标轴分别与平面底图坐标系的横纵坐标轴同向,因此,可以通过平移的方式,得到目标重叠区域在监控区域4对应的监控区域坐标系下的第三坐标信息。其中,目标重叠区域的左上顶点的第三坐标(在监控区域坐标系下的坐标)和右下顶点的第三坐标分别为:
oo_point1(max_start_x-start_x1,max_start_y-start_y1)
oo_point2(min_end_x-start_x1,min_end_y-start_y1)
e)、依据相机区域的旋转角度,确定目标重叠区域在旋转调整后的监控区域坐标系下的坐标信息(即上述第二坐标信息)。
示例性的,考虑到鱼眼相机的监控区域的实际部署方向与d)中假设的鱼眼相机的监控区域坐标系方向不一致问题,需要旋转一个角度(监控区域的旋转角度)来调整到正确方向。例如,可根据所述旋转角度对所述目标监控区域坐标系进行旋转得到目标监控相机的图像坐标系。
在没有旋转鱼眼相机的监控区域的情况下,热度图矩阵数据的(0,0)元素分布在鱼眼相机的监控区域的左上顶点,其他热度图矩阵数据元素依次按照坐标系方向分布,(259,259)元素分布在鱼眼相机的监控区域的右下顶点。
假设鱼眼相机的监控区域的旋转角度为90°(即顺时针旋转90°),旋转后(0,0)元素实际分布在鱼眼相机的监控区域的右上顶点,(259,259)元素分布在鱼眼相机区域的左下顶点;
其中,旋转鱼眼相机的监控区域,相当于将监控区域坐标系旋转相同角度(以监控区域坐标系的原点为旋转中心进行旋转,后对原点进行平移),即将监控区域坐标系中的坐标点通过轴转公式映射,并对映射后的坐标依据原点的平移而平移,得到旋转后的坐标系中的坐标。具体原理和实现如下:
假设坐标系的旋转角度为θ,在原坐标系xOy中的坐标为(x,y),在新坐标系x′Oy′中的坐标为(x′,y′),则轴转公式可以如下:
x′=x*cosθ+y*sinθ
y′=y*cosθ-x*sinθ
通过上述轴转公式进行坐标转换之后,可以对基于旋转后的坐标系的原点与旋转前的原点的平移关系,对经过轴转公式转换后的坐标进行平移,得到目标重叠区域的第二坐标信息。
举例来说,如图6B所示,假设监控区域4与店铺区域4的重叠区域中,与监控区域4坐标系的原点的距离最近的顶点(左上顶点)第三坐标和距离最远的顶点(右下顶点)第三坐标分别为(3,2)和(4,3),且监控区域4的旋转角度为90°,则经过轴转映射后,得到的坐标分别为(2,-3)和(3,-4)。
由于旋转后的原点为监控区域4的右上顶点,旋转前的原点为监控区域4的左上顶点,即原点水平向右平移了监控区域的宽度的位置(在该实施例中为4),且旋转后的坐标系的水平方向为y轴,因此,重叠区域的坐标需要在轴转映射得到的坐标的基础上,y坐标值加上4,即最终的坐标分别为(2,1)和(3,0)。
此外,由于监控区域4的监控区域坐标系旋转了90°,因此,目标重叠区域中与原点的距离最近的顶点由目标重叠区域的左上顶点变为了目标重叠区域的右上顶点,距离最远的顶点由右下顶点变为了目标重叠区域的左下顶点。
相应地,目标重叠区域的第二坐标可以分别为(2,0)(右上顶点的坐标)和(3,1)(左下顶点的坐标)。
f)、依据鱼眼相机的监控区域与热度图的宽高比例,从热度图矩阵中取出目标重叠区域对应的热度图数据。
示例性的,获取到目标重叠区域对应的热度图数据时,可以对获取到的热度图数据进行分析,如确定最大热度值、平均热度值等。
g)、在多个监控区域均与店铺区域存在重叠区域的情况下,可以分别确定各目标重叠区域对应的热度图数据。
示例性的,可以分别对各目标重叠区域对应的热度图数据进行分析,也可以依据各目标重叠区域对应的热度图数据,确定店铺区域对应的热度图数据,进而,可以对店铺区域对应的热度图数据进行分析,如确定最大热度值、平均热度值等。
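作为示意,下面的 Python 草图假设已按前文方式取出了各目标重叠区域对应的热度图数据分块,并以最大热度值与平均热度值为例进行汇总分析(仅为说明,并非限定实现):

```python
import numpy as np

def shop_heat_stats(overlap_heat_blocks):
    """依据各目标重叠区域对应的热度图数据分块,确定店铺区域(自定义区域)的热度统计值。"""
    values = np.concatenate([np.asarray(block).ravel() for block in overlap_heat_blocks])
    return {"max": float(values.max()), "mean": float(values.mean())}
```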
以上对本申请提供的方法进行了描述。下面对本申请提供的装置进行描述:
请参见图7,为本申请实施例提供的一种数据处理装置的结构示意图,如图7所示,该数据处理装置可以包括:
第一确定单元710,用于确定至少一个目标重叠区域;其中,所述至少一个目标重叠区域中任一为自定义区域与至少一个目标监控区域中之一之间的重叠区域,所述至少一个目标监控区域中之一为与所述自定义区域存在重叠区域的监控区域;
第二确定单元720,用于依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域,确定所述自定义区域的热度图数据。
作为一种可能的实施例,所述第一确定单元710确定所述至少一个目标重叠区域,包括:
依据平面底图坐标系下至少一个监控相机中各监控相机的监控区域的第一坐标信息以及所述自定义区域的第一坐标信息,将各自与所述自定义区域存在重叠区域的至少一个监控区域确定为所述至少一个目标监控区域;
依据所述自定义区域的第一坐标信息以及所述至少一个目标监控区域的第一坐标信息,确定所述平面底图坐标系下所述至少一个目标重叠区域的第一坐标信息;其中,所述至少一个目标重叠区域的第一坐标信息用于表征所述至少一个目标重叠区域;
所述第二确定单元720依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域,确定所述自定义区域的热度图数据,包括:
针对所述至少一个目标监控区域中的每一个和该目标监控区域与所述自定义区域之间的目标重叠区域,依据该目标重叠区域的第一坐标信息,以及该目标监控区域的第一坐标信息,确定该目标监控区域归属的目标监控相机的图像坐标系下该目标重叠区域的第二坐标信息;
依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域的第二坐标信息,确定所述自定义区域的热度图数据。
作为一种可能的实施例,所述第二确定单元720依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域的第二坐标信息,确定所述自定义区域的热度图数据,包括:依据所述至少一个目标监控区域与热度图的宽高比例,以及所述至少一个目标重叠区域的第二坐标信息,确定所述至少一个目标重叠区域对应的热度图数据;依据所述至少一个目标重叠区域对应的热度图数据,确定所述自定义区域的热度图数据。
作为一种可能的实施例,所述第一确定单元710依据所述平面底图坐标系下所述至少一个监控相机中各监控相机的监控区域的第一坐标信息以及所述自定义区域的第一坐标信息,将与所述自定义区域存在重叠区域的至少一个监控区域确定为所述至少一个目标监控区域,包括:
分别确定所述平面底图坐标系下所述至少一个监控相机中各监控相机的监控区域的第一顶点和第二顶点的第一坐标信息,以及所述自定义区域的第一顶点和第二顶点的第一坐标信息;其中,对于所述至少一个监控相机中各监控相机的监控区域,该监控区域的第一顶点和第二顶点分别为该监控区域中与所述平面底图坐标系的原点距离最近和最远的顶点,所述自定义区域的第一顶点和第二顶点分别为所述自定义区域中与所述平面底图坐标系的原点距离最近和最远的顶点;
依据所述至少一个监控相机中各监控相机的监控区域的第一顶点和第二顶点的第一坐标信息,以及所述自定义区域的第一顶点和第二顶点的第一坐标信息,将各自与所述自定义区域存在重叠区域的至少一个监控区域确定为所述至少一个目标监控区域。
作为一种可能的实施例,所述第一确定单元710依据所述至少一个监控相机中各监控相机的监控区域的第一顶点和第二顶点的第一坐标信息,以及所述自定义区域的第一顶点和第二顶点的第一坐标信息,将各自与所述自定义区域存在重叠区域的至少一个监控区域确定为所述至少一个目标监控区域,包括:
对于任一监控区域,在目标第二顶点的横坐标大于等于目标第一顶点的横坐标,且目标第二顶点的纵坐标大于等于目标第一顶点的纵坐标的情况下,确定该监控区域为与所述自定义区域存在重叠区域的目标监控区域;所述目标第一顶点为该监控区域的第一顶点和所述自定义区域的第一顶点中与所述平面底图坐标系的原点距离最远的第一顶点,所述目标第二顶点为该监控区域的第二顶点和所述自定义区域的第二顶点中与所述平面底图坐标系的原点距离最近的第二顶点。
作为一种可能的实施例,针对所述至少一个目标监控区域中的每一个和该目标监控区域与所述自定义区域之间的目标重叠区域,所述第二确定单元720依据该目标重叠区域的第一坐标信息,以及该目标监控区域的第一坐标信息,确定该目标监控区域归属的目标监控相机的图像坐标系下该目标重叠区域的第二坐标信息,包括:
依据该目标重叠区域的第一坐标信息,以及该目标监控区域的第一坐标信息,确定目标监控区域坐标系下该目标重叠区域的第三坐标信息;其中,所述目标监控区域坐标系为以该目标监控区域的目标顶点为原点,横纵坐标轴分别与所述平面底图坐标系的横纵轴平行且同向的坐标系,所述目标顶点为该目标监控区域中与所述平面底图坐标系的原点距离最近的顶点;
依据该目标重叠区域的第三坐标信息,以及所述目标监控相机的旋转角度,确定该目标重叠区域的第二坐标信息。
作为一种可能的实施例,所述第二确定单元720依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域,确定所述自定义区域的热度图数据,包括:
    依据所述至少一个目标监控区域与热度图的宽高比例,以及所述至少一个目标重叠区域,确定所述至少一个目标重叠区域对应的热度图数据;
依据所述至少一个目标重叠区域对应的热度图数据,确定所述自定义区域的热度图数据。
本申请实施例提供一种电子设备,包括处理器和存储器,其中,存储器存储有能够被所述处理器执行的机器可执行指令,处理器用于执行机器可执行指令,以实现上文描述的数据处理方法。
请参见图8,为本申请实施例提供的一种电子设备的硬件结构示意图。该电子设备可包括处理器801、存储有机器可执行指令的存储器803。处理器801与存储器803可经由系统总线802通信。并且,通过读取并执行存储器803中与数据处理逻辑对应的机器可执行指令,处理器801可执行上文描述的数据处理方法。
本文中提到的存储器803可以是任何电子、磁性、光学或其它物理存储装置,可以包含或存储信息,如可执行指令、数据,等等。例如,机器可读存储介质可以是:RAM(Random Access Memory,随机存取存储器)、易失性存储器、非易失性存储器、闪存、存储驱动器(如硬盘驱动器)、固态硬盘、任何类型的存储盘(如光盘、DVD等),或者类似的存储介质,或者它们的组合。
在一些实施例中,还提供了一种机器可读存储介质,如图8中的存储器803,该机器可读存储介质内存储有机器可执行指令,所述机器可执行指令被处理器执行时实现上文描述的数据处理方法。例如,所述机器可读存储介质可以是ROM、RAM、CD-ROM、磁带、软盘和光数据存储设备等。
本申请实施例还提供了一种计算机程序,该计算机程序包括机器可执行指令,当处理器执行该机器可执行指令时,促使处理器801执行上文中描述的数据处理方法。
需要说明的是,在本文中,诸如第一和第二等之类的关系术语仅仅用来将一个实体或者操作与另一个实体或操作区分开来,而不一定要求或者暗示这些实体或操作之间存在任何这种实际的关系或者顺序。而且,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者设备所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括所述要素的过程、方法、物品或者设备中还存在另外的相同要素。
以上所述仅为本申请的一些实施例而已,并不用以限制本申请,凡在本申请的精神和原则之内,所做的任何修改、等同替换、改进等,均应包含在本申请保护的范围之内。

Claims (17)

  1. 一种数据处理方法,包括:
    确定至少一个目标重叠区域;其中,所述至少一个目标重叠区域中任一为自定义区域与至少一个目标监控区域中之一之间的重叠区域,所述至少一个目标监控区域中之一为与所述自定义区域存在重叠区域的监控区域;
    依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域,确定所述自定义区域的热度图数据。
  2. 根据权利要求1所述的方法,其特征在于,确定所述至少一个目标重叠区域,包括:
    依据平面底图坐标系下至少一个监控相机中各监控相机的监控区域的第一坐标信息以及所述自定义区域的第一坐标信息,将各自与所述自定义区域存在重叠区域的至少一个监控区域确定为所述至少一个目标监控区域;
    依据所述自定义区域的第一坐标信息以及所述至少一个目标监控区域的第一坐标信息,确定所述平面底图坐标系下所述至少一个目标重叠区域的第一坐标信息;其中,所述至少一个目标重叠区域的第一坐标信息用于表征所述至少一个目标重叠区域;
    依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域,确定所述自定义区域的热度图数据,包括:
    针对所述至少一个目标监控区域中的每一个和该目标监控区域与所述自定义区域之间的目标重叠区域,依据该目标重叠区域的第一坐标信息,以及该目标监控区域的第一坐标信息,确定该目标监控区域归属的目标监控相机的图像坐标系下该目标重叠区域的第二坐标信息;
    依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域的第二坐标信息,确定所述自定义区域的热度图数据。
  3. 根据权利要求2所述的方法,其特征在于,依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域的第二坐标信息,确定所述自定义区域的热度图数据,包括:
    依据所述至少一个目标监控区域与热度图的宽高比例,以及所述至少一个目标重叠区域的第二坐标信息,确定所述至少一个目标重叠区域对应的热度图数据;
    依据所述至少一个目标重叠区域对应的热度图数据,确定所述自定义区域的热度图数据。
  4. 根据权利要求2所述的方法,其特征在于,依据所述平面底图坐标系下所述至少一个监控相机中各监控相机的监控区域的第一坐标信息以及所述自定义区域的第一坐标信息,将与所述自定义区域存在重叠区域的至少一个监控区域确定为所述至少一个目标监控区域,包括:
    分别确定所述平面底图坐标系下所述至少一个监控相机中各监控相机的监控区域的第一顶点和第二顶点的第一坐标信息,以及所述自定义区域的第一顶点和第二顶点的第一坐标信息;其中,对于所述至少一个监控相机中各监控相机的监控区域,该监控区域的第一顶点和第二顶点分别为该监控区域中与所述平面底图坐标系的原点距离最近和最远的顶点,所述自定义区域的第一顶点和第二顶点分别为所述自定义区域中与所述平面底图坐标系的原点距离最近和最远的顶点;
    依据所述至少一个监控相机中各监控相机的监控区域的第一顶点和第二顶点的第一坐标信息,以及所述自定义区域的第一顶点和第二顶点的第一坐标信息,将各自与所述自定义区域存在重叠区域的至少一个监控区域确定为所述至少一个目标监控区域。
  5. 根据权利要求4所述的方法,其特征在于,依据所述至少一个监控相机中各监控相机的监控区域的第一顶点和第二顶点的第一坐标信息,以及所述自定义区域的第一顶点和第二顶点的第一坐标信息,将各自与所述自定义区域存在重叠区域的至少一个监控区域确定为所述至少一个目标监控区域,包括:
    对于任一监控区域,在目标第二顶点的横坐标大于等于目标第一顶点的横坐标,且目标第二顶点的纵坐标大于等于目标第一顶点的纵坐标的情况下,确定该监控区域为与所述自定义区域存在重叠区域的目标监控区域;所述目标第一顶点为该监控区域的第一顶点和所述自定义区域的第一顶点中与所述平面底图坐标系的原点距离最远的第一顶点,所述目标第二顶点为该监控区域的第二顶点和所述自定义区域的第二顶点中与所述平面底图坐标系的原点距离最近的第二顶点。
  6. 根据权利要求2-5任一项所述的方法,其特征在于,针对所述至少一个目标监控区域中的每一个和该目标监控区域与所述自定义区域之间的目标重叠区域,依据该目标重叠区域的第一坐标信息,以及该目标监控区域的第一坐标信息,确定该目标监控区域归属的目标监控相机的图像坐标系下该目标重叠区域的第二坐标信息,包括:
    针对所述至少一个目标监控区域中的每一个和该目标监控区域与所述自定义区域之间的目标重叠区域,依据该目标重叠区域的第一坐标信息,以及该目标监控区域的第一坐标信息,确定目标监控区域坐标系下该目标重叠区域的第三坐标信息;其中,所述目标监控区域坐标系为以该目标监控区域的目标顶点为原点,横纵坐标轴分别与所述平面底图坐标系的横纵轴平行且同向的坐标系,所述目标顶点为该目标监控区域中与所述平面底图坐标系的原点距离最近的顶点;
    依据该目标重叠区域的第三坐标信息,以及所述目标监控相机的旋转角度,确定该目标重叠区域的第二坐标信息。
  7. 根据权利要求1所述的方法,其特征在于,依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域,确定所述自定义区域的热度图数据,包括:
    依据所述至少一个目标监控区域与热度图的宽高比例,以及所述至少一个目标重叠区域,确定所述至少一个目标重叠区域对应的热度图数据;
    依据所述至少一个目标重叠区域对应的热度图数据,确定所述自定义区域的热度图数据。
  8. 一种数据处理装置,其特征在于,包括:
    第一确定单元,用于确定至少一个目标重叠区域;其中,所述至少一个目标重叠区域中任一为自定义区域与至少一个目标监控区域中之一之间的重叠区域,所述至少一个目标监控区域中之一为与所述自定义区域存在重叠区域的监控区域;
    第二确定单元,用于依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域,确定所述自定义区域的热度图数据。
  9. 根据权利要求8所述的装置,其特征在于,所述第一确定单元确定所述至少一个目标重叠区域,包括:
    依据平面底图坐标系下至少一个监控相机中各监控相机的监控区域的第一坐标信息以及所述自定义区域的第一坐标信息,将各自与所述自定义区域存在重叠区域的至少一个监控区域确定为所述至少一个目标监控区域;
    依据所述自定义区域的第一坐标信息以及所述至少一个目标监控区域的第一坐标信息,确定所述平面底图坐标系下所述至少一个目标重叠区域的第一坐标信息;其中,所述至少一个目标重叠区域的第一坐标信息用于表征所述至少一个目标重叠区域;
    所述第二确定单元依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域,确定所述自定义区域的热度图数据,包括:
    针对所述至少一个目标监控区域中的每一个和该目标监控区域与所述自定义区域之间的目标重叠区域,依据该目标重叠区域的第一坐标信息,以及该目标监控区域的第一坐标信息,确定该目标监控区域归属的目标监控相机的图像坐标系下该目标重叠区域的第二坐标信息;
    依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域的第二坐标信息,确定所述自定义区域的热度图数据。
  10. 根据权利要求9所述的装置,其特征在于,所述第二确定单元依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域的第二坐标信息,确定所述自定义区域的热度图数据,包括:
    依据所述至少一个目标监控区域与热度图的宽高比例,以及所述至少一个目标重叠区域的第二坐标信息,确定所述至少一个目标重叠区域对应的热度图数据;
    依据所述至少一个目标重叠区域对应的热度图数据,确定所述自定义区域的热度图数据。
  11. 根据权利要求9所述的装置,其特征在于,所述第一确定单元依据所述平面底图坐标系下所述至少一个监控相机中各监控相机的监控区域的第一坐标信息以及所述自定义区域的第一坐标信息,将与所述自定义区域存在重叠区域的至少一个监控区域确定为所述至少一个目标监控区域,包括:
    分别确定所述平面底图坐标系下所述至少一个监控相机中各监控相机的监控区域的第一顶点和第二顶点的第一坐标信息,以及所述自定义区域的第一顶点和第二顶点的第一坐标信息;其中,对于所述至少一个监控相机中各监控相机的监控区域,该监控区域的第一顶点和第二顶点分别为该监控区域中与所述平面底图坐标系的原点距离最近和最远的顶点,所述自定义区域的第一顶点和第二顶点分别为所述自定义区域中与所述平面底图坐标系的原点距离最近和最远的顶点;
    依据所述至少一个监控相机中各监控相机的监控区域的第一顶点和第二顶点的第一坐标信息,以及所述自定义区域的第一顶点和第二顶点的第一坐标信息,将各自与所述自定义区域存在重叠区域的至少一个监控区域确定为所述至少一个目标监控区域。
  12. 根据权利要求11所述的装置,其中,所述第一确定单元依据所述至少一个监控相机中各监控相机的监控区域的第一顶点和第二顶点的第一坐标信息,以及所述自定义区域的第一顶点和第二顶点的第一坐标信息,将各自与所述自定义区域存在重叠区域的至少一个监控区域确定为所述至少一个目标监控区域,包括:
    对于任一监控区域,在目标第二顶点的横坐标大于等于目标第一顶点的横坐标,且目标第二顶点的纵坐标大于等于目标第一顶点的纵坐标的情况下,确定该监控区域为与所述自定义区域存在重叠区域的目标监控区域;所述目标第一顶点为该监控区域的第一顶点和所述自定义区域的第一顶点中与所述平面底图坐标系的原点距离最远的第一顶点,所述目标第二顶点为该监控区域的第二顶点和所述自定义区域的第二顶点中与所述平面底图坐标系的原点距离最近的第二顶点。
  13. 根据权利要求9-12任一项所述的装置,其中,针对所述至少一个目标监控区域中的每一个和该目标监控区域与所述自定义区域之间的目标重叠区域,所述第二确定单元依据该目标重叠区域的第一坐标信息,以及该目标监控区域的第一坐标信息,确定该目标监控区域归属的目标监控相机的图像坐标系下该目标重叠区域的第二坐标信息,包括:
    依据该目标重叠区域的第一坐标信息,以及该目标监控区域的第一坐标信息,确定目标监控区域坐标系下该目标重叠区域的第三坐标信息;其中,所述目标监控区域坐标系为以该目标监控区域的目标顶点为原点,横纵坐标轴分别与所述平面底图坐标系的横纵轴平行且同向的坐标系,所述目标顶点为该目标监控区域中与所述平面底图坐标系的原点距离最近的顶点;
    依据该目标重叠区域的第三坐标信息,以及所述目标监控相机的旋转角度,确定该目标重叠区域的第二坐标信息。
  14. 根据权利要求8所述的装置,其中,所述第二确定单元依据所述至少一个目标监控区域对应的热度图数据,以及所述至少一个目标重叠区域,确定所述自定义区域的热度图数据,包括:
    依据所述至少一个目标监控区域与热度图的宽高比例,以及所述至少一个目标重叠区域,确定所述至少一个目标重叠区域对应的热度图数据;
    依据所述至少一个目标重叠区域对应的热度图数据,确定所述自定义区域的热度图数据。
  15. 一种电子设备,其特征在于,包括处理器和存储器,所述存储器存储有能够被所述处理器执行的机器可执行指令,所述处理器用于执行机器可执行指令,以实现如权利要求1-7任一项所述的方法。
  16. 一种计算机可读存储介质,其中,所述计算机可读存储介质内存储有计算机程序,所述计算机程序被处理器执行时实现如权利要求1-7任一项所述的方法。
  17. 一种计算机程序,其中,所述计算机程序包括机器可执行指令,并且当处理器执行所述机器可执行指令时,促使所述处理器执行如权利要求1-7任一项所述的方法。
PCT/CN2021/129536 2020-11-27 2021-11-09 数据处理方法、装置及电子设备 WO2022111275A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP21896776.8A EP4254314A4 (en) 2020-11-27 2021-11-09 DATA PROCESSING METHOD AND APPARATUS, AND ELECTRONIC DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011365066.2A CN112488913A (zh) 2020-11-27 2020-11-27 数据处理方法、装置及电子设备
CN202011365066.2 2020-11-27

Publications (1)

Publication Number Publication Date
WO2022111275A1 true WO2022111275A1 (zh) 2022-06-02

Family

ID=74936741

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/129536 WO2022111275A1 (zh) 2020-11-27 2021-11-09 数据处理方法、装置及电子设备

Country Status (3)

Country Link
EP (1) EP4254314A4 (zh)
CN (1) CN112488913A (zh)
WO (1) WO2022111275A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112488913A (zh) * 2020-11-27 2021-03-12 杭州海康威视数字技术股份有限公司 数据处理方法、装置及电子设备

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130091432A1 (en) * 2011-10-07 2013-04-11 Siemens Aktiengesellschaft Method and user interface for forensic video search
CN107315824A (zh) * 2017-07-04 2017-11-03 百度在线网络技术(北京)有限公司 用于生成热力图的方法和装置
CN111264056A (zh) * 2017-08-23 2020-06-09 美国西门子医学诊断股份有限公司 用于实验室工作流程的视觉系统
CN111274340A (zh) * 2020-01-15 2020-06-12 中国联合网络通信集团有限公司 人流密度的监控处理方法、设备及存储介质
CN111914819A (zh) * 2020-09-30 2020-11-10 杭州未名信科科技有限公司 一种多摄像头融合的人群密度预测方法、装置、存储介质及终端
CN112102307A (zh) * 2020-09-25 2020-12-18 杭州海康威视数字技术股份有限公司 全局区域的热度数据确定方法、装置及存储介质
CN112488913A (zh) * 2020-11-27 2021-03-12 杭州海康威视数字技术股份有限公司 数据处理方法、装置及电子设备
CN112822442A (zh) * 2020-12-31 2021-05-18 杭州海康威视数字技术股份有限公司 热度图生成方法、装置及电子设备

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104520508B (zh) * 2012-11-08 2017-06-30 住友重机械工业株式会社 铺装机械用图像生成装置以及铺装机械用操作支援系统
CN109829486B (zh) * 2019-01-11 2021-05-28 新华三技术有限公司 图像处理方法和装置
CN111862521B (zh) * 2019-04-28 2022-07-05 杭州海康威视数字技术股份有限公司 行为热力图生成及报警方法、装置、电子设备及存储介质
CN112070623A (zh) * 2019-05-22 2020-12-11 北京京东尚科信息技术有限公司 热力分析方法、装置和系统
CN111738923B (zh) * 2020-06-19 2024-05-10 京东方科技集团股份有限公司 图像处理方法、设备及存储介质

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130091432A1 (en) * 2011-10-07 2013-04-11 Siemens Aktiengesellschaft Method and user interface for forensic video search
CN107315824A (zh) * 2017-07-04 2017-11-03 百度在线网络技术(北京)有限公司 用于生成热力图的方法和装置
CN111264056A (zh) * 2017-08-23 2020-06-09 美国西门子医学诊断股份有限公司 用于实验室工作流程的视觉系统
CN111274340A (zh) * 2020-01-15 2020-06-12 中国联合网络通信集团有限公司 人流密度的监控处理方法、设备及存储介质
CN112102307A (zh) * 2020-09-25 2020-12-18 杭州海康威视数字技术股份有限公司 全局区域的热度数据确定方法、装置及存储介质
CN111914819A (zh) * 2020-09-30 2020-11-10 杭州未名信科科技有限公司 一种多摄像头融合的人群密度预测方法、装置、存储介质及终端
CN112488913A (zh) * 2020-11-27 2021-03-12 杭州海康威视数字技术股份有限公司 数据处理方法、装置及电子设备
CN112822442A (zh) * 2020-12-31 2021-05-18 杭州海康威视数字技术股份有限公司 热度图生成方法、装置及电子设备

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4254314A4 *

Also Published As

Publication number Publication date
CN112488913A (zh) 2021-03-12
EP4254314A1 (en) 2023-10-04
EP4254314A4 (en) 2024-05-08


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21896776

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021896776

Country of ref document: EP

Effective date: 20230627