CN111915640B - Method and device for determining candidate frame scale, storage medium and electronic device - Google Patents

Method and device for determining candidate frame scale, storage medium and electronic device

Info

Publication number
CN111915640B
Authority
CN
China
Prior art keywords
imaging
target object
scale
determining
tracking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010803006.8A
Other languages
Chinese (zh)
Other versions
CN111915640A (en)
Inventor
郑少飞
潘华东
殷俊
张兴明
唐邦杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202010803006.8A priority Critical patent/CN111915640B/en
Publication of CN111915640A publication Critical patent/CN111915640A/en
Application granted granted Critical
Publication of CN111915640B publication Critical patent/CN111915640B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance

Abstract

The invention provides a method and a device for determining a candidate frame scale, a storage medium and an electronic device, wherein the method comprises the following steps: establishing a relation model of the imaging scale and the imaging position of a target object in a monitoring scene according to the imaging scale of the target object in the monitoring scene and the imaging position of the target object in the monitoring scene; and determining a candidate frame scale for tracking the target object according to the relation model. The method solves the problem that the scale estimation in target tracking is difficult to determine, so that the width and height of the target are difficult to determine during tracking and ideal target features cannot be extracted, a problem for which no effective solution currently exists, thereby achieving the effects of reducing the number of candidate frame scaling coefficients to be selected and reducing the amount of computation required for scale estimation.

Description

Method and device for determining candidate frame scale, storage medium and electronic device
Technical Field
The invention relates to the technical field of computer vision, in particular to a method and a device for determining a candidate frame scale, a storage medium and an electronic device.
Background
Target tracking is a fundamental vision problem and has long been a research hotspot for computer vision.
The task flow of target tracking is as follows: given the target position and size in the initial frame of a video, predict the position and size of the target in subsequent frames.
In the related art, the scale estimation for target tracking is difficult to determine, so the width and height of the target are difficult to determine during tracking and ideal target features cannot be extracted; no effective solution to this problem currently exists.
Disclosure of Invention
The embodiment of the invention provides a method and a device for determining the scale of a candidate frame, a storage medium and an electronic device, so as to at least solve the problem in the related art that the scale estimation of target tracking is difficult to determine, causing the width and height of the target to be difficult to determine during tracking and ideal target features not to be extracted.
According to an embodiment of the present invention, there is provided a method for determining a candidate frame scale, including: establishing a relation model of an imaging scale and an imaging position of a target object in a monitoring scene according to the imaging scale of the target object in the monitoring scene and the imaging position of the target object in the monitoring scene, wherein the imaging scale comprises: an imaging height of the target object and an imaging width of the target object, the imaging position comprising: the imaging position of the imaging height of the target object in a preset monitoring area and the imaging position of the imaging width of the target object in the preset monitoring area are determined by the monitoring scene; and determining a candidate frame scale for tracking the target object according to the relation model.
According to another embodiment of the present invention, there is provided a determination apparatus for a candidate frame scale, including: the modeling module is used for building a relation model of the imaging scale and the imaging position according to the imaging scale of the target object in the monitoring scene and the imaging position of the target object in the monitoring scene, wherein the imaging scale comprises: an imaging height of the target object and an imaging width of the target object, the imaging position comprising: the imaging position of the imaging height of the target object in a preset monitoring area and the imaging position of the imaging width of the target object in the preset monitoring area are determined by the monitoring scene; and the determining module is used for determining a candidate frame scale for tracking the target object according to the relation model.
According to a further embodiment of the invention, there is also provided a storage medium having stored therein a computer program, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
According to a further embodiment of the invention, there is also provided an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
According to the invention, a linear relation between the target position and the width and height of the target object is established, and this linear relation is applied to the selection of the candidate frame scale for target tracking. This solves the problem that the scale estimation of target tracking is difficult to determine, so that the width and height of the target are difficult to determine during tracking and ideal target features cannot be extracted, thereby reducing the number of candidate frame scaling coefficients to be selected and the amount of computation required for scale estimation.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a block diagram of a hardware architecture of a mobile terminal of a method for determining a candidate frame scale according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method of determining a candidate box scale according to an embodiment of the invention;
FIG. 3 is a block diagram of a determination apparatus of a candidate frame scale according to an embodiment of the present invention;
FIG. 4 is a schematic view of a scene used by a method for determining a candidate frame scale in an embodiment of the application;
FIG. 5 is a schematic diagram of determining target tracking candidate box dimensions according to an alternative embodiment of the invention.
Detailed Description
The invention will be described in detail hereinafter with reference to the drawings in conjunction with embodiments. It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order.
Example 1
The method embodiment provided in the first embodiment of the present application may be executed in a mobile terminal, a computer terminal or a similar computing device. Taking the operation on a mobile terminal as an example, fig. 1 is a block diagram of a hardware structure of a mobile terminal according to a method for determining a candidate frame scale according to an embodiment of the present invention. As shown in fig. 1, the mobile terminal 10 may include one or more (only one is shown in fig. 1) processors 102 (the processor 102 may include, but is not limited to, a microprocessor MCU or a processing device such as a programmable logic device FPGA) and a memory 104 for storing data, and optionally a transmission device 106 for communication functions and an input-output device 108. It will be appreciated by those skilled in the art that the structure shown in fig. 1 is merely illustrative and not limiting of the structure of the mobile terminal described above. For example, the mobile terminal 10 may also include more or fewer components than shown in FIG. 1 or have a different configuration than shown in FIG. 1.
The memory 104 may be used to store a computer program, for example, a software program of application software and a module, such as a computer program corresponding to a method for determining a candidate frame scale in an embodiment of the present invention, and the processor 102 executes the computer program stored in the memory 104, thereby performing various functional applications and data processing, that is, implementing the method described above. Memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile terminal 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission means 106 is arranged to receive or transmit data via a network. The specific examples of networks described above may include wireless networks provided by the communication provider of the mobile terminal 10. In one example, the transmission device 106 includes a network adapter (Network Interface Controller, simply referred to as NIC) that can connect to other network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet wirelessly.
In this embodiment, a method for determining a candidate frame size of the mobile terminal is provided, and fig. 2 is a flowchart of a method for determining a candidate frame size according to an embodiment of the present invention, as shown in fig. 2, where the flowchart includes the following steps:
step S202, a relation model of the imaging scale and the imaging position is built according to the imaging scale of a target object in a monitoring scene and the imaging position of the target object in the monitoring scene, wherein the imaging scale comprises: an imaging height of the target object and an imaging width of the target object, the imaging position comprising: the imaging position of the imaging height of the target object in a preset monitoring area and the imaging position of the imaging width of the target object in the preset monitoring area are determined by the monitoring scene;
and step S204, determining a candidate frame scale for tracking the target object according to the relation model.
In the above steps, considering that the imaging device of a video structural analysis system deployed at the front end is generally in a fixed position, the imaging background tends to be invariant. For target tracking, modeling is performed based on the monitoring scene, a relation between the target position and the target width and height is established, and this relation is applied to the selection of candidate frame scales for target tracking. Specifically, modeling is first performed according to the monitoring scene, establishing the relation between the imaging scale of a target object in the monitoring scene and the imaging position of the target object in the monitoring scene. The target frame is not adjusted after the target is located; instead, the scale of the candidate frame is determined according to the scene model while the candidate frame is being computed.
Through the above steps, a linear relation between the target position and the width and height of the target object is established, and this linear relation is applied to the selection of the candidate frame scale for target tracking. This can solve the problem that the scale estimation of target tracking is difficult to determine, so that the width and height of the target are difficult to determine during tracking and ideal target features cannot be extracted, achieving the effects of reducing the number of candidate frame scale coefficients to be selected and reducing the amount of computation required for scale estimation.
Preferably, the method further comprises: determining a minimum circumscribed rectangle in an effective monitoring area of the image acquisition equipment according to the position of the image acquisition equipment from a ground plane, wherein the length and the width of the minimum circumscribed rectangle are respectively used as the x axis and the y axis in the plane of the effective monitoring area; under the condition that the target object is vertically arranged relative to the ground plane, determining a first data pair according to an imaging position of the imaging height of the target object in a preset monitoring area, wherein the first data pair comprises: the coordinates of the x-axis and the imaging height of the target object; and under the condition that the target object is horizontally arranged relative to the ground plane, determining a second data pair according to an imaging position of the imaging width of the target object in a preset monitoring area, wherein the second data pair comprises: the coordinates of the y-axis and the imaging width of the target object.
Specifically, first modeling is performed according to the imaging height and x-axis coordinates of the target object. And under the condition that the target object is vertically arranged relative to the ground plane, determining a first data pair according to the imaging position of the imaging height of the target object in a preset monitoring area. And secondly, modeling according to the imaging width and the y-axis coordinate of the target object, and determining a second data pair according to the imaging position of the imaging width of the target object in a preset monitoring area under the condition that the target object is horizontally arranged relative to the ground plane.
Preferably, the method for establishing the linear model of the imaging scale and the imaging position of the target object in the monitored scene according to the imaging scale of the target object in the monitored scene and the imaging position of the target object in the monitored scene comprises the following steps: determining a first data pair according to an imaging position of the imaging height of the target object in a preset monitoring area; and establishing a one-dimensional linear relation model of the imaging scale and the imaging position by taking the first data pair as an independent variable and the imaging height of the target object as a dependent variable.
Specifically, a first data pair is determined according to an imaging position of the imaging height of the target object in a preset monitoring area, and parameters in a one-dimensional linear relation model can be determined by taking the first data pair as an independent variable and the imaging height of the target object as a dependent variable to establish the one-dimensional linear relation model of the imaging scale and the imaging position.
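As a concrete illustration of this step, the following Python sketch (hypothetical, not part of the patent text) fits the one-dimensional linear relation model between imaging height and x-axis coordinate for a single test object by ordinary least squares; the function name and the sample measurements are assumptions made only for illustration.

import numpy as np

def fit_height_model(x_coords, pixel_heights):
    # Least-squares fit of imaging height (dependent variable) against the
    # x-axis coordinate (independent variable) for one test object.
    x = np.asarray(x_coords, dtype=float)
    h = np.asarray(pixel_heights, dtype=float)
    k_h, b_h = np.polyfit(x, h, deg=1)  # degree-1 fit returns (slope, intercept)
    return k_h, b_h

# Example with made-up measurements of one test object placed at several
# positions of the effective monitoring area:
k_h, b_h = fit_height_model([100, 300, 500, 700], [180, 150, 120, 90])
print(k_h, b_h)  # the slope is negative here: the object images smaller farther away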
Preferably, the method for establishing the linear model of the imaging scale and the imaging position of the target object in the monitored scene according to the imaging scale of the target object in the monitored scene and the imaging position of the target object in the monitored scene comprises the following steps: determining a second data pair according to an imaging position of the imaging width of the target object within a preset monitoring area range; and establishing a one-dimensional linear relation model of the imaging scale and the imaging position by taking the second data pair as an independent variable and the imaging width of the target object as a dependent variable.
Specifically, a second data pair is determined according to an imaging position of the imaging width of the target object within a preset monitoring area, and parameters in a one-dimensional linear relation model can be determined by taking the second data pair as an independent variable and the imaging width of the target object as a dependent variable to establish the one-dimensional linear relation model of the imaging scale and the imaging position.
Optionally, determining a candidate frame scale for target tracking according to the relation model includes: acquiring the tracking frame position of the initial frame of target tracking; determining a change coefficient corresponding to the relation model according to the tracking frame position of the initial frame; and determining the size of the candidate frame scale according to the change coefficient. That is, the tracking frame position of the initial frame of target tracking is acquired, the change coefficient corresponding to the relation model is determined according to that tracking frame position, and the size of the candidate frame scale is then determined according to the change coefficient.
Optionally, determining a candidate frame scale of the target tracking according to the linear relation model includes: acquiring the tracking frame position of the initial frame of target tracking; determining a height change coefficient corresponding to the relation model according to the height of the tracking frame of the initial frame; and determining the height of the candidate frame scale according to the height change coefficient. That is, after the tracking frame position of the initial frame is acquired and the height change coefficient corresponding to the relation model is determined from the height of that tracking frame, the height of the candidate frame scale is determined accordingly.
Optionally, determining a candidate frame scale of the target tracking according to the linear relation model includes: acquiring the tracking frame position of the initial frame of target tracking; determining a width change coefficient corresponding to the relation model according to the width of the tracking frame of the initial frame; and determining the width of the candidate frame scale according to the width change coefficient. That is, after the tracking frame position of the initial frame is acquired and the width change coefficient corresponding to the relation model is determined from the width of that tracking frame, the width of the candidate frame scale is determined accordingly.
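A minimal sketch of this determination step is given below, assuming the scene-level slopes k_h and k_w have already been obtained from the relation model; the helper name and the box representation are illustrative assumptions rather than the patent's required implementation.

def candidate_scale(init_x, init_h, init_w, cand_x, k_h, k_w):
    # Change coefficients (intercepts) recovered from the initial tracking frame:
    b_h = init_h - k_h * init_x  # height change coefficient
    b_w = init_w - k_w * init_x  # width change coefficient
    # Candidate frame scale (height, width) at the candidate x-coordinate:
    return k_h * cand_x + b_h, k_w * cand_x + b_w

# Example: initial tracking frame at x = 220 with height 120 and width 60,
# candidate frame positioned at x = 260 (all values assumed):
print(candidate_scale(220, 120, 60, 260, k_h=-0.15, k_w=-0.07))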
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general hardware platform, but of course also by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method according to the embodiments of the present invention.
Example 2
The embodiment also provides a device for determining the dimensions of the candidate frames, which is used for implementing the above embodiment and the preferred implementation, and is not described in detail. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 3 is a block diagram of a determination apparatus for a candidate frame scale according to an embodiment of the present invention, as shown in fig. 3, the apparatus including:
a modeling module 30, configured to build a relational model of an imaging scale of a target object in a monitored scene and an imaging position of the target object in the monitored scene according to the imaging scale and the imaging position, where the imaging scale includes: an imaging height of the target object and an imaging width of the target object, the imaging position comprising: the imaging position of the imaging height of the target object in a preset monitoring area and the imaging position of the imaging width of the target object in the preset monitoring area are determined by the monitoring scene;
a determination module 32 for determining a candidate frame scale for tracking the target object based on the relationship model.
In the above modules, considering that the imaging device of a video structural analysis system is typically deployed in a fixed position, the imaging background tends to be invariant. For target tracking, modeling is performed based on the monitoring scene, a relation between the target position and the target width and height is established, and this relation is applied to the selection of candidate frame scales for target tracking. Specifically, modeling is first performed according to the monitoring scene, establishing the relation between the imaging scale of a target object in the monitoring scene and the imaging position of the target object in the monitoring scene. The target frame is not adjusted after the target is located; instead, the scale of the candidate frame is determined according to the scene model while the candidate frame is being computed.
Through the above modules, a linear relation between the target position and the width and height of the target object is established, and this linear relation is applied to the selection of the candidate frame scale for target tracking. This can solve the problem that the scale estimation of target tracking is difficult to determine, so that the width and height of the target are difficult to determine during tracking and ideal target features cannot be extracted, achieving the effects of reducing the number of candidate frame scale coefficients to be selected and reducing the amount of computation required for scale estimation.
The device further comprises: a first determining module, configured to determine a minimum circumscribed rectangle in an effective monitoring area of the image acquisition equipment according to the position of the image acquisition equipment from the ground plane, wherein the length and the width of the minimum circumscribed rectangle are respectively used as the x axis and the y axis in the plane of the effective monitoring area; a second determining module, configured to determine, in a case where the target object is disposed vertically with respect to the ground plane, a first data pair according to an imaging position of the imaging height of the target object in a preset monitoring area, where the first data pair includes: the coordinates of the x-axis and the imaging height of the target object; and a third determining module, configured to determine, in a case where the target object is set horizontally with respect to the ground plane, a second data pair according to an imaging position of the imaging width of the target object in a preset monitoring area, where the second data pair includes: the coordinates of the y-axis and the imaging width of the target object.
The modeling module 30 is configured to determine a first data pair according to an imaging position of the imaging height of the target object in a preset monitoring area; and to establish a one-dimensional linear relation model of the imaging scale and the imaging position by taking the first data pair as an independent variable and the imaging height of the target object as a dependent variable.
The modeling module 30 is configured to determine a second data pair according to an imaging position of the imaging width of the target object within a preset monitoring area range; and to establish a one-dimensional linear relation model of the imaging scale and the imaging position by taking the second data pair as an independent variable and the imaging width of the target object as a dependent variable.
The determining module 32 is configured to obtain a tracking frame position of an initial frame of the target tracking; determining a change coefficient corresponding to the relation model according to the tracking frame position of the initial frame; and determining the size of the candidate frame scale according to the change coefficient.
The determining module 32 obtains the tracking frame position of the initial frame of the target tracking; determining a height change coefficient corresponding to the relation model according to the height position of the tracking frame of the initial frame; and determining the height of the candidate frame scale according to the height change coefficient.
The determining module 32 obtains the tracking frame position of the initial frame of the target tracking; determining a width change coefficient corresponding to the relation model according to the width of the tracking frame position of the initial frame; and determining the width of the candidate frame scale according to the width change coefficient.
It should be noted that each of the above modules may be implemented by software or hardware, and for the latter, it may be implemented by, but not limited to: the modules are all located in the same processor; alternatively, the above modules may be located in different processors in any combination.
In order to better understand the above-mentioned determination method flow of the candidate frame scale, the following explanation is given with reference to the preferred embodiment, but the technical solution of the embodiment of the present invention is not limited.
The preferred embodiment of the invention is a method for defining the scale of a target tracking candidate frame based on geometric scene modeling. The method first performs modeling according to the monitoring scene and establishes a relation between the scale of the target bounding box and its position in the image. This model relation is then applied to the candidate box scale selection for target tracking.
Fig. 4 is a schematic diagram of a scene used by the method for determining a candidate frame scale in an embodiment of the present application.
First, the preconditions for scene modeling must be satisfied: 1) the three-dimensional position of the image acquisition device is fixed, its focal length is fixed, and there is no large depression angle (the depression angle is kept at about 25-30 degrees); 2) the captured scene is approximately parallel to the ground plane. A typical application scenario can reasonably satisfy both assumptions.
The monitoring scene of a typical application scenario is modeled. As shown in fig. 4, C is the image acquisition device and OE is the height of the image acquisition device above the ground plane. P1P2P3P4 represents the scene ground, and P5P6P7P8 is the minimum circumscribed rectangle of the effective area of the related task, whose four sides are parallel to the four sides of the two-dimensional image. The straight line P5P6 is taken as the x-axis and the straight line P7P8 as the y-axis. The line segment AB is the height of object L, whose distance from OE is OB, and CD represents the imaging height when object L is placed farther away; that is, the same object appears with different heights at different positions in the image. On the premise that the scene modeling assumptions are satisfied, this height varies linearly with the distance. Specifically, establishing a relation model of the imaging scale and the imaging position according to the imaging scale of a target object in the monitoring scene and the imaging position of the target object in the monitoring scene comprises:
First, the target height and the x-axis coordinate are modeled. A plurality of test objects of different heights are selected and denoted as O = {o_1, o_2, ..., o_S}, where o_i denotes an object of height h_i and h_1 < h_2 < ... < h_S.
It should be noted that the test objects of different heights need to ensure that 1) the object heights are mutually different, and 2) the height range is large enough to cover the heights of the objects to be tracked in actual use.
The object o_i is placed perpendicular to the ground plane at different positions of the effective area, and data pairs of the x-axis coordinate and the pixel height of the object in the image are recorded, expressed as:

{(x_{i,j}, h_{i,j})}, j = 1, 2, ..., n_i

wherein the coordinate is used as the independent variable, i.e. the variable that causes the height to change, and the height is used as the dependent variable, i.e. the variable that changes as the coordinate changes.

A one-dimensional linear model is constructed from the coordinate and the height:

h = k_h^i · x + b_h^i

Fitting by the least square method gives:

k_h^i = Σ_j (x_{i,j} − x̄_i)(h_{i,j} − h̄_i) / Σ_j (x_{i,j} − x̄_i)²

b_h^i = h̄_i − k_h^i · x̄_i

where x̄_i and h̄_i are the means of the recorded coordinates and pixel heights of object o_i. The parameters of the one-dimensional models of the other test objects are calculated in the same way. Owing to the linear variation characteristic, the slopes calculated for all objects are theoretically equal, and

k_h = (1/S) · Σ_{i=1}^{S} k_h^i

is taken as the slope of the linear model for all test objects. In the same way, the objects are placed horizontally and the relation between the width and the x-axis coordinate is calculated to obtain k_w and the corresponding intercept parameters b_w^i.
Through the above calculation, when the x coordinate of an object in the image and its height h are known, the parameter b_h corresponding to that object can be calculated from the linear model, thereby determining the corresponding linear model. Similarly, the parameter b_w can be determined. On the premise that the scene modeling assumptions are satisfied, a two-dimensional coordinate system is established from the minimum circumscribed rectangle of the effective area, and scene-adaptive modeling is performed by collecting statistics of the heights and widths of objects of different heights and widths at different positions in this two-dimensional coordinate system.
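The scene-adaptive modeling described above can be sketched as follows (assumed data layout and helper name; not taken from the patent): a one-dimensional model is fitted for each of the S test objects and the resulting slopes are averaged, since under the modeling assumptions all slopes are theoretically equal.

import numpy as np

def fit_scene_slope(per_object_pairs):
    # per_object_pairs: one list per test object, each containing (x, pixel_height)
    # pairs recorded at different positions of the effective area.
    slopes = []
    for pairs in per_object_pairs:
        x, h = np.asarray(pairs, dtype=float).T
        k_i, _b_i = np.polyfit(x, h, deg=1)  # per-object least-squares fit
        slopes.append(k_i)
    return float(np.mean(slopes))  # k_h = (1/S) * sum of the per-object slopes

# The width slope k_w would be obtained in the same way from (coordinate, pixel_width)
# pairs recorded with the test objects placed horizontally.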
Fig. 5 is a schematic diagram of determining the target tracking candidate box scale. After scene modeling is completed, it is applied to candidate box scale selection for target tracking. Assume the current frame is the i-th frame and the target position and tracking frame have been determined. As shown in fig. 5, frame a is the target tracking result of the current frame, and the x-axis coordinate corresponding to frame a is x_1. Frame b is a target candidate frame, and the x-axis coordinate corresponding to this candidate frame is x_2. The height and width of frame a are H and W respectively. From the linear model H = k_h · x_1 + b_h, the parameter b_h can be calculated; similarly, b_w can be calculated. Since the deformation of the target between adjacent frames is small, the width and height of the target between adjacent frames can be considered consistent with the scene model. Using the coefficients of the scene model, the height and width corresponding to the candidate frame (the yellow frame b in fig. 5) are H' = k_h · x_2 + b_h and W' = k_w · x_2 + b_w. Similarly, the height and width of candidate frames at other positions can be calculated. Calculating the scale of the target candidate frame in this way, instead of setting a plurality of scale coefficients and performing a large number of calculations, saves computing resources on the one hand and improves the accuracy of the scale on the other.
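The fig. 5 procedure can be walked through with the short sketch below; the slopes and box values are assumed numbers, and the variable names x1, x2, H, W follow the description above. Once b_h and b_w are recovered from the tracked frame of the current frame, every candidate position is evaluated directly instead of enumerating multiple scale coefficients.

import numpy as np

k_h, k_w = -0.15, -0.07          # scene-model slopes obtained from scene modeling
x1, H, W = 220.0, 120.0, 60.0    # frame a: tracked box of the current frame
b_h = H - k_h * x1               # from H = k_h * x1 + b_h
b_w = W - k_w * x1               # from W = k_w * x1 + b_w

x2 = np.array([200.0, 240.0, 260.0])  # x-coordinates of candidate frames (frame b, ...)
H_cand = k_h * x2 + b_h          # candidate heights H'
W_cand = k_w * x2 + b_w          # candidate widths W'
print(list(zip(H_cand, W_cand)))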
The scene self-adaptive modeling is applied to candidate frame selection of target tracking, and the scale of the candidate frame is determined according to the positions of different candidate frames and the target scale of the previous frame.
An embodiment of the invention also provides a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above-described storage medium may be configured to store a computer program for performing the steps of:
s1, establishing a relation model of an imaging scale and an imaging position of a target object in a monitoring scene according to the imaging scale of the target object in the monitoring scene and the imaging position of the target object in the monitoring scene, wherein the imaging scale comprises: an imaging height of the target object and an imaging width of the target object, the imaging position comprising: the imaging position of the imaging height of the target object in a preset monitoring area and the imaging position of the imaging width of the target object in the preset monitoring area are determined by the monitoring scene;
s2, determining a candidate frame scale for tracking the target object according to the relation model.
Optionally, the storage medium is further arranged to store a computer program for performing the steps of:
s1, determining a minimum circumscribed rectangle in an effective monitoring area of image acquisition equipment according to the position of the image acquisition equipment from a ground plane, wherein the length and the width of the minimum circumscribed rectangle are respectively used as an x axis and a y axis in the plane of the effective monitoring area;
s2, under the condition that the target object is vertically arranged relative to the ground plane, determining a first data pair according to an imaging position of the imaging height of the target object in a preset monitoring area, wherein the first data pair comprises: the coordinates of the x-axis and the imaging height of the target object;
s3, determining a second data pair according to an imaging position of the imaging width of the target object in a preset monitoring area under the condition that the target object is horizontally arranged relative to the ground plane, wherein the first data pair comprises: the y-axis coordinates and the imaging width of the target object
Alternatively, in the present embodiment, the storage medium may include, but is not limited to: a usb disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing a computer program.
An embodiment of the invention also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
s1, establishing a relation model of an imaging scale and an imaging position of a target object in a monitoring scene according to the imaging scale of the target object in the monitoring scene and the imaging position of the target object in the monitoring scene, wherein the imaging scale comprises: an imaging height of the target object and an imaging width of the target object, the imaging position comprising: the imaging position of the imaging height of the target object in a preset monitoring area and the imaging position of the imaging width of the target object in the preset monitoring area are determined by the monitoring scene;
s2, determining a candidate frame scale for tracking the target object according to the relation model.
Alternatively, specific examples in this embodiment may refer to examples described in the foregoing embodiments and optional implementations, and this embodiment is not described herein.
It will be appreciated by those skilled in the art that the modules or steps of the invention described above may be implemented in a general purpose computing device, they may be concentrated on a single computing device, or distributed across a network of computing devices, they may alternatively be implemented in program code executable by computing devices, so that they may be stored in a memory device for execution by computing devices, and in some cases, the steps shown or described may be performed in a different order than that shown or described, or they may be separately fabricated into individual integrated circuit modules, or multiple modules or steps within them may be fabricated into a single integrated circuit module for implementation. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only of the preferred embodiments of the present invention and is not intended to limit the present invention, but various modifications and variations can be made to the present invention by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method for determining a candidate frame scale, comprising:
according to an imaging scale of a target object in a monitored scene and an imaging position of the target object in the monitored scene, establishing a relation model of the imaging scale and the imaging position, wherein the imaging scale comprises: an imaging height of the target object and an imaging width of the target object, the imaging position comprising: an imaging position of the imaging height of the target object in a preset monitoring area and an imaging position of the imaging width of the target object in the preset monitoring area, the preset monitoring area being determined by the monitoring scene;
determining a candidate frame scale for tracking the target object according to the relation model and the tracking frame position of the target object in an initial frame;
wherein establishing a relationship model of the imaging scale and the imaging position comprises:
establishing a relation model of the imaging height and the imaging position according to the following formula:

y = k_h · x + b_h

wherein y is the imaging height, x is the imaging position, and k_h and b_h are parameters for establishing the relation model for the target object; wherein k_h is obtained as follows:

k_h = (1/S) · Σ_{i=1}^{S} k_h^i

wherein S is the number of the test objects o_i, k_h^i is a parameter of a one-dimensional model constructed for the i-th test object o_i of the S test objects, and k_h^i is a slope obtained by least-square fitting a set of test data pairs for the test object o_i, each test data pair of the set comprising a coordinate position of the test object o_i and the corresponding object pixel height, and i is a positive integer greater than or equal to 1 and less than or equal to S.
2. The method as recited in claim 1, further comprising:
determining a minimum circumscribed rectangle in an effective monitoring area of the image acquisition equipment according to the position of the image acquisition equipment from a ground plane, wherein the length and the width of the minimum circumscribed rectangle are respectively used as an x axis and a y axis in the plane of the effective monitoring area;
under the condition that the target object is vertically arranged relative to the ground plane, determining a first data pair according to an imaging position of the imaging height of the target object in a preset monitoring area, wherein the first data pair comprises: the coordinates of the x-axis and the imaging height of the target object;
and under the condition that the target object is horizontally arranged relative to the ground plane, determining a second data pair according to the imaging position of the imaging width of the target object in a preset monitoring area, wherein the second data pair comprises: the coordinates of the y-axis and the imaging width of the target object.
3. The method according to claim 1 or 2, wherein building a linear model of an imaging scale of a target object in a monitored scene and an imaging position of the target object in the monitored scene from the imaging scale and the imaging position comprises:
determining a first data pair according to an imaging position of the imaging height of the target object in a preset monitoring area;
and establishing a one-dimensional linear relation model of the imaging scale and the imaging position by taking the first data pair as an independent variable and the imaging height of the target object as a dependent variable.
4. The method according to claim 1 or 2, wherein building a linear model of an imaging scale of a target object in a monitored scene and an imaging position of the target object in the monitored scene from the imaging scale and the imaging position comprises:
determining a second data pair according to an imaging position of the imaging width of the target object within a preset monitoring area range;
and establishing a one-dimensional linear relation model of the imaging scale and the imaging position by taking the second data pair as an independent variable and the imaging width of the target object as a dependent variable.
5. The method of claim 1, wherein determining a candidate frame scale for target tracking based on the relationship model and a tracking frame position of the target object in an initial frame comprises:
acquiring the tracking frame position of the initial frame of target tracking;
determining a change coefficient corresponding to the relation model according to the tracking frame position of the initial frame;
and determining the size of the candidate frame scale according to the change coefficient.
6. The method of claim 1 or 5, wherein determining a candidate frame scale for target tracking based on the relationship model and a tracking frame position of the target object in an initial frame comprises:
acquiring the tracking frame position of the initial frame of target tracking;
determining a height change coefficient corresponding to the relation model according to the height position of the tracking frame of the initial frame;
and determining the height of the candidate frame scale according to the height change coefficient.
7. The method of claim 1 or 5, wherein determining a candidate frame scale for target tracking based on the relationship model and a tracking frame position of the target object in an initial frame comprises:
acquiring the tracking frame position of the initial frame of target tracking;
determining a width change coefficient corresponding to the relation model according to the width of the tracking frame position of the initial frame;
and determining the width of the candidate frame scale according to the width change coefficient.
8. A device for determining a candidate frame scale, comprising:
the modeling module is used for building a relation model of the imaging scale and the imaging position according to the imaging scale of the target object in the monitoring scene and the imaging position of the target object in the monitoring scene, wherein the imaging scale comprises: an imaging height of the target object and an imaging width of the target object, the imaging position comprising: the imaging position of the imaging height of the target object in a preset monitoring area and the imaging position of the imaging width of the target object in the preset monitoring area are determined by the monitoring scene;
the determining module is used for determining a candidate frame scale for tracking the target object according to the relation model and the tracking frame position of the target object in an initial frame;
wherein the modeling module may build a relational model of the imaging height and the imaging position by:
establishing a relation model of the imaging height and the imaging position according to the following formula:

y = k_h · x + b_h

wherein y is the imaging height, x is the imaging position, and k_h and b_h are parameters for establishing the relation model for the target object; wherein k_h is obtained as follows:

k_h = (1/S) · Σ_{i=1}^{S} k_h^i

wherein S is the number of the test objects o_i, k_h^i is a parameter of a one-dimensional model constructed for the i-th test object o_i of the S test objects, and k_h^i is a slope obtained by least-square fitting a set of test data pairs for the test object o_i, each test data pair of the set comprising a coordinate position of the test object o_i and the corresponding object pixel height, and i is a positive integer greater than or equal to 1 and less than or equal to S.
9. A storage medium having a computer program stored therein, wherein the computer program is arranged to perform the method of any of claims 1 to 7 when run.
10. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to run the computer program to perform the method of any of the claims 1 to 7.
CN202010803006.8A 2020-08-11 2020-08-11 Method and device for determining candidate frame scale, storage medium and electronic device Active CN111915640B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010803006.8A CN111915640B (en) 2020-08-11 2020-08-11 Method and device for determining candidate frame scale, storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010803006.8A CN111915640B (en) 2020-08-11 2020-08-11 Method and device for determining candidate frame scale, storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN111915640A CN111915640A (en) 2020-11-10
CN111915640B true CN111915640B (en) 2023-06-13

Family

ID=73284182

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010803006.8A Active CN111915640B (en) 2020-08-11 2020-08-11 Method and device for determining candidate frame scale, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN111915640B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101183428A (en) * 2007-12-18 2008-05-21 北京中星微电子有限公司 Image detection method and apparatus

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103065163B (en) * 2013-02-04 2015-10-14 成都神州数码索贝科技有限公司 A kind of fast target based on static images detects recognition system and method
CN104680478B (en) * 2015-02-15 2018-08-21 青岛海信移动通信技术股份有限公司 A kind of choosing method and device of destination image data
CN107316332A (en) * 2017-05-16 2017-11-03 深圳市保千里电子有限公司 The camera and scene relating scaling method and system of a kind of application intelligent driving
CN109903331B (en) * 2019-01-08 2020-12-22 杭州电子科技大学 Convolutional neural network target detection method based on RGB-D camera

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101183428A (en) * 2007-12-18 2008-05-21 北京中星微电子有限公司 Image detection method and apparatus

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Ning Jifeng et al. Scale and orientation adaptive mean shift tracking. IET Computer Vision. 2012, pp. 52-61. *
单玉刚 et al. Robust target tracking method with adaptive scale and orientation (鲁棒的自适应尺度和方向的目标跟踪方法). Computer Engineering and Applications. 2018, No. 21, pp. 213-221. *

Also Published As

Publication number Publication date
CN111915640A (en) 2020-11-10

Similar Documents

Publication Publication Date Title
CN111340864A (en) Monocular estimation-based three-dimensional scene fusion method and device
US8897539B2 (en) Using images to create measurements of structures through the videogrammetric process
CN111311632B (en) Object pose tracking method, device and equipment
CN108898171B (en) Image recognition processing method, system and computer readable storage medium
JP2017011431A (en) Image processing device, image processing method and program
CN111310727B (en) Object detection method and device, storage medium and electronic device
CN112270736A (en) Augmented reality processing method and device, storage medium and electronic equipment
CN112613381A (en) Image mapping method and device, storage medium and electronic device
CN111294563B (en) Video monitoring method and device, storage medium and electronic device
CN114972027A (en) Image splicing method, device, equipment, medium and computer product
CN111915640B (en) Method and device for determining candidate frame scale, storage medium and electronic device
CN109345560B (en) Motion tracking precision testing method and device of augmented reality equipment
CN114611635B (en) Object identification method and device, storage medium and electronic device
CN110458857A (en) Central symmetry pel detection method, device, electronic equipment and readable storage medium storing program for executing
WO2022227875A1 (en) Three-dimensional imaging method, apparatus, and device, and storage medium
CN113496527B (en) Vehicle surrounding image calibration method, device and system and storage medium
CN116091653A (en) Configuration diagram generation method and device of operation and maintenance equipment, terminal equipment and storage medium
CN113378864B (en) Method, device and equipment for determining anchor frame parameters and readable storage medium
CN111507894B (en) Image stitching processing method and device
CN114913246A (en) Camera calibration method and device, electronic equipment and storage medium
CN112991463A (en) Camera calibration method, device, equipment, storage medium and program product
CN110189396B (en) Scene graph construction method, system and related device based on vision
CN113487685A (en) Calibration method, device and equipment of line laser scanning camera and storage medium
CN112232170A (en) Method and device for determining object behaviors, storage medium and electronic device
CN112163519A (en) Image mapping processing method, device, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant