CN111323006A - Target positioning method and device and electronic equipment - Google Patents

Target positioning method and device and electronic equipment

Info

Publication number
CN111323006A
Authority
CN
China
Prior art keywords
target
pixel coordinates
point
pixel
monitoring picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811545183.XA
Other languages
Chinese (zh)
Inventor
王超
郑经国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision System Technology Co Ltd
Original Assignee
Hangzhou Hikvision System Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision System Technology Co Ltd filed Critical Hangzhou Hikvision System Technology Co Ltd
Priority to CN201811545183.XA priority Critical patent/CN111323006A/en
Publication of CN111323006A publication Critical patent/CN111323006A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention provides a target positioning method, a target positioning device and electronic equipment. The method comprises the following steps: acquiring pixel coordinates of a target in a monitoring picture, wherein the monitoring picture comprises a plurality of calibration points with known positions; and calculating the position of the target based on the corresponding relation between the pixel coordinates and the positions of the calibration points and the pixel coordinates of the target. The target can be positioned based on the monitoring picture, namely the target can be positioned under the condition that the target does not need to carry terminal equipment.

Description

Target positioning method and device and electronic equipment
Technical Field
The present invention relates to the field of video analysis technologies, and in particular, to a target positioning method and apparatus, and an electronic device.
Background
In order to realize intelligent management of targets (such as personnel and vehicles), terminal equipment with a signal sending function can be equipped (or installed) for the targets, and the positioning base station can position the terminal equipment by sensing signals sent by the terminal equipment, so that the positions of the targets are determined.
However, in some application scenarios, it is difficult to require the target to be equipped with the terminal device (or to install the terminal device at the target), and in these application scenarios, it may be difficult to achieve target location.
Disclosure of Invention
The embodiment of the invention aims to provide a target positioning method, a target positioning device and electronic equipment, so as to position a target under the condition that the target does not carry (or is provided with) terminal equipment. The specific technical scheme is as follows:
in a first aspect of the embodiments of the present invention, a target positioning method is provided, where the method includes:
acquiring pixel coordinates of a target in a monitoring picture, wherein the monitoring picture comprises a plurality of calibration points with known positions;
and calculating the position of the target based on the corresponding relation between the pixel coordinates and the positions of the calibration points and the pixel coordinates of the target.
With reference to the first aspect, in a first possible implementation manner, the calculating a position of the target based on a correspondence between pixel coordinates and positions of the calibration points and pixel coordinates of the target includes:
and mapping the pixel coordinates of the target through a mapping equation to obtain the position of the target, wherein the mapping equation is obtained by fitting the corresponding relation between the pixel coordinates and the position of the calibration point.
With reference to the first aspect, in a second possible implementation manner, the calculating the position of the target based on the correspondence between the pixel coordinates and the position of the calibrated point and the pixel coordinates of the target includes:
and obtaining the position corresponding to the pixel coordinate of the target as the position of the target through interpolation calculation based on the corresponding relation between the pixel coordinate and the position of the calibration point.
With reference to the first aspect, in a third possible implementation manner, the calibration points are selected by:
dividing a monitoring picture into a plurality of grids;
and selecting a plurality of vertexes from the vertexes of the grids as calibration points.
With reference to the third possible implementation manner of the first aspect, in a fourth possible implementation manner, the calculating a position of the target based on a correspondence between pixel coordinates and positions of the calibration points and pixel coordinates of the target includes:
and taking the position of a preset specific point in the grid to which the pixel coordinate of the target belongs as the position of the target, wherein the position of the specific point is calculated based on the corresponding relation between the pixel coordinate and the position of the calibration point.
With reference to the first aspect, in a fifth possible implementation manner, before the obtaining the pixel coordinates of the target in the monitoring picture, the method further includes:
and identifying the target existing in the monitoring picture by using a preset target identification algorithm.
In a second aspect of embodiments of the present invention, there is provided an object localization apparatus, the apparatus comprising:
the system comprises a data acquisition module, a data acquisition module and a data processing module, wherein the data acquisition module is used for acquiring pixel coordinates of a target in a monitoring picture, and the monitoring picture comprises a plurality of calibration points with known positions;
and the position determining module is used for calculating the position of the target based on the corresponding relation between the pixel coordinates and the positions of the calibration points and the pixel coordinates of the target.
With reference to the first aspect, in a sixth possible implementation manner, the method further includes:
and acquiring the type of the target, and displaying the target on a map according to the position of the target and the type of the target.
With reference to the second aspect, in a first possible implementation manner, the position determining module is specifically configured to map the pixel coordinates of the target through a mapping equation to obtain the position of the target, where the mapping equation is obtained by fitting a corresponding relationship between the pixel coordinates and the position of the calibration point.
With reference to the second aspect, in a second possible implementation manner, the position determining module is specifically configured to obtain, as the position of the target, a position corresponding to the pixel coordinate of the target through interpolation calculation based on a corresponding relationship between the pixel coordinate and the position of the calibration point.
With reference to the second aspect, in a third possible implementation manner, the calibration points are selected by:
dividing a monitoring picture into a plurality of grids;
and selecting a plurality of vertexes from the vertexes of the grids as calibration points.
With reference to the third possible implementation manner of the second aspect, in a fourth possible implementation manner, the position determining module is specifically configured to use, as the position of the target, a position of a specific point preset in a grid to which the pixel coordinate of the target belongs, where the position of the specific point is calculated based on a corresponding relationship between the pixel coordinate and the position of the calibration point.
With reference to the second aspect, in a fifth possible implementation manner, the apparatus further includes a target identification module, configured to identify, by using a preset target identification algorithm, a target existing in the monitoring picture before the pixel coordinates of the target in the monitoring picture are obtained.
With reference to the second aspect, in a sixth possible implementation manner, the apparatus further includes a target display module, configured to obtain a type to which the target belongs, and display the target on the map according to the position of the target and the type to which the target belongs.
In a third aspect of embodiments of the present invention, there is provided an electronic device, including:
a memory for storing a computer program;
and the processor is used for realizing any one of the target positioning methods when executing the program stored in the memory.
In a fourth aspect of the embodiments of the present invention, a computer-readable storage medium is provided, in which a computer program is stored, and the computer program, when executed by a processor, implements any of the above-mentioned object positioning methods.
The target positioning method, the target positioning device and the electronic equipment provided by the embodiment of the invention can position the target based on the monitoring picture, namely the target can be positioned under the condition that the target does not need to carry terminal equipment. Of course, not all of the advantages described above need to be achieved at the same time in the practice of any one product or method of the invention.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic flow chart of a target positioning method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a distribution of calibration points according to an embodiment of the present invention;
fig. 3 is a schematic flowchart of a target positioning method according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a target positioning system according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of an object location system according to an embodiment of the present invention;
fig. 6 is a schematic flowchart of a target positioning method according to an embodiment of the present invention;
fig. 7 is another schematic flow chart of a target positioning method according to an embodiment of the present invention;
FIG. 8a is a schematic structural diagram of an object-locating device according to an embodiment of the present invention;
FIG. 8b is a schematic structural diagram of an object-locating device according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a schematic flow chart of a target positioning method according to an embodiment of the present invention, where the method may be applied to a camera, and may also be applied to an electronic device having a camera function or connected to the camera, and the method may include:
s101, obtaining pixel coordinates of a target in a monitoring picture, wherein the monitoring picture comprises a plurality of calibration points with known positions.
The monitoring picture is obtained by a camera shooting a specific monitoring scene. According to actual requirements, the target can be different kinds of objects, such as people, vehicles and cats, and there can be one kind of target or multiple kinds of targets, for example a person, or a person and a vehicle. The method can be applied to the camera that shoots the monitoring picture, and can also be applied to electronic devices other than that camera.
If the method is applied to a camera for shooting a monitored picture, the target identification can be performed on the monitored picture to acquire the pixel coordinates of the target in the monitored picture. If the method is applied to other electronic equipment except the camera for shooting the monitored picture, the method can be used for receiving the monitored picture sent by the camera for shooting the monitored picture and carrying out target identification on the monitored picture to obtain the pixel coordinates of the target in the monitored picture, or can be used for obtaining the pixel coordinates of the target in the monitored picture sent by the camera for shooting the monitored picture.
The positions of the calibration points may be measured by an electronic device with a position measurement function, and the position of a calibration point may be represented in the form of coordinates. For convenience of discussion, the coordinates representing the position of a calibration point are referred to as world coordinates; the coordinate axes and coordinate origin of the world coordinates may be set according to the actual needs of the user. For example, the world coordinates may be GPS (Global Positioning System) coordinates.
The calibration point may be a plurality of points that are specified in advance by the user in the monitoring screen, and may be, for example, a plurality of markers that are placed in the monitoring scene by the user in advance, and the GPS coordinates of the plurality of markers are measured as the positions of the plurality of markers by using the electronic device with GPS positioning function. And for each marker, finding the image of the marker in the monitoring picture, taking the image of the marker as a calibration point, and taking the position of the marker as the position of the calibration point. Or the user may specify a plurality of points in the monitoring screen as the calibration points, find the spatial points corresponding to the plurality of calibration points in the monitoring scene, and measure the positions of the plurality of spatial points by using the electronic device with the GPS positioning function as the positions of the calibration points.
Further, in an alternative embodiment, the monitoring picture may be divided into a plurality of grids, and a plurality of vertices may be selected from the vertices of the grids as the calibration points. Different grids may be of the same size or of different sizes. For example, the monitoring picture may be divided into grids by 5 horizontal lines and 6 vertical lines, giving 20 grid cells and 30 vertices; n of the 30 vertices may be selected as calibration points, where n is an integer greater than 1 and not greater than 30, and when n equals 30 all the vertices are used as calibration points.
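As a concrete illustration of this grid division, the following Python sketch (not part of the patent text) divides a monitoring picture of an assumed size into grid cells and collects the grid vertices as candidate calibration points; the picture size and cell counts are illustrative assumptions.

def grid_vertices(width, height, cells_x, cells_y):
    # Pixel coordinates of every vertex of a cells_x-by-cells_y grid of cells.
    xs = [round(i * width / cells_x) for i in range(cells_x + 1)]
    ys = [round(j * height / cells_y) for j in range(cells_y + 1)]
    return [(x, y) for y in ys for x in xs]

# 6 vertical lines and 5 horizontal lines give 5 x 4 = 20 cells and 6 x 5 = 30 vertices,
# matching the example above.
vertices = grid_vertices(1920, 1080, cells_x=5, cells_y=4)
assert len(vertices) == 30
calibration_points = vertices          # or any subset containing more than one vertex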
S102, calculating the position of the target based on the corresponding relation between the pixel coordinates and the position of the calibration point and the pixel coordinates of the target.
Calculating the position of the target based on the pixel coordinates of the target may be regarded as mapping the pixel coordinates of the target to the position of the target. For the monitoring pictures shot by different cameras, the mapping relationship between the pixel coordinates of the target and the position of the target may be different because the configuration parameters and/or the installation method of the cameras may be different.
For a calibration point with a known position, because the calibration point also belongs to the monitoring picture, the correspondence between the pixel coordinates and the position of the calibration point also conforms to the mapping relationship between the pixel coordinates of the target and the position of the target. For the sake of discussion, assume that positions are expressed in the form of GPS coordinates and that, for this monitoring picture, the mapping relationship between pixel coordinates and GPS coordinates is a function f(x, y); that is, a point X in the monitoring picture with pixel coordinates (xX, yX) has the GPS coordinate f(xX, yX). Assume calibration points A, B, C and D, where calibration point A has pixel coordinates (xa, ya) and GPS coordinate GPSa, calibration point B has pixel coordinates (xb, yb) and GPS coordinate GPSb, calibration point C has pixel coordinates (xc, yc) and GPS coordinate GPSc, and calibration point D has pixel coordinates (xd, yd) and GPS coordinate GPSd. The following formulas are then satisfied:
GPSa=f(xa,ya)
GPSb=f(xb,yb)
GPSc=f(xc,yc)
GPSd=f(xd,yd)
That is, the function f(x, y) passes through the points (xa, ya, GPSa), (xb, yb, GPSb), (xc, yc, GPSc) and (xd, yd, GPSd). In an alternative embodiment, these four points may be used as sample points for fitting, and the fitted function is taken as the mapping relationship between pixel coordinates and GPS coordinates. Assuming the function obtained after fitting is g(x, y) and the pixel coordinates of the target in the monitoring picture are (xp, yp), the GPS coordinate GPSp of the target can be calculated by:
GPSp=g(xp,yp)
further, in other embodiments, other numbers of calibration points may be set in the monitoring picture according to actual requirements, and the larger the number of calibration points is, the more sample points are in the fitting process, and further the function g (x, y) obtained by fitting may be closer to the function f (x, y), and the closer the function g (x, y) is to the function f (x, y), the more accurate the calculated GPS coordinate of the target may be. It is to be understood that since the calibration points are preset, the fitting function g (x, y) may be performed before S102 or S101.
In another optional embodiment, the GPS coordinate GPSp of the target may also be obtained through interpolation calculation, and the interpolation method may be selected according to the number of calibration points and actual requirements, which will be described below by taking the case of four calibration points in this embodiment as an example.
Assuming that the target is located at a point P in the monitoring picture and, as shown in fig. 2, point P lies inside the quadrangle with the calibration points A, B, C and D as vertices (for the case where point P is not inside the quadrangle, GPSp may be obtained by interpolation on the same principle, although the accuracy may be lower), GPSp may be obtained by interpolation according to the following formula:
GPSp=Sa*GPSa+Sb*GPSb+Sc*GPSc+Sd*GPSd
where Sa, Sb, Sc and Sd are interpolation coefficients; Sa is inversely proportional to the distance from point P to calibration point A, Sb is inversely proportional to the distance from point P to calibration point B, Sc is inversely proportional to the distance from point P to calibration point C, and Sd is inversely proportional to the distance from point P to calibration point D, and they satisfy:
Sa+Sb+Sc+Sd=1
for example, assuming that the pixel coordinate of the calibration point a is (0,0), the pixel coordinate of the calibration point B is (1, 0), the pixel coordinate of the calibration point C is (1, 1), and the pixel coordinate of the calibration point D is (0,1), the GPSp may be obtained by interpolation according to the bilinear interpolation formula:
Figure BDA0001909162840000071
wherein xpIs the horizontal pixel coordinate of point P, ypIs a pointP vertical pixel coordinate.
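The bilinear case above can be written directly as the following Python sketch; the corner coordinates (0, 0), (1, 0), (1, 1), (0, 1) follow the example, and the GPS coordinates are assumed to be (latitude, longitude) pairs.

def bilinear_gps(xp, yp, gps_a, gps_b, gps_c, gps_d):
    # Interpolation coefficients of the four calibration points for a point
    # P = (xp, yp) inside the unit cell; they are non-negative and sum to 1.
    sa = (1 - xp) * (1 - yp)   # calibration point A at (0, 0)
    sb = xp * (1 - yp)         # calibration point B at (1, 0)
    sc = xp * yp               # calibration point C at (1, 1)
    sd = (1 - xp) * yp         # calibration point D at (0, 1)
    lat = sa * gps_a[0] + sb * gps_b[0] + sc * gps_c[0] + sd * gps_d[0]
    lon = sa * gps_a[1] + sb * gps_b[1] + sc * gps_c[1] + sd * gps_d[1]
    return lat, lon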
By adopting this embodiment, the target can be positioned based on the monitoring picture, that is, the target can be positioned without requiring the target to carry a terminal device. For example, to better manage a park, the people in the park need to be located; as a public place, the park has high mobility of people and the composition of people in it may be complex, so it is difficult to distribute terminal devices in a unified manner or to require all people entering the park to wear terminal devices, and it is therefore difficult to locate the people in the park through a positioning base station. With this embodiment, the people in the park can be located directly based on the monitoring picture.
To more clearly describe the target positioning method provided by the embodiment of the present invention, the following description is made with reference to the selection of the calibration point and the measurement of the calibration point position, and may refer to fig. 3, where the method includes:
s301, dividing the monitoring picture into a plurality of grids.
Regarding the grid division, reference may be made to the related description in S101, which is not described herein again.
S302, selecting a plurality of vertexes from the vertexes of the grids as calibration points.
S303, acquiring the pixel coordinates of the target in the monitoring picture.
The step is the same as S101, and reference may be made to the foregoing description about S101, which is not described herein again.
S304, the position of a specific point preset in the grid to which the pixel coordinates of the target belong is set as the position of the target.
The position of the specific point is calculated based on the correspondence between the pixel coordinates and the positions of the calibration points. According to actual requirements, different points in the grid can be taken as the specific point; for example, the center point of the grid can be taken as the specific point. Since the specific point is selected in advance, its position may be calculated in advance, or may be calculated after the grid to which the pixel coordinates of the target belong has been determined, which is not limited in this embodiment.
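A minimal sketch of S304 follows, assuming the specific point of each grid cell is its center and that the center positions have been precomputed from the calibration points; the cell sizes and the lookup table are illustrative assumptions.

def locate_by_grid(px, py, cell_w, cell_h, cell_positions):
    # cell_positions: {(row, col): precomputed position (e.g. GPS coordinate)
    # of that cell's specific point, such as its center}.
    col = int(px // cell_w)
    row = int(py // cell_h)
    return cell_positions[(row, col)]    # used as the position of the target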
The target positioning method provided by the embodiment of the present invention will be described below with reference to an entity device, and referring to fig. 4, fig. 4 is a schematic structural diagram of a target positioning system provided by the embodiment of the present invention, and the target positioning system may include a positioning server 410 and cameras 420, in this embodiment, the positioning server 410 may be connected to a plurality of cameras 420, and in other embodiments, the positioning server 410 may also be connected to only one camera 420. Different cameras 420 may be used to capture different monitoring scenes or may be used to capture the same monitoring scene from different angles. The monitoring picture of each camera 420 includes a plurality of calibration points with known positions, and the selection manner and the number of the calibration points in the monitoring pictures of different cameras 420 may be the same or different.
The working principle of the target positioning system can be seen in fig. 5, which comprises:
s501, after recognizing that the target exists in the monitoring picture, the camera sends the pixel coordinates of the target in the monitoring picture to a positioning server.
The camera can continuously shoot the monitoring scene and perform target recognition on the captured monitoring picture. For example, if the target is a person, face recognition may be performed on the captured monitoring picture; when a face is recognized in the monitoring picture, it is determined that a target exists in the monitoring picture, and the pixel coordinates of the area where the face is located are sent to the positioning server as the pixel coordinates of the target in the monitoring picture.
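As one possible realization of this step, the following sketch uses OpenCV's bundled Haar-cascade face detector as the target recognition algorithm; the patent does not specify a particular detector, so the choice of detector and the way a single representative pixel coordinate is derived from the face box are assumptions.

import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_target_pixel(frame_bgr):
    # Return one representative pixel coordinate (bottom center of the first
    # detected face box), or None if no target is found in the picture.
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    return (x + w // 2, y + h)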
Further, in an alternative embodiment, a camera number may be preset for each camera, and the camera sends the camera number of the camera to the positioning server after the camera captures a target in the monitoring picture. The camera number may be transmitted to the positioning server together with the pixel coordinate, or may be transmitted to the positioning server separately from the pixel coordinate. By adopting the embodiment, the positioning server can manage the information sent by different cameras conveniently.
In other embodiments, the camera may also send the captured monitoring picture to the positioning server, and the positioning server performs object recognition on the monitoring picture. Compared with the pixel coordinates, the data volume of the monitoring picture is large, and the sending of the monitoring picture may occupy a large bandwidth.
S502, after receiving the pixel coordinates of the target in the monitoring picture, the positioning server calculates the position of the target based on the corresponding relation between the pixel coordinates and the position of the calibration point of the camera and the pixel coordinates of the target.
The positioning server may pre-store a correspondence between pixel coordinates and positions of the calibration points of the cameras, read the pixel coordinates and positions of the calibration points of the cameras that send the pixel coordinates after receiving the pixel coordinates sent by the cameras, and calculate the position of the target based on the read correspondence between the pixel coordinates and positions and the received pixel coordinates. For the calculation of the position of the target, reference may be made to the relevant descriptions in S102 and S304, which are not described herein again.
Further, in an optional embodiment, if the position of the target is obtained by mapping the pixel coordinates of the target through a mapping equation, the positioning server may store in advance the mapping equation corresponding to each camera, read the mapping equation corresponding to the camera that sent the pixel coordinates after receiving them, and map the received pixel coordinates through that mapping equation to obtain the position of the target.
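The per-camera bookkeeping on the positioning server can be sketched as follows; the class and method names, the use of a camera number as the lookup key, and the in-memory dictionary are illustrative assumptions rather than the patent's prescribed implementation.

class PositioningServer:
    def __init__(self, mappings):
        # mappings: {camera number: mapping function g(x, y) obtained from that
        # camera's calibration points (by fitting or interpolation)}
        self.mappings = mappings

    def on_pixel_coordinates(self, camera_no, x, y):
        # Look up the mapping of the camera that sent the pixel coordinates and
        # map them to the position of the target.
        g = self.mappings[camera_no]
        return g(x, y)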
The following describes a target positioning method provided in an embodiment of the present invention, which uses a camera as an execution subject, and sets a plurality of calibration points for measuring positions in advance on a monitoring screen of the camera, as shown in fig. 6, including:
s601, shooting a monitoring scene by a camera to obtain a monitoring picture of the monitoring scene.
S602, the camera carries out target recognition on the monitoring picture so as to determine whether a target exists in the monitoring picture.
S603, if the target exists in the monitoring picture, acquiring the pixel coordinate of the target in the monitoring picture.
S604, calculating the position of the target based on the corresponding relation between the pixel coordinates and the position of the calibration point and the pixel coordinates of the target.
In an alternative embodiment, after the position of the target is calculated, the position of the target may be displayed in a preset map based on the calculated position of the target, so that the user can manage the target.
Referring to fig. 7, fig. 7 is a schematic flow chart of a target positioning method according to an embodiment of the present invention, which may include:
s701, acquiring pixel coordinates of a target in a monitoring picture, wherein the monitoring picture comprises a plurality of calibration points with known positions.
The step is the same as S101, and reference may be made to the foregoing description about S101, which is not described herein again.
S702, calculating the position of the target based on the corresponding relation between the pixel coordinates and the position of the calibration point and the pixel coordinates of the target.
The step is the same as S102, and reference may be made to the foregoing description about S102, which is not repeated herein.
And S703, acquiring the type of the target, and displaying the target on the map according to the position of the target and the type of the target.
The types to which targets belong may differ according to actual requirements and may include, for example, a human face, a human body, a vehicle, and the like. If the execution subject is the camera that shoots the monitoring picture, the type to which the target belongs can be obtained by performing target recognition on the monitoring picture; if the execution subject is not that camera, the monitoring picture can be obtained from the camera and target recognition performed on it to obtain the type to which the target belongs, or the type to which the target belongs can be obtained directly from the camera.
The manner of displaying the target on the map can be set according to actual requirements. For example, identifiers of different colors may be used in the map for targets of different types. For another example, a plurality of areas may be set on the map in advance, and the target may be displayed in the map in different ways according to the area in which the target is located.
For example, a water area, a fire occurrence area, a normal area, and the like may be set in the map, if the position of the target is in the water area, the target may be displayed in the map by a preset drowning warning icon, if the position of the target is in the fire occurrence area, the target may be displayed in the map by a preset fire warning icon, and if the position of the target is in the normal area, the target may be displayed in the map by a preset normal icon. In some alternative embodiments, in order to more intuitively show the target in the map, the captured image of the target may also be shown in the map.
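The area-based display rule can be sketched as follows, assuming rectangular areas defined by latitude/longitude bounds and placeholder icon names; the patent does not prescribe how areas or icons are represented.

AREAS = [
    # (area name, (min_lat, min_lon, max_lat, max_lon), icon shown for targets in the area)
    ("water area", (30.2735, 120.1551, 30.2741, 120.1556), "drowning warning icon"),
    ("fire occurrence area", (30.2732, 120.1556, 30.2735, 120.1562), "fire warning icon"),
]

def icon_for(position):
    # Choose the icon according to the area in which the target's position falls.
    lat, lon = position
    for _name, (lat0, lon0, lat1, lon1), icon in AREAS:
        if lat0 <= lat <= lat1 and lon0 <= lon <= lon1:
            return icon
    return "normal icon"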
Referring to fig. 8a, fig. 8a is a schematic structural diagram of an object locating apparatus according to an embodiment of the present invention, which may include:
a data obtaining module 801, configured to obtain pixel coordinates of a target in a monitoring picture, where the monitoring picture includes a plurality of calibration points with known positions;
a position determining module 802, configured to calculate a position of the target based on a corresponding relationship between the pixel coordinates and the position of the calibration point and the pixel coordinates of the target.
In an optional embodiment, the position determining module 802 is specifically configured to map the pixel coordinates of the target through a mapping equation to obtain the position of the target, where the mapping equation is obtained by fitting the corresponding relationship between the pixel coordinates and the position of the calibration point.
In an optional embodiment, the position determining module 802 is specifically configured to obtain, as the position of the target, a position corresponding to the pixel coordinate of the target through interpolation calculation based on a corresponding relationship between the pixel coordinate and the position of the calibration point.
In an alternative embodiment, the calibration points are selected by:
dividing a monitoring picture into a plurality of grids;
and selecting a plurality of vertexes from the vertexes of the grids as calibration points.
In an optional embodiment, the position determining module 802 is specifically configured to use a position of a predetermined specific point in a grid to which the pixel coordinate of the target belongs as the position of the target, where the position of the specific point is calculated based on a corresponding relationship between the pixel coordinate and the position of the calibration point.
In an alternative embodiment, referring to fig. 8b, the apparatus further includes an object recognition module 803, configured to recognize an object existing in the monitoring screen by using a preset object recognition algorithm before acquiring the pixel coordinates of the object in the monitoring screen.
In an optional embodiment, the apparatus further includes a target display module, configured to obtain a type to which the target belongs, and display the target on the map according to the position of the target and the type to which the target belongs.
An embodiment of the present invention further provides an electronic device, as shown in fig. 9, which may include:
a memory 901 for storing a computer program;
the processor 902, when executing the program stored in the memory 901, implements the following steps:
acquiring pixel coordinates of a target in a monitoring picture, wherein the monitoring picture comprises a plurality of calibration points with known positions;
and calculating the position of the target based on the corresponding relation between the pixel coordinates and the positions of the calibration points and the pixel coordinates of the target.
In an optional embodiment, the calculating the position of the target based on the correspondence between the pixel coordinates and the position of the calibration point and the pixel coordinates of the target includes:
and mapping the pixel coordinates of the target through a mapping equation to obtain the position of the target, wherein the mapping equation is obtained by fitting the corresponding relation between the pixel coordinates and the position of the calibration point.
In an optional embodiment, the calculating the position of the target based on the correspondence between the pixel coordinates and the position of the marked point and the pixel coordinates of the target includes:
and obtaining the position corresponding to the pixel coordinate of the target as the position of the target through interpolation calculation based on the corresponding relation between the pixel coordinate and the position of the calibration point.
In an alternative embodiment, the calibration points are selected by:
dividing a monitoring picture into a plurality of grids;
and selecting a plurality of vertexes from the vertexes of the grids as calibration points.
In an optional embodiment, the calculating the position of the target based on the correspondence between the pixel coordinates and the position of the calibration point and the pixel coordinates of the target includes:
and taking the position of a preset specific point in the grid to which the pixel coordinate of the target belongs as the position of the target, wherein the position of the specific point is calculated based on the corresponding relation between the pixel coordinate and the position of the calibration point.
In an optional embodiment, before acquiring the pixel coordinates of the target in the monitoring screen, the method further includes:
and identifying the target existing in the monitoring picture by using a preset target identification algorithm.
In an optional embodiment, the method further comprises:
and acquiring the type of the target, and displaying the target on the map according to the position of the target and the type of the target.
The Memory mentioned in the above electronic device may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; but may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component.
In yet another embodiment of the present invention, a computer-readable storage medium is further provided, which has instructions stored therein, and when the instructions are executed on a computer, the computer is caused to execute any one of the above-mentioned object positioning methods.
In a further embodiment of the present invention, there is also provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform any of the object localization methods of the above embodiments.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, cause the processes or functions described in accordance with the embodiments of the invention to occur, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website site, computer, server, or data center to another website site, computer, server, or data center via wired (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy Disk, hard Disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the apparatus, the electronic device, the computer-readable storage medium, and the computer program product embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiments.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (14)

1. A method of object localization, the method comprising:
acquiring pixel coordinates of a target in a monitoring picture, wherein the monitoring picture comprises a plurality of calibration points with known positions;
and calculating the position of the target based on the corresponding relation between the pixel coordinates and the positions of the calibration points and the pixel coordinates of the target.
2. The method according to claim 1, wherein calculating the position of the target based on the correspondence between the pixel coordinates and the position of the calibration point and the pixel coordinates of the target comprises:
and mapping the pixel coordinates of the target through a mapping equation to obtain the position of the target, wherein the mapping equation is obtained by fitting the corresponding relation between the pixel coordinates and the position of the calibration point.
3. The method according to claim 1, wherein calculating the position of the target based on the correspondence between the pixel coordinates and the position of the calibration point and the pixel coordinates of the target comprises:
and obtaining the position corresponding to the pixel coordinate of the target as the position of the target through interpolation calculation based on the corresponding relation between the pixel coordinate and the position of the calibration point.
4. The method of claim 1, wherein the calibration points are selected by:
dividing a monitoring picture into a plurality of grids;
and selecting a plurality of vertexes from the vertexes of the grids as calibration points.
5. The method according to claim 4, wherein calculating the position of the target based on the correspondence between the pixel coordinates and the position of the calibration point and the pixel coordinates of the target comprises:
and taking the position of a preset specific point in the grid to which the pixel coordinate of the target belongs as the position of the target, wherein the position of the specific point is calculated based on the corresponding relation between the pixel coordinate and the position of the calibration point.
6. The method of claim 1, wherein prior to obtaining the pixel coordinates of the target in the monitoring picture, the method further comprises:
and identifying the target existing in the monitoring picture by using a preset target identification algorithm.
7. The method of claim 1, further comprising:
and acquiring the type of the target, and displaying the target on a map according to the position of the target and the type of the target.
8. An object localization arrangement, characterized in that the arrangement comprises:
the system comprises a data acquisition module, a data acquisition module and a data processing module, wherein the data acquisition module is used for acquiring pixel coordinates of a target in a monitoring picture, and the monitoring picture comprises a plurality of calibration points with known positions;
and the position determining module is used for calculating the position of the target based on the corresponding relation between the pixel coordinates and the positions of the calibration points and the pixel coordinates of the target.
9. The apparatus according to claim 8, wherein the position determining module is specifically configured to map the pixel coordinates of the target by a mapping equation to obtain the position of the target, and the mapping equation is obtained by fitting a corresponding relationship between the pixel coordinates and the position of the calibration point.
10. The apparatus according to claim 8, wherein the position determining module is specifically configured to obtain, as the position of the target, a position corresponding to the pixel coordinate of the target by interpolation calculation based on a correspondence between the pixel coordinate and the position of the calibration point.
11. The apparatus of claim 8, wherein the calibration points are selected by:
dividing a monitoring picture into a plurality of grids;
and selecting a plurality of vertexes from the vertexes of the grids as calibration points.
12. The apparatus according to claim 11, wherein the position determining module is specifically configured to use, as the position of the target, a position of a specific point preset in a grid to which the pixel coordinate of the target belongs, where the position of the specific point is calculated based on a correspondence between the pixel coordinate and the position of the calibration point.
13. An electronic device, comprising:
a memory for storing a computer program;
a processor for implementing the method steps of any of claims 1 to 7 when executing a program stored in the memory.
14. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of claims 1 to 7.
CN201811545183.XA 2018-12-17 2018-12-17 Target positioning method and device and electronic equipment Pending CN111323006A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811545183.XA CN111323006A (en) 2018-12-17 2018-12-17 Target positioning method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811545183.XA CN111323006A (en) 2018-12-17 2018-12-17 Target positioning method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN111323006A true CN111323006A (en) 2020-06-23

Family

ID=71170653

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811545183.XA Pending CN111323006A (en) 2018-12-17 2018-12-17 Target positioning method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111323006A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103136270A (en) * 2011-12-01 2013-06-05 无锡物联网产业研究院 Method and system for obtaining data interpolation
CN103561465A (en) * 2013-10-28 2014-02-05 厦门雅迅网络股份有限公司 Multi-base-station positioning method based on probability fingerprints
CN104299236A (en) * 2014-10-20 2015-01-21 中国科学技术大学先进技术研究院 Target locating method based on scene calibration and interpolation combination
CN104574415A (en) * 2015-01-26 2015-04-29 南京邮电大学 Target space positioning method based on single camera
US20170091920A1 (en) * 2015-09-24 2017-03-30 Optim Corporation Information processing device, method of processing information, and program for processing information
CN107481283A (en) * 2017-08-01 2017-12-15 深圳市神州云海智能科技有限公司 A kind of robot localization method, apparatus and robot based on CCTV camera
CN108897028A (en) * 2018-05-11 2018-11-27 星络科技有限公司 Target object localization method, device, electronic equipment and readable storage medium storing program for executing

Similar Documents

Publication Publication Date Title
US11842516B2 (en) Homography through satellite image matching
CN110213488B (en) Positioning method and related equipment
US20200162724A1 (en) System and method for camera commissioning beacons
US20160169662A1 (en) Location-based facility management system using mobile device
CN111046121A (en) Environment monitoring method, device and system
CN110675448A (en) Ground light remote sensing monitoring method, system and storage medium based on civil aircraft
CN112950717A (en) Space calibration method and system
JP6917936B2 (en) Methods, devices and systems for mapping location detection to graphical representations
CN115004273A (en) Digital reconstruction method, device and system for traffic road
CN115334247B (en) Camera module calibration method, visual positioning method and device and electronic equipment
CN110188665B (en) Image processing method and device and computer equipment
CN114782555B (en) Map mapping method, apparatus, and storage medium
CN111277791B (en) Case event monitoring method and system
CN116152471A (en) Factory safety production supervision method and system based on video stream and electronic equipment
CN111323006A (en) Target positioning method and device and electronic equipment
CN112633143B (en) Image processing system, method, head-mounted device, processing device, and storage medium
CN115683046A (en) Distance measuring method, distance measuring device, sensor and computer readable storage medium
CN114943809A (en) Map model generation method and device and storage medium
CN110617800A (en) Emergency remote sensing monitoring method, system and storage medium based on civil aircraft
CN113538578B (en) Target positioning method, device, computer equipment and storage medium
CN117459688B (en) Camera angle marking method, device and medium based on map system
CN110413843B (en) Method and device for fusing video picture and image map
CN116912320B (en) Positioning method and device of object elevation coordinate, electronic equipment and medium
WO2023035296A1 (en) A camera calibration method
Szwoch et al. Spatial Calibration of a Dual PTZ‐Fixed Camera System for Tracking Moving Objects in Video

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200623