CN113804100A - Method, device, equipment and storage medium for determining space coordinates of target object

Info

Publication number
CN113804100A
Authority
CN
China
Prior art keywords: coordinates, target object, map, spatial, image
Legal status: Granted
Application number: CN202010527544.9A
Other languages: Chinese (zh)
Other versions: CN113804100B (en)
Inventors: 杨少鹏, 冷继南, 沈建惠, 常胜
Current Assignee: Huawei Technologies Co Ltd
Original Assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN202010527544.9A
Publication of CN113804100A
Application granted
Publication of CN113804100B
Status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G01C 21/28 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C 21/30 Map- or contour-matching
    • G01C 21/32 Structuring or formatting of map data

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Image Processing (AREA)

Abstract

Embodiments of the present disclosure provide a method, an apparatus, a device, and a storage medium for determining spatial coordinates of a target object, which relate to the field of spatial mapping. The method comprises acquiring an image recording a target object and a reference object in a spatial region. The method further includes obtaining map data of a map, the map data including spatial coordinates of the reference object. The method further includes determining a mapping relationship between the image and the map based on the pixel coordinates of the reference object in the image and the spatial coordinates of the reference object. The method further includes determining spatial coordinates of the target object based on the pixel coordinates of the target object in the image and the mapping relationship. In this way, the spatial coordinates of the target object can be determined efficiently and accurately without specialized mapping equipment, thereby greatly reducing costs.

Description

Method, device, equipment and storage medium for determining space coordinates of target object
Technical Field
Embodiments of the present disclosure relate to the field of spatial mapping, and more particularly, to a method, apparatus, device, and storage medium for determining spatial coordinates of a target object.
Background
In the field of intelligent transportation, for example, in order to facilitate fine management of transportation facilities, it is necessary to determine spatial coordinates of an object such as a transportation facility. In addition, in the construction process of the high-precision map, the spatial coordinates of the objects in the geographic environment also need to be known.
Currently, the spatial coordinates of objects are determined mainly by techniques such as lidar measurement or oblique photography. Lidar measurement reconstructs the spatial position and three-dimensional shape of an object with a lidar sensor. Oblique photography mounts multiple sensors on a single unmanned aerial vehicle and captures images of an object from one vertical angle and four oblique angles simultaneously to reconstruct its spatial position and three-dimensional shape. Although both technologies can acquire spatial coordinates with centimeter-level accuracy, they suffer from high equipment and labor costs and restricted usage, and they cannot conveniently and quickly re-acquire the spatial coordinates of objects in a changed spatial region after the spatial environment is locally changed (for example, new equipment is added, or lane lines on a road are updated).
Disclosure of Invention
Example embodiments of the present disclosure provide a solution for determining spatial coordinates of a target object.
In a first aspect of the present disclosure, a method of determining spatial coordinates of a target object is provided. The method comprises acquiring an image recording a target object and a reference object in a spatial region. The method further includes obtaining map data of a map, the map data including spatial coordinates of the reference object. The method further includes determining a mapping relationship between the image and the map based on the pixel coordinates of the reference object in the image and the spatial coordinates of the reference object. The method further includes determining spatial coordinates of the target object based on the pixel coordinates of the target object in the image and the mapping relationship. This solution can determine the spatial coordinates of the target object efficiently and accurately without specialized mapping equipment, thereby greatly reducing the cost.
In some embodiments, the method further comprises updating the spatial coordinates of the target object into the map data. In this way, the map data can be updated rapidly: when entities in a local spatial region change, there is no need to re-collect the map data of that region with map data acquisition equipment, which saves labor, material, and time costs.
In some embodiments, the target object comprises an entity newly added to the spatial region after the map data acquisition of the spatial region; the reference object includes an entity that exists in the spatial region prior to the map data acquisition of the spatial region. In this way, a fast response to environmental changes may be achieved.
In some embodiments, entities present in the spatial region prior to the map data acquisition for the spatial region may define a first plane on which newly added entities may be located.
In some embodiments, the newly added entity in the spatial region may include a newly installed device, such as an electric police camera or a traffic radar. In such embodiments, the entities present in the spatial region prior to the map data collection may include the structure on which the device is mounted, such as an electric police pole, a light pole, a billboard, or a traffic light pole.
In some embodiments, the newly added entity in the spatial region may include a newly planned lane line. In such embodiments, the entities present in the spatial region prior to the map data acquisition may include already-existing lane lines and the like.
In some embodiments, the target object comprises a first portion of a building in the spatial region, and the reference object comprises a second portion of the building; the second portion has been collected by the data collection device that collects the map data, while the first portion has not been collected by that device. In this way, the spatial coordinates of the upper portion of a high-rise building can be determined more conveniently and quickly.
In some embodiments, determining the mapping relationship between the image and the map comprises: determining the mapping relationship according to the pixel coordinates of a plurality of first control points on the reference object and the spatial coordinates of the corresponding points of the plurality of first control points in the map. The mapping relationship can be determined quickly from the pixel coordinates of the first control points and the spatial coordinates of the corresponding points.
In some embodiments, the plurality of first control points includes at least four first control points that lie on the first plane and are not collinear. With at least four first control points, the mapping relationship between the image and the map can be obtained more accurately.
In some embodiments, determining the mapping relationship between the image and the map comprises: determining first, second, and third statistical results of the spatial coordinates of the plurality of corresponding points in the three dimensions of a spatial coordinate system; and, if the first and second statistical results are both greater than the third, determining the mapping relationship between the image and the map based on the pixel coordinates of the plurality of first control points and the coordinates of the plurality of corresponding points in the dimensions corresponding to the first and second statistical results. This reduces the computational complexity of determining the mapping relationship.
In some embodiments, the statistical result comprises at least one of a variance and a length of the distribution interval.
In some embodiments, determining the spatial coordinates of the target object comprises: determining, based on the pixel coordinates of the target object and the mapping relationship, a first coordinate in the dimension corresponding to the first statistical result and a second coordinate in the dimension corresponding to the second statistical result among the spatial coordinates of the target object. In this way, the computational complexity of determining the above mapping relationship can be reduced.
In some embodiments, determining the mapping relationship between the image and the map comprises determining a homographic transformation relationship between pixel coordinates of the plurality of first control points and spatial coordinates of the plurality of corresponding points.
In a second aspect of the present disclosure, an apparatus for determining spatial coordinates of a target object is provided. The device comprises an acquisition unit, a mapping relation determination unit and a space coordinate determination unit. The acquisition unit is configured to acquire map data of an image and a map. The image records a target object and a reference object in a spatial region. The map data includes spatial coordinates of the reference object. The mapping relation determination unit is configured to determine a mapping relation between the image and the map according to pixel coordinates of the reference object in the image and spatial coordinates of the reference object. The spatial coordinate determination unit is configured to determine spatial coordinates of the target object based on pixel coordinates of the target object in the image and the mapping relationship.
In some embodiments, the apparatus further comprises an updating unit configured to update the spatial coordinates of the target object to the map data.
In some embodiments, the target object comprises an entity newly added to the spatial region after the map data acquisition of the spatial region; the reference object includes an entity that exists in the spatial region prior to the map data acquisition of the spatial region.
In some embodiments, the target object comprises a first portion of a building in the spatial region, and the reference object comprises a second portion of the building; the second portion has been collected by the data collection device that collects the map data, while the first portion has not been collected by that device.
In some embodiments, the mapping relation determining unit is further configured to determine the mapping relation according to pixel coordinates of the plurality of first control points on the reference object and spatial coordinates of corresponding points of the plurality of first control points in the map.
In a third aspect of the disclosure, a computing device is provided. The computing device includes a processor and a memory. The memory is configured to store program code, and the processor is configured to execute the program code in the memory to perform the method of the first aspect or of any implementation of the first aspect.
In a fourth aspect of the disclosure, a computer-readable storage medium is provided. The computer-readable storage medium stores a computer program. When executed by a processor, the computer program implements the functionality of determining spatial coordinates of a target object provided in the first aspect or any of its implementations.
In a fifth aspect of the disclosure, a computer program product is provided. The computer program product includes instructions. When executed by a computer, the instructions enable the computer to perform the method of determining spatial coordinates of a target object provided in the first aspect or any of its implementations.
Drawings
FIG. 1A illustrates a block diagram of an environment in which embodiments of the present disclosure can be implemented;
FIG. 1B illustrates a block diagram of an example software architecture of a computing device, in accordance with some embodiments of the present disclosure;
FIG. 2 illustrates a schematic diagram of an example image, according to some embodiments of the present disclosure;
FIG. 3 illustrates a schematic diagram of an example map, in accordance with some embodiments of the present disclosure;
FIG. 4 illustrates a flow diagram of a method of determining spatial coordinates of a target object according to some embodiments of the present disclosure;
FIG. 5 illustrates a flow diagram of a method of determining spatial coordinates of a target object according to further embodiments of the present disclosure;
FIG. 6 illustrates a flow diagram of a method of determining spatial coordinates of a target object according to further embodiments of the present disclosure;
FIG. 7 shows a schematic diagram of an example image, according to further embodiments of the present disclosure;
FIG. 8 shows a schematic block diagram of an apparatus 800 of an embodiment of the present disclosure; and
FIG. 9 illustrates a block diagram of a computing device 900 of an embodiment of the disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
The term "spatial scaling" as used herein refers to determining the correspondence of the position of the same object in different spaces. "homographic transformation" refers to a spatial positional transformation relationship between two central projection planes.
In describing embodiments of the present disclosure, the term "include" and its derivatives should be interpreted as open-ended, i.e., "including but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The terms "first", "second", and the like may refer to different objects or to the same object. Other explicit and implicit definitions may also be included below.
As previously mentioned, the traditional ways of determining the spatial coordinates of an object are mainly lidar measurement and oblique photography. Although lidar measurement can obtain spatial coordinates with centimeter-level accuracy, a lidar mapping vehicle is expensive, typically costing millions of RMB. The cost is especially unacceptable when lidar is used to survey small areas. In addition, the lidar mapping vehicle must travel to the location of the target object before surveying can begin, and long-distance operation may damage the sensors on the vehicle. Moreover, lidar mapping requires a surveying-and-mapping qualification, which only a few vendors currently possess. When the environment changes, for example when traffic sign lines are updated or traffic monitoring equipment is reinstalled, lidar measurement cannot respond quickly to the change.
Unmanned aerial vehicles based on oblique photography, although relatively inexpensive, require long acquisition times, resulting in high labor costs. Their use is also restricted: no-fly zones in cities prevent drones from operating, yet urban areas are precisely the key deployment areas for intelligent transportation. As a result, updating high-precision maps with oblique photography is very slow and cannot respond quickly to environmental changes.
At least in response to the above problems, among others, embodiments of the present disclosure propose a solution for determining the spatial coordinates of a target object. An image including a target object and a reference object (which defines the plane on which the target object is located) is acquired from an image capture apparatus, and a mapping relationship is determined between the pixel coordinates of a plurality of control points on the reference object and the spatial coordinates of the corresponding points of those control points in a map. The spatial coordinates of the target object in the map can then be determined from this mapping relationship. This solution can determine the spatial coordinates of the target object efficiently and accurately without professional mapping equipment such as lidar mapping vehicles or drones, greatly reducing costs.
FIG. 1A illustrates a block diagram of an environment 100 in which embodiments of the present disclosure can be implemented. As shown in FIG. 1A, environment 100 includes an image capture apparatus 110, a computing device 130, and a storage device 140.
The image capture device 110 may be any suitable device having image capture functionality. For example, the image capture device 110 may be an electronic device capable of taking pictures through a camera or webcam, such as a smart camera, a cell phone, a digital camera, and so forth. The image captured by the image capture device 110 records a target object and a reference object in a spatial region. The target object is any suitable object for which spatial coordinates need to be determined and the reference object may be any suitable object for determining the plane in which the target object lies. For example, in the case where the target object is a traffic monitoring device (e.g., an electric police camera, a traffic radar) added in a spatial area, the reference object may be an electric police pole, a light pole, a billboard, a traffic light pole, etc. in which the traffic monitoring device is installed. For another example, in the case where the target object is a traffic sign line, the reference object may be a traffic sign line around the traffic sign line. For another example, where the target object is a high-level portion of a building, the reference object may be a low-level portion of the building. The target object and the reference object will be described in detail below with particular reference to the embodiments of fig. 2 and 3.
The storage device 140 may be any suitable apparatus having a storage function. For example, the storage device 140 may be a Solid State Disk (SSD), a mechanical storage disk (HDD), a hybrid storage disk (SSHD), or other similar storage device.
The storage device 140 stores therein the map database 120. The map database 120 is a database based on digitized map data, including digital information files of elements of map content stored in a computer, a database management system, and other software and hardware. Various objects may be included in the map, such as control points, land features, land types, residential areas, hydrology, vegetation, transportation, territories, and the like. It should be understood that the above listed examples of objects are merely illustrative and not limiting, and that any suitable type and number of objects may be included in the map.
Computing device 130 may be any suitable electronic device having computing capabilities. In some embodiments, the computing device 130 may be deployed in a cloud environment. A cloud environment refers to a central cluster of computing devices owned by a cloud service provider for providing computing, storage, and communication resources. For example, the computing device 130 may be a central server in a cloud environment. In other embodiments, the computing device 130 may be deployed in an edge environment. An edge environment refers to a cluster of edge computing devices geographically close to the image capture device for providing computing, storage, and communication resources. For example, computing device 130 may be an edge computing device in an edge environment. The edge computing device may be, for example, a server. In other embodiments, the computing device 130 may be a desktop computer, a laptop computer, a tablet computer, a Personal Digital Assistant (PDA), a smart terminal, etc. with significant processing capabilities.
The computing device 130 may acquire the image 111 from the image capture apparatus 110. The image 111 records a target object and a reference object in a spatial region. The computing device 130 may determine the pixel coordinates of various points in the image 111, such as the pixel coordinates of the target object and the pixel coordinates of points on the reference object. An image is composed of pixels, and pixel coordinates refer to the position of a pixel within the image.
Optionally, the computing device 130 may also obtain the image 111 from other storage or a user interface. In some embodiments, the target object and the reference object in the image 111 have been selected manually or automatically. Further, in some embodiments, in the image 111, the pixel coordinates of the control points on the reference object have been annotated. Similarly, in some embodiments, the pixel coordinates of the target object have also been annotated in the image 111.
The computing device 130 may also obtain a map 121 from the map database 120 that matches the image 111. The map 121 may be, for example, a satellite map, a live-action map, or the like. The map 121 may include map data and/or other suitable map elements or information. The map data may include, for example, spatial position data (hereinafter also referred to as "spatial coordinates") of respective points on the map 121, and the like. From the pixel coordinates of the reference object in the image 111 and the spatial coordinates of the reference object in the map 121, the computing device 130 may determine a mapping relationship between the image 111 and the map 121. Then, the computing device 130 determines the spatial coordinates of the target object in the map 121 based on the pixel coordinates of the target object and the mapping relationship.
The spatial coordinates of a point typically include the coordinates of the point in three dimensions of space, which may be represented, for example, in the form $(x, y, z)$. Specifically, if the spatial coordinates of a point are $(x_0, y_0, z_0)$, then $x_0$ can be considered the first coordinate of the point's spatial coordinates, $y_0$ the second coordinate, and $z_0$ the third coordinate. It should be understood that the above example is merely illustrative and not restrictive; in some other embodiments, $y_0$ or $z_0$ may equally serve as the first coordinate of the point's spatial coordinates.
In various embodiments of the present disclosure, a mapping relationship between the image and the map is determined from the pixel coordinates and the spatial coordinates of the reference object, and the spatial coordinates of the target object in the map are then determined based on that mapping relationship. Professional mapping equipment such as lidar mapping vehicles and drones is therefore not needed, and the cost is greatly reduced.
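Before walking through the details, the end-to-end flow can be sketched in a few lines of Python. This is an illustration only, not the patented implementation; the input arrays stand in for the manual or automatic annotation steps described in this disclosure, and the OpenCV calls are one possible realization of the steps discussed later.

```python
import numpy as np
import cv2


def determine_target_coordinates(pixel_pts, space_pts, target_uv):
    """Illustrative sketch of the disclosed flow (not the patented implementation).

    pixel_pts: (N, 2) pixel coordinates of the first control points (N >= 4,
               non-collinear), obtained by manual or automatic annotation.
    space_pts: (N, 3) spatial coordinates of the corresponding points, from map data.
    target_uv: (u, v) pixel coordinates of the target object in the image.
    """
    pixel_pts = np.asarray(pixel_pts, dtype=np.float64)
    space_pts = np.asarray(space_pts, dtype=np.float64)

    # Keep the two spatial dimensions whose values vary most (see method 500 below).
    dims = np.argsort(space_pts.var(axis=0))[-2:]

    # Mapping relationship between image and map (equation (1) below).
    H, _ = cv2.findHomography(pixel_pts, space_pts[:, dims])

    # Spatial coordinates of the target object in the two kept dimensions.
    src = np.array([[target_uv]], dtype=np.float64)  # shape (1, 1, 2)
    first_second = cv2.perspectiveTransform(src, H)[0, 0]
    return dims, first_second  # the third coordinate is recovered by method 600
```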
Additionally, in some embodiments, the computing device 130 may update the spatial coordinates of the target object to the map data, thereby obtaining an updated map 131. Further, the computing device 130 stores the updated map 131 to the map database 120. In this way, a fast update of the map can be achieved.
It should be understood that the map database 120 and the computing device 130 are shown in FIG. 1A as two separate entities for purposes of illustration only. In some embodiments, the map database 120 may be stored in the computing device 130.
Fig. 1B illustrates a block diagram of an example software architecture of a computing device 130, according to some embodiments of the present disclosure. As shown, the computing device 130 includes an acquisition unit 132, a mapping relationship determination unit 134, and a spatial coordinate determination unit 136.
The acquisition unit 132 is configured to acquire the image 111. The image 111 records a target object and a reference object in a spatial area. In some embodiments, the acquisition unit 132 may acquire the image 111 from the image capture device 110. In other embodiments, the obtaining unit 132 may obtain the image 111 uploaded by the user through a network. In still other embodiments, the acquisition unit 132 may acquire the image 111 from a storage device (e.g., storage device 140).
The acquisition unit 132 is also configured to acquire the map 121. The map 121 includes map data including spatial coordinates of a reference object therein. In some embodiments, the obtaining unit 132 may obtain the map 121 from the map database 120.
In some embodiments, the computing device 130 may further include a storage unit (not shown) configured to store the image 111 and the map 121. In such an embodiment, the acquisition unit 132 may acquire the image 111 and the map 121 from the storage unit.
The mapping relation determining unit 134 is configured to determine the mapping relation between the image 111 and the map 121 according to the pixel coordinates of the reference object in the image 111 and the spatial coordinates of the reference object.
The spatial coordinate determination unit 136 is configured to determine the spatial coordinates of the target object based on the pixel coordinates and the mapping relationship of the target object in the image 111.
Fig. 2 illustrates an example of an image 111 captured by the image capture device 110, according to some embodiments of the present disclosure. In this example, the image 111 is an image of a traffic intersection. However, it should be understood that this is for illustrative purposes only. In other embodiments, the image 111 may be an image of any suitable spatial region that contains the target object.
As shown in fig. 2, an electric police pole 210 is disposed at the traffic intersection. Traffic monitoring devices 220, 221, and 222 are mounted on the electric police pole 210. In the embodiment shown in fig. 2, traffic monitoring devices 220 and 222 are, for example, electric police cameras, and traffic monitoring device 221 is, for example, a traffic radar. One or more of the traffic monitoring devices 220, 221, and 222 may be reinstalled at a different location, and traffic monitoring equipment may also be added to the electric police pole 210. In these cases, the spatial coordinates of the newly installed or newly added traffic monitoring device need to be determined. Hereinafter, determining the spatial coordinates of the traffic monitoring device 222 is described as an example. In this example, the target object is the newly added traffic monitoring device 222 in the spatial region of the traffic intersection, and the reference object is the electric police pole 210. Thus, in the following, the traffic monitoring device 222 is used interchangeably with the target object 222, and the electric police pole 210 is used interchangeably with the reference object 210.
In some embodiments, the reference object 210 defines a plane (hereinafter also referred to as a "first plane") in which the target object 222 lies. In this example, the first plane defined by the reference object 210 is perpendicular to the ground. However, it should be understood that this is for illustrative purposes only; in other examples, the reference object may define any suitable plane. The control points 211, 212, 213, 214, and 215 are located on the reference object 210. "Control points", also referred to as marker points, are coordinate points that are easily distinguishable and whose locations are unambiguous. Hereinafter, the control points 211, 212, 213, 214, and 215 on the reference object 210 are also referred to as first control points 211, 212, 213, 214, and 215. In the context of the present disclosure, because the target object 222 occupies a small space, it may be treated as a single point.
Fig. 3 illustrates an example of a map 121 according to some embodiments of the present disclosure. The map 121 is a map that matches the image 111 shown in fig. 2. By way of example only, the map 121 is shown in fig. 3 as a live-view map. However, it should be understood that the map 121 may take any other suitable form. The computing device 130 can determine the map 121 in a variety of ways; for example, the map 121 matching the image 111 can be found by a jump path from the road network, to the traffic intersection, to the electric police pole 210. Alternatively, the computing device 130 may retrieve the map 121 matching the image 111 by a keyword for the traffic intersection. It should be understood that this is merely an example; the computing device 130 may determine the map 121 that matches the image 111 in any suitable manner.
In fig. 3, the corresponding points of the plurality of first control points in the map 121 are denoted by reference numerals 311, 312, 313, 314, and 315.
In some embodiments, the first control points 211, 212, 213, 214, and 215 on the image 111 and the corresponding points 311, 312, 313, 314, and 315 in the map 121 may be annotated by way of manual annotation, thereby determining pixel coordinates of the plurality of first control points and the plurality of corresponding points. In other embodiments, the first control points 211, 212, 213, 214, and 215 on the image 111 and the corresponding points 311, 312, 313, 314, and 315 on the map 121 may be automatically marked by algorithms such as feature matching in computer vision, point cloud registration, and the like, to determine the pixel coordinates of the first control points and the corresponding points. It will be appreciated that any other suitable way of labelling and determining pixel coordinates is possible.
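The passage above names feature matching only generically; as one hedged illustration of what automatic annotation could look like, ORB features from OpenCV might be matched between the image and a rendered map view. ORB and the brute-force matcher below are assumptions for this sketch, not the disclosed method.

```python
import cv2


def match_candidate_control_points(image, map_view, top_k=10):
    """Hedged sketch: candidate control-point pairs via ORB feature matching.

    ORB and BFMatcher are illustrative choices; the disclosure refers to
    feature matching and point cloud registration only in general terms.
    """
    orb = cv2.ORB_create()
    kp_img, des_img = orb.detectAndCompute(image, None)
    kp_map, des_map = orb.detectAndCompute(map_view, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_img, des_map), key=lambda m: m.distance)
    # The best matches become candidate (image pixel, map pixel) control-point pairs.
    return [(kp_img[m.queryIdx].pt, kp_map[m.trainIdx].pt)
            for m in matches[:top_k]]
```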
Fig. 4 illustrates a flow diagram of an example method 400 in accordance with some embodiments of the present disclosure. The method 400 may be performed by the computing device 130 in fig. 1A. For ease of discussion, the method 400 will be described with reference to fig. 1A, 2, and 3. It should be understood that method 400 may also include additional steps not shown and/or may omit steps shown, as the scope of the disclosure is not limited in this respect.
At block 410, the computing device 130 acquires the image 111. The image 111 records a target object 222 and a reference object 210 in a spatial region. In some embodiments of the present disclosure, the image capture device 110 is used to capture images, without specialized mapping equipment such as lidar mapping vehicles or drones, thereby substantially reducing costs. In addition, using the image capture device 110 requires no special qualification and is not limited by conditions such as no-fly zones, acquisition time, or weather, so a quick response to environmental changes can be achieved.
At block 420, the computing device 130 obtains map data for the map 121 that includes the spatial coordinates of the reference object 210.
At block 430, computing device 130 determines a mapping relationship between image 111 and map 121 based on the pixel coordinates of reference object 210 in image 111 and the spatial coordinates of reference object 210.
In some embodiments, computing device 130 may determine the mapping relationship between image 111 and map 121 based on the pixel coordinates of the plurality of first control points on reference object 210 and the spatial coordinates of the corresponding points in map 121. It is to be appreciated that the computing device 130 may determine the mapping relationship between the image 111 and the map 121 in any other suitable manner. The scope of the present disclosure is not limited in this respect.
As previously described, the pixel coordinates of the plurality of first control points in the image 111 may be determined manually or using algorithms such as feature matching, point cloud registration, and the like. For example, in the example shown in fig. 2, the plurality of first control points includes control points 211, 212, 213, 214, and 215. The pixel coordinates of the control points 211, 212, 213, 214, and 215 may be determined manually or using algorithms such as feature matching, point cloud registration, and the like. Of course, the pixel coordinates of the plurality of first control points may be determined in any suitable manner.
As also previously described, the map 121 includes map data including at least spatial coordinates of respective points on the map 121. Accordingly, based on the map data of the map 121, the computing device 130 may determine the spatial coordinates of a plurality of corresponding points on the map 121. For example, in the example shown in fig. 3, computing device 130 may determine spatial coordinates of the plurality of corresponding points 311, 312, 313, 314, and 315 based on the map data.
In some embodiments, to reduce computational complexity, the computing device 130 may select, from the spatial coordinates of the plurality of corresponding points, the coordinates in the two dimensions whose values vary most. The computing device 130 may then determine the mapping relationship based on the pixel coordinates of the plurality of first control points and the coordinates of the plurality of corresponding points in those two dimensions. For example, in the example shown in FIG. 3, the X-axis, Y-axis, and Z-axis define the coordinate system used by the map 121. In some embodiments, the coordinate system is the World Geodetic System 1984 (WGS84) coordinate system. In this coordinate system, the origin is located at the Earth's center of mass, the Z-axis points toward the Conventional Terrestrial Pole (CTP) defined by BIH 1984.0 (Bureau International de l'Heure), the X-axis points toward the intersection of the BIH 1984.0 zero meridian plane and the CTP equator, and the Y-axis is determined by the right-hand rule. It should be understood that any other suitable coordinate system may also be used, such as the Mars coordinate system (the GCJ-02 coordinate system) or the Baidu coordinate system (the BD-09 coordinate system).
With continued reference to the example shown in fig. 3, when the electric police pole 210 is oriented along the east direction, the y-coordinates and z-coordinates of points on the pole vary greatly while their x-coordinates vary little, so the computing device 130 can select the y-coordinates and z-coordinates of the points on the electric police pole 210. When the electric police pole 210 is oriented along the north direction, the x-coordinates and z-coordinates of points on the pole vary greatly while their y-coordinates vary little, so the computing device 130 can select the x-coordinates and z-coordinates of the points on the electric police pole 210.
In some embodiments, to select the coordinates in the two dimensions with the larger variation from the spatial coordinates of the plurality of corresponding points, and then determine the mapping relationship based on the pixel coordinates of the plurality of first control points and those coordinates, the computing device 130 may execute the method 500 shown in fig. 5. Method 500 may be considered one specific implementation of the actions at block 430 described above; those actions may also be put into practice in any other suitable manner. For ease of discussion, the method 500 will be described with reference to fig. 1A, 2, and 3.
At block 510, a first statistical result, a second statistical result, and a third statistical result of the spatial coordinates of the plurality of corresponding points in three dimensions of the spatial coordinate system are determined.
In some embodiments, the statistical result comprises a variance. For example, in the example shown in fig. 3, the computing device 130 may determine a first variance of the y-coordinate, a second variance of the z-coordinate, and a third variance of the x-coordinate for the plurality of corresponding points 311, 312, 313, 314, and 315.
In other embodiments, the statistical result includes the length of a distribution interval. In such embodiments, the computing device 130 determines the lengths of the first, second, and third distribution intervals of the plurality of corresponding points in the three dimensions of the spatial coordinate system. For example, in the example shown in fig. 3, computing device 130 may determine that the y-coordinates of the plurality of corresponding points 311, 312, 313, 314, and 315 are distributed over the interval from $y_1$ to $y_5$ (where $y_5 > y_1$), i.e., the length of the distribution interval of the y-coordinates is $y_5 - y_1$; that the z-coordinates are distributed from $z_1$ to $z_5$ (where $z_5 > z_1$), i.e., the length of the distribution interval of the z-coordinates is $z_5 - z_1$; and that the x-coordinates are distributed from $x_1$ to $x_5$ (where $x_5 > x_1$), i.e., the length of the distribution interval of the x-coordinates is $x_5 - x_1$.
It should be understood that the distribution intervals of the spatial coordinates of the plurality of corresponding points described above are merely examples. Any suitable distribution interval may be employed depending on the particular application scenario. The scope of the present disclosure is not limited in this respect.
At block 520, the computing device 130 compares the first, second, and third statistical results to determine whether the first and second statistical results are both greater than the third statistical result. For example, in embodiments where the statistic is the length of the distribution interval, the computing device 130 compares the length of the distribution interval of the y-coordinates ($y_5 - y_1$), the length of the distribution interval of the z-coordinates ($z_5 - z_1$), and the length of the distribution interval of the x-coordinates ($x_5 - x_1$).
If, at block 520, the computing device 130 determines that the first statistical result and the second statistical result are both greater than the third statistical result, it means that the magnitude of the change in the value of the coordinate in the dimension corresponding to the first statistical result and the second statistical result in the spatial coordinates of the plurality of corresponding points is both greater than the magnitude of the change in the value of the coordinate in the dimension corresponding to the third statistical result. In this case, at block 530, the computing device 130 determines the above-described mapping relationship based on the pixel coordinates of the plurality of first control points and the coordinates in the dimension corresponding to the first statistical result and the second statistical result of the plurality of corresponding points.
For example, in embodiments where the statistic is the length of the distribution interval, if the computing device 130 determines that $y_5 - y_1$ and $z_5 - z_1$ are both greater than $x_5 - x_1$, the computing device 130 determines that the lengths of the distribution intervals of the y-coordinates and the z-coordinates of the plurality of corresponding points are both greater than that of the x-coordinates. The computing device 130 accordingly determines the above mapping relationship based on the pixel coordinates of the plurality of first control points and the coordinates of the plurality of corresponding points in the dimensions corresponding to those two interval lengths; in other words, based on the pixel coordinates of the plurality of first control points and the y-coordinates and z-coordinates of the plurality of corresponding points.
On the other hand, if at block 520 the computing device 130 determines that the first statistical result and the second statistical result are not both greater than the third statistical result, at block 540 the computing device 130 determines the mapping based on the pixel coordinates of the plurality of first control points and the coordinates of the plurality of corresponding points in the dimension corresponding to the first statistical result and the third statistical result, or determines the mapping based on the pixel coordinates of the plurality of first control points and the coordinates of the plurality of corresponding points in the dimension corresponding to the second statistical result and the third statistical result.
Specifically, if at block 520 the computing device 130 determines that the first statistical result and the third statistical result are both greater than the second statistical result, it means that the magnitude of the change in the value of the coordinate in the dimension corresponding to the first statistical result and the third statistical result in the spatial coordinates of the plurality of corresponding points is both greater than the magnitude of the change in the value of the coordinate in the dimension corresponding to the second statistical result. In this case, at block 540, the computing device 130 determines the above-described mapping relationship based on the pixel coordinates of the plurality of first control points and the coordinates in the dimension of the plurality of corresponding points corresponding to the first statistical result and the third statistical result.
For example, in embodiments where the statistic is the length of the distribution interval, if the computing device 130 determines that $y_5 - y_1$ and $x_5 - x_1$ are both greater than $z_5 - z_1$, the computing device 130 determines that the lengths of the distribution intervals of the y-coordinates and the x-coordinates are both greater than that of the z-coordinates, and accordingly determines the mapping relationship based on the pixel coordinates of the plurality of first control points and the y-coordinates and x-coordinates of the plurality of corresponding points.
If, at block 520, the computing device 130 determines that the second statistical result and the third statistical result are both greater than the first statistical result, it means that the magnitude of the change in the value of the coordinate in the dimension corresponding to the second statistical result and the third statistical result in the spatial coordinates of the plurality of corresponding points is both greater than the magnitude of the change in the value of the coordinate in the dimension corresponding to the first statistical result. In this case, at block 540, the computing device 130 determines the above-described mapping relationship based on the pixel coordinates of the plurality of first control points and the coordinates in the dimension of the plurality of corresponding points corresponding to the second statistical result and the third statistical result.
For example, in embodiments where the statistic is the length of the distribution interval, if the computing device 130 determines that $z_5 - z_1$ and $x_5 - x_1$ are both greater than $y_5 - y_1$, the computing device 130 determines that the lengths of the distribution intervals of the z-coordinates and the x-coordinates are both greater than that of the y-coordinates, and accordingly determines the mapping relationship based on the pixel coordinates of the plurality of first control points and the z-coordinates and x-coordinates of the plurality of corresponding points.
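A minimal sketch of blocks 510 through 540 follows, using variance as the statistic (the interval-length statistic is shown as a commented alternative). The helper below is illustrative, not part of the disclosure.

```python
import numpy as np


def select_dominant_dimensions(corresponding_pts):
    """Return (the two dimensions with the largest statistic, the remaining one).

    corresponding_pts: (N, 3) array of (x, y, z) spatial coordinates of the
    corresponding points. The dropped dimension is recovered later (method 600).
    """
    stats = corresponding_pts.var(axis=0)  # first/second/third statistical results
    # Alternative statistic: length of the distribution interval
    # stats = corresponding_pts.max(axis=0) - corresponding_pts.min(axis=0)
    order = np.argsort(stats)
    return order[1:], order[0]
```

For the east-oriented electric police pole in fig. 3, this would keep the y and z dimensions and drop x.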
In some embodiments, the mapping relationship comprises a homographic transformation relationship. For the purpose of explanation, how to determine the above mapping relationship will be described below with a homographic transformation relationship as an example. It should be appreciated that the mapping between image 111 and map 121 may be any suitable mapping.
Consider the following example. In this example, assume that the pixel coordinates of a first control point on the reference object 210 are represented by (u, v), and the spatial coordinates of the corresponding point of that first control point on the map 121 are represented by (x, y, z). Further assume that the computing apparatus 130 has determined that the first coordinate (e.g., the y-coordinate) and the second coordinate (e.g., the z-coordinate) among the spatial coordinates of the corresponding points vary most. In this case, the pixel coordinates (u, v) of the first control point and the y- and z-coordinates of the corresponding point satisfy the following relationship:
$$ s \begin{bmatrix} y \\ z \\ 1 \end{bmatrix} = H \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \tag{1} $$

where $s$ is a non-zero scale factor and $H$ denotes the homographic transformation matrix. Specifically, $H$ is a 3 × 3 matrix and can be represented as follows:
$$ H = \begin{bmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & h_{33} \end{bmatrix} $$
h may be used in calculating the homographic transformation matrix H33Set to 1 or change the mode of the homographic transformation matrix H to 1. In this case, the degree of freedom of H is eight, and therefore the homographic transformation matrix H can be calculated by selecting at least four pairs of corresponding points that are located on the same plane and are not collinear on the reference object 210 and on the map 121. In other words, at least four first control points are selected on the reference object 210, which lie on the first plane and are not collinear. It should be understood that the values of the various parameters of the homographic transformation matrix described above are merely examples and are not limiting. The various parameters in the homographic transformation matrix may take any suitable values depending on the particular application scenario.
In addition, in some application scenarios, the annotated position of a control point may deviate by several pixels, and control points may even be mismatched. If only four pairs of corresponding points are used to calculate the homography transformation matrix H, a large error may result. Therefore, to make the calculation more accurate, more pairs of corresponding points may be selected for calculating the homographic transformation matrix H. For example, in the example of fig. 2, five first control points 211, 212, 213, 214, and 215 are selected; accordingly, in the example of fig. 3, the five corresponding points 311, 312, 313, 314, and 315 are selected. It should be understood that although fig. 2 and 3 each show five control points, this is for exemplary purposes only and is not intended to be limiting. Any suitable number of control points may be placed on the reference object and the map, according to actual needs.
In some embodiments, the homographic transformation matrix H may be calculated by least-squares fitting, singular value decomposition, the Levenberg-Marquardt (LM) algorithm, or the like. Alternatively, the homographic transformation matrix H may be calculated through the findHomography() interface of the open-source computer vision library OpenCV. The homography transformation matrix H may be calculated using any suitable means now known or developed in the future, and the scope of the present disclosure is not limited in this respect.
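Using the findHomography() interface named above, the computation might look as follows. This is a sketch under the assumption that the corresponding points have already been reduced to their two dominant dimensions.

```python
import numpy as np
import cv2


def compute_mapping(pixel_pts, space_2d):
    """Estimate H from >= 4 non-collinear control-point pairs.

    pixel_pts: (N, 2) pixel coordinates (u, v) of the first control points.
    space_2d:  (N, 2) corresponding-point coordinates in the two dominant
               spatial dimensions (e.g. y and z).
    """
    pixel_pts = np.asarray(pixel_pts, dtype=np.float64)
    space_2d = np.asarray(space_2d, dtype=np.float64)
    # The default method (0) is a least-squares estimate over all pairs, which
    # benefits from the extra pairs recommended above when annotations deviate
    # by a few pixels; passing cv2.RANSAC instead would reject mismatches.
    H, mask = cv2.findHomography(pixel_pts, space_2d)
    return H
```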
Returning to fig. 4. After determining the mapping relationship between the image 111 and the map 121, the computing device 130 determines the spatial coordinates of the target object 222 based on the pixel coordinates of the target object 222 in the image 111 and the mapping relationship at block 440.
In various embodiments of the present disclosure, unlike with lidar measurement and oblique photography, the entire area containing the reference object need not be rescanned and remapped in order to determine the spatial coordinates of the target object in the map. The time required to determine the spatial coordinates of the target object can thus be greatly shortened. Furthermore, since lidar measurement and oblique photography need not be employed, performing the method according to embodiments of the present disclosure does not depend on a professional operator.
In an embodiment in which the mapping relationship is determined based on the pixel coordinates of the plurality of first control points and the coordinates in the two dimensions in which the magnitude of change in the numerical values of the plurality of corresponding points is large, the computing device 130 may determine the coordinates in the two dimensions in which the magnitude of change in the numerical values is large in the spatial coordinates of the target object 222 based on the pixel coordinates of the target object 222 and the mapping relationship. For example, in an embodiment in which it is determined that the first statistical result and the second statistical result are both greater than the third statistical result, the computing device 130 may determine, based on the pixel coordinates of the target object 222 and the mapping relationship, a first coordinate in a dimension corresponding to the first statistical result and a second coordinate in a dimension corresponding to the second statistical result of the target object 222.
For example, in the above-described embodiment in which the mapping relationship is a homographic transformation relationship, the computing device 130 may determine the first coordinates and the second coordinates of the target object 222 based on the pixel coordinates of the target object 222 and the homographic transformation matrix H. For example, the computing device 130 may determine a first coordinate (e.g., y-coordinate) and a second coordinate (e.g., z-coordinate) of the target object 222 using equation (1) above.
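Expressed directly in terms of equation (1), determining the first and second coordinates of the target object is a single matrix-vector product followed by dividing out the scale factor. A minimal sketch:

```python
import numpy as np


def target_first_second_coords(H, u, v):
    """Equation (1): s * [y, z, 1]^T = H @ [u, v, 1]^T, solved for (y, z)."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]  # divide by the scale factor s
```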
In some embodiments, the spatial coordinates of the target object 222 further include a third coordinate in a dimension corresponding to the third statistical result. The spatial coordinates of the plurality of corresponding points include a first coordinate in a dimension corresponding to the first statistical result and a third coordinate in a dimension corresponding to the third statistical result of the plurality of corresponding points. In some embodiments, after determining the first and second coordinates of the target object 222, the computing device 130 may determine the third coordinates of the target object 222 by performing the method 600 illustrated in fig. 6. For ease of discussion, the method 600 will be described with reference to fig. 1A, 2, and 3.
At block 610, the computing device 130 selects at least two corresponding points from the plurality of corresponding points. The selected at least two corresponding points are collinear with the target object 222 in the map 121. For example, in the example shown in fig. 3, since the corresponding points 311, 312, 313 are collinear with the target object 222 in the map 121, the computing device 130 may select the corresponding points 311, 312, 313 from the corresponding points 311, 312, 313, 314, and 315.
In some embodiments, to make the calculation of the third coordinate of the target object 222 more accurate, the computing device 130 may select the coordinate in the dimension having a larger magnitude of change in value from the first coordinate and the second coordinate of the selected at least two corresponding points. Further, the computing device 130 may determine the third coordinates of the target object 222 based on the coordinates in the selected dimension of the at least two corresponding points, the third coordinates of the at least two corresponding points, and the coordinates of the target object 222 in the selected dimension.
In some embodiments, to select the coordinates in the dimension whose values vary more from the first and second coordinates of the selected at least two corresponding points, the computing device 130 may adopt a manner similar to that described above for selecting, from the spatial coordinates of the plurality of corresponding points, the coordinates in the two dimensions whose values vary most.
Specifically, at block 620, the computing device 130 compares the first statistical result and the second statistical result of the spatial coordinates of the at least two corresponding points. If, at block 620, the computing device 130 determines that the first statistical result is greater than the second statistical result, at block 630, the computing device 130 determines a third coordinate of the target object 222 based on the first and third coordinates of the at least two corresponding points and the first coordinate of the target object 222.
For example, the computing device 130 may compare a first variance of the first coordinates and a second variance of the second coordinates of the at least two corresponding points. If the first variance is greater than the second variance, the computing device 130 may determine that the values of the first coordinates of the at least two corresponding points vary more than the values of their second coordinates, and accordingly selects the first coordinates of the at least two corresponding points.
Alternatively, the computing device 130 may compare the length of the first distribution interval with the length of the second distribution interval for the at least two corresponding points. If the length of the first distribution interval is greater than the length of the second distribution interval, the computing device 130 may likewise determine that the values of the first coordinates vary more than those of the second coordinates, and selects the first coordinates of the at least two corresponding points.
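Both criteria admit a compact sketch; assuming the coordinates of the selected points are held in arrays, one illustrative (non-normative) formulation is:

import numpy as np

def pick_varying_dimension(first, second):
    # Return the coordinate array whose values vary more, judged here
    # by variance; np.ptp (max - min) would realize the alternative
    # distribution-interval criterion instead.
    first = np.asarray(first, dtype=float)
    second = np.asarray(second, dtype=float)
    return first if np.var(first) > np.var(second) else second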
Consider the example shown in fig. 3. Assume that the computing device 130 has selected the corresponding points 311, 312, 313 and has determined the y- and z-coordinates of the target object 222. When the electric pole 210 is positioned in the east direction, the magnitude of change in the y-coordinates of the corresponding points 311, 312, 313 is greater than that in their z-coordinates. In this case, the computing device 130 may select the y-coordinates of the corresponding points 311, 312, 313. Further, the computing device 130 may determine the x-coordinate of the target object 222 based on the y- and x-coordinates of the corresponding points 311, 312, 313 and the y-coordinate of the target object 222.
As previously described, the selected at least two corresponding points are substantially collinear with the target object 222 in the map 121. Therefore, the first and third coordinates of each of the at least two corresponding points and of the target object 222 satisfy a linear relationship. Since the first and third coordinates of the at least two corresponding points are known, the computing device 130 may determine this linear relationship from them, and may then determine the third coordinate of the target object 222 based on the linear relationship and the first coordinate of the target object 222.
In some embodiments, to determine the linear relationship described above, the computing device 130 may fit a straight-line equation based on the first and third coordinates of each of the at least two corresponding points. Any suitable manner now known or developed in the future may be used to fit the line equation, and the scope of the present disclosure is not limited in this respect.
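By way of illustration only, a least-squares straight-line fit is one such manner; the helper name and the sample numbers below are assumptions for this sketch, not values taken from the disclosure:

import numpy as np

def third_coordinate(dim_coords, third_coords, target_dim_coord):
    # Fit third = k * dim + b through the corresponding points, then
    # evaluate the fitted line at the target object's known coordinate.
    k, b = np.polyfit(dim_coords, third_coords, deg=1)
    return k * target_dim_coord + b

# Hypothetical numbers for three corresponding points on the pole:
y_coords = [12.0, 12.4, 12.8]  # coordinates in the selected (y) dimension
x_coords = [3.1, 3.5, 3.9]     # third (x) coordinates of the same points
x_target = third_coordinate(y_coords, x_coords, 12.6)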
It is to be appreciated that the method of determining spatial coordinates of a target object according to various embodiments of the present disclosure may be performed in parallel for a plurality of target objects associated with a reference object after those target objects are installed, rather than separately for each target object upon its installation and commissioning. This improves the efficiency of determining the spatial coordinates of the target objects and greatly reduces the engineering difficulty.
A scenario in which an embodiment of the present disclosure is applied to determine the spatial coordinates of a newly added traffic monitoring device 222 in a spatial region has been described above with reference to fig. 2. In addition, embodiments of the present disclosure may be applied to scenarios in which the spatial coordinates of a high-rise portion of a high-rise building surface are determined, as will be described in detail below in connection with fig. 7.
Fig. 7 shows a schematic diagram of an example image 700, according to further embodiments of the present disclosure. In this example, the image 700 is an image of a building surface.
As shown in fig. 7, the image 700 includes a high-level portion 710 (also referred to as a first portion) and a low-level portion 720 (also referred to as a second portion) of the building surface. The high-level portion 710 includes an "XXX" icon. In this example, the target object is the high-level portion 710 and the reference object is the low-level portion 720.
To determine the spatial coordinates of the high-level portion 710, the computing device 130 may acquire the image 700 from the image capture device 110, and may acquire, from the map database 120, map data of a map (not shown) that matches the image 700. The map data includes the spatial coordinates of the low-level portion 720. The computing device 130 determines the mapping relationship between the image 700 and the map based on the pixel coordinates of the low-level portion 720 in the image 700 and the spatial coordinates of the low-level portion 720. In some embodiments, the computing device 130 may determine this mapping relationship by performing the method 500 described above. The details of the method 500 are not repeated herein.
After determining the mapping relationship, the computing device 130 determines the spatial coordinates of the high-level portion 710 based on the pixel coordinates of the high-level portion 710 in the image 700 and the mapping relationship. For example, in an embodiment where the mapping relationship is a homographic transformation relationship, the computing device 130 may determine the first and second coordinates of the high-level portion 710 using equation (1) above. Further, the computing device 130 may determine the third coordinate of the high-level portion 710 by performing the method 600 described above. The details of the method 600 are not repeated herein.
It should be appreciated that the process of determining the spatial coordinates of the high-level portion 710 including the "XXX" icon has been described above by way of example only. Embodiments of the present disclosure may equally be applied to determining the spatial coordinates of any location on a high-rise portion of a high-rise building surface.
It will be appreciated that it is currently difficult to determine the spatial coordinates of high-rise portions of the surface of a high-rise building when performing map data acquisition by, for example, lidar measurement and oblique photography techniques. With embodiments of the present disclosure, however, these spatial coordinates can be determined more conveniently and quickly.
Fig. 8 shows a schematic block diagram of an apparatus 800 provided by the present disclosure. As shown, the apparatus 800 includes an acquisition unit 132, a mapping relationship determination unit 134, and a spatial coordinate determination unit 136. It is understood that in some embodiments, the apparatus 800 may be the aforementioned computing device 130 of fig. 2 or be a part of the computing device 130.
The acquisition unit 132 is configured to acquire an image. The image records a target object and a reference object in a spatial region. In some embodiments, the acquisition unit 132 may acquire the image from the image capture device 110. In other embodiments, the acquisition unit 132 may acquire an image uploaded by a user over a network. In still other embodiments, the acquisition unit 132 may acquire the image from a storage device (e.g., the storage device 140 shown in fig. 1A).
The acquisition unit 132 is also configured to acquire map data of a map, the map data including the spatial coordinates of the reference object. In some embodiments, the acquisition unit 132 may acquire the map data from the map database 120.
In some embodiments, the computing device 130 may further include a storage unit (not shown) configured to store the image and the map. In such an embodiment, the acquisition unit 132 may acquire the image and the map from the storage unit.
The mapping relationship determination unit 134 is configured to determine the mapping relationship between the image and the map according to the pixel coordinates of the reference object in the image and the spatial coordinates of the reference object.
The spatial coordinate determination unit 136 is configured to determine the spatial coordinates of the target object based on the pixel coordinates of the target object in the image and the mapping relationship.
In some embodiments, the apparatus 800 further comprises an updating unit configured to update the spatial coordinates of the target object to the map data.
In some embodiments, the target object includes an entity newly added in the spatial region after map data acquisition is performed on the spatial region, and the reference object includes an entity that exists in the spatial region before map data acquisition is performed on the spatial region.
In some embodiments, the target object comprises a first portion of a building in the spatial region, and the reference object comprises a second portion of the building; the second portion is collected by a data collection device that collects the map data, and the first portion is not collected by the data collection device.
In some embodiments, the mapping relationship determination unit 134 is further configured to determine the mapping relationship between the image and the map according to the pixel coordinates of a plurality of first control points on the reference object and the spatial coordinates of corresponding points of the plurality of first control points in the map.
The above units can transmit data to each other through a communication path. It should be understood that each unit included in the apparatus 800 may be a software unit, a hardware unit, or partly a software unit and partly a hardware unit.
Fig. 9 illustrates a block diagram of a computing device 900 capable of implementing embodiments of the present disclosure. As shown in fig. 9, the computing device 900 includes a processor 910, a communication interface 920, and a memory 930, which are interconnected via an internal bus 940. It should be understood that the computing device 900 may be a computing device in a cloud environment or a computing device in an edge environment.
The processor 910 may include one or more general-purpose processors, such as a Central Processing Unit (CPU), or a combination of a CPU and hardware chips. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or a combination thereof. The PLD may be a Complex Programmable Logic Device (CPLD), a field-programmable gate array (FPGA), a General Array Logic (GAL), or any combination thereof.
The bus 940 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus 940 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in fig. 9, but this does not mean that there is only one bus or only one type of bus.
Memory 930 may include volatile memory (volatile memory), such as Random Access Memory (RAM); the memory 930 may also include a non-volatile memory (non-volatile memory), such as a read-only memory (ROM), a flash memory (flash memory), a Hard Disk Drive (HDD), or a solid-state drive (SSD); the memory 930 may also include combinations of the above.
It should be noted that the memory 930 of the computing device 900 stores code corresponding to each unit of the apparatus 800, and the processor 910 executes this code to implement the functions of the units of the apparatus 800, that is, to perform the method described at blocks 410 to 440.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions according to the above-described embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device, such as a server or a data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the discussion above, these should not be construed as limitations on the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (12)

1. A method of determining spatial coordinates of a target object, comprising:
acquiring an image recording a target object and a reference object in a spatial region;
obtaining map data of a map, wherein the map data comprises the spatial coordinates of the reference object;
determining a mapping relationship between the image and the map according to the pixel coordinates of the reference object in the image and the spatial coordinates of the reference object; and
determining the spatial coordinates of the target object based on the pixel coordinates of the target object in the image and the mapping relationship.
2. The method of claim 1, further comprising:
updating the spatial coordinates of the target object to the map data.
3. The method according to claim 1 or 2,
the target object includes: an entity newly added in the spatial region after map data acquisition is performed on the spatial region;
the reference object includes: an entity that exists in the spatial region before map data acquisition is performed on the spatial region.
4. The method according to claim 1 or 2,
the target object comprises a first portion of a building in the spatial region, and the reference object comprises a second portion of the building;
the second portion is collected by a data collection device that collects the map data, and the first portion is not collected by the data collection device.
5. The method of any of claims 1-4, wherein determining the mapping relationship between the image and the map comprises:
determining the mapping relationship according to the pixel coordinates of a plurality of first control points on the reference object and the spatial coordinates of corresponding points of the plurality of first control points in the map.
6. An apparatus for determining spatial coordinates of a target object, comprising:
an acquisition unit configured to acquire an image recording a target object and a reference object in a spatial region, and to acquire map data of a map, the map data including the spatial coordinates of the reference object;
a mapping relationship determination unit configured to determine a mapping relationship between the image and the map according to the pixel coordinates of the reference object in the image and the spatial coordinates of the reference object; and
a spatial coordinate determination unit configured to determine the spatial coordinates of the target object based on the pixel coordinates of the target object in the image and the mapping relationship.
7. The apparatus of claim 6, further comprising:
an updating unit configured to update the spatial coordinates of the target object to the map data.
8. The apparatus according to claim 6 or 7,
the target object includes: an entity newly added in the spatial region after map data acquisition is performed on the spatial region;
the reference object includes: an entity that exists in the spatial region before map data acquisition is performed on the spatial region.
9. The apparatus according to claim 6 or 7,
the target object comprises a first portion of a building in the spatial region, and the reference object comprises a second portion of the building;
the second portion is collected by a data collection device that collects the map data, and the first portion is not collected by the data collection device.
10. The apparatus according to any of claims 6-9, wherein the mapping relationship determination unit is further configured to:
determine the mapping relationship according to the pixel coordinates of a plurality of first control points on the reference object and the spatial coordinates of corresponding points of the plurality of first control points in the map.
11. A computing device, comprising a memory and a processor, wherein the processor executes computer instructions stored in the memory to cause the computing device to perform the method of any one of claims 1-5.
12. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, performs the method of any one of claims 1-5.
CN202010527544.9A 2020-06-11 2020-06-11 Method, device, equipment and storage medium for determining space coordinates of target object Active CN113804100B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010527544.9A CN113804100B (en) 2020-06-11 2020-06-11 Method, device, equipment and storage medium for determining space coordinates of target object

Publications (2)

Publication Number Publication Date
CN113804100A 2021-12-17
CN113804100B CN113804100B (en) 2023-02-10

Family

ID=78943722

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010527544.9A Active CN113804100B (en) 2020-06-11 2020-06-11 Method, device, equipment and storage medium for determining space coordinates of target object

Country Status (1)

Country Link
CN (1) CN113804100B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114838704A (en) * 2022-04-28 2022-08-02 杭州海康威视系统技术有限公司 Height detection method and device and computer readable storage medium
CN114926523A (en) * 2022-05-06 2022-08-19 杭州海康威视系统技术有限公司 Building height measuring method and equipment
WO2023231425A1 (en) * 2022-05-31 2023-12-07 中兴通讯股份有限公司 Positioning method, electronic device, storage medium and program product

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101620671A (en) * 2009-08-14 2010-01-06 华中科技大学 Method for indirectly positioning and identifying three-dimensional buildings by using riverway landmarks
EP3242108A1 (en) * 2016-05-06 2017-11-08 Canon Kabushiki Kaisha Information processing apparatus, information processing method, program, system, and article manufacturing method
CN108921894A (en) * 2018-06-08 2018-11-30 百度在线网络技术(北京)有限公司 Object positioning method, device, equipment and computer readable storage medium
CN110119698A (en) * 2019-04-29 2019-08-13 北京百度网讯科技有限公司 For determining the method, apparatus, equipment and storage medium of Obj State
CN110378965A (en) * 2019-05-21 2019-10-25 北京百度网讯科技有限公司 Determine the method, apparatus, equipment and storage medium of coordinate system conversion parameter
CN110926453A (en) * 2019-11-05 2020-03-27 杭州博信智联科技有限公司 Obstacle positioning method and system
CN111046762A * 2019-11-29 2020-04-21 腾讯科技(深圳)有限公司 Object positioning method, device, electronic equipment and storage medium
CN111105461A (en) * 2019-12-27 2020-05-05 万翼科技有限公司 Positioning apparatus, positioning method based on spatial model, and readable storage medium

Also Published As

Publication number Publication date
CN113804100B (en) 2023-02-10

Similar Documents

Publication Publication Date Title
US11105638B2 (en) Method, apparatus, and computer readable storage medium for updating electronic map
CN108319655B (en) Method and device for generating grid map
CN113804100B (en) Method, device, equipment and storage medium for determining space coordinates of target object
CN108279670B (en) Method, apparatus and computer readable medium for adjusting point cloud data acquisition trajectory
CN112069856A (en) Map generation method, driving control method, device, electronic equipment and system
US8437501B1 (en) Using image and laser constraints to obtain consistent and improved pose estimates in vehicle pose databases
KR102052114B1 (en) Object change detection system for high definition electronic map upgrade and method thereof
Wang et al. Automated road sign inventory system based on stereo vision and tracking
WO2010006254A2 (en) System and methods for dynamically generating earth position data for overhead images and derived information
CN109931950B (en) Live-action navigation method, system and terminal equipment
CN111986214B (en) Construction method of pedestrian crossing in map and electronic equipment
Tang et al. Surveying, geomatics, and 3D reconstruction
CN114820769A (en) Vehicle positioning method and device, computer equipment, storage medium and vehicle
WO2022099620A1 (en) Three-dimensional point cloud segmentation method and apparatus, and mobile platform
CN112381873A (en) Data labeling method and device
CN115830073A (en) Map element reconstruction method, map element reconstruction device, computer equipment and storage medium
CN115797310A (en) Method for determining inclination angle of photovoltaic power station group string and electronic equipment
CN113160406B (en) Road three-dimensional reconstruction method and device, storage medium and electronic equipment
Wu Photogrammetry for 3D mapping in Urban Areas
CN115147549A (en) Urban three-dimensional model generation and updating method based on multi-source data fusion
CN110930455B (en) Positioning method, positioning device, terminal equipment and storage medium
Liang et al. Efficient match pair selection for matching large-scale oblique UAV images using spatial priors
KR20220062709A (en) System for detecting disaster situation by clustering of spatial information based an image of a mobile device and method therefor
CN116007637B (en) Positioning device, method, in-vehicle apparatus, vehicle, and computer program product
CN113124816A (en) Antenna work parameter generation method and device, storage medium and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant