CN112037316A - Mapping generation method and device and road side equipment - Google Patents

Mapping generation method and device and road side equipment

Info

Publication number
CN112037316A
CN112037316A (application CN202011001881.0A; granted as CN112037316B)
Authority
CN
China
Prior art keywords
mapping relation
pixel point
world coordinate
relation table
new world
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011001881.0A
Other languages
Chinese (zh)
Other versions
CN112037316B (en)
Inventor
贾金让
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apollo Zhilian Beijing Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202011001881.0A
Publication of CN112037316A
Application granted
Publication of CN112037316B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G06T3/4007 Interpolation-based scaling, e.g. bilinear interpolation
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/04 Indexing scheme for image data processing or generation, in general involving 3D image data

Abstract

The application discloses a mapping generation method and apparatus, relating to the technical fields of image processing and intelligent transportation. A specific implementation comprises the following steps: acquiring discrete points of a lane line in a high-precision map and projecting the discrete points onto an image captured by a roadside camera; generating a first mapping relation table indicating the mapping relation between the coordinates of the pixel points onto which the discrete points are projected and the world coordinates of the discrete points; performing an interpolation step on the pixel points in the first mapping relation table; and projecting the new world coordinates onto the image to generate a second mapping relation table indicating the mapping relation between the coordinates of the pixel points onto which the new world coordinates are projected and the new world coordinates. By establishing a mapping relation between pixel coordinates in the image and world coordinates, the method and apparatus make it possible to determine quickly the world coordinates corresponding to obstacles in the image. Moreover, because interpolation uses only the pixel points within a preset radius, the data in the mapping relation table become denser while the accuracy of the interpolation result is preserved.

Description

Mapping generation method and device and road side equipment
Technical Field
The present application relates to the field of computer technology, in particular to image processing and intelligent transportation, and more specifically to a method and an apparatus for generating a mapping.
Background
In the image of a monocular or binocular camera, the position of an obstacle is represented by two-dimensional pixel points. In three-dimensional real-world space, the same position is represented by corresponding world coordinates.
In the related art, determining the world coordinates that correspond to pixel points in an image generally requires a binocular camera, which imposes a high device cost. Alternatively, the world coordinates can be computed in real time as the intersection of the ray through a pixel with the ground plane, but this method places high demands on the flatness of the ground.
Disclosure of Invention
Provided are a mapping generation method, a mapping generation apparatus, a roadside device, an electronic device, and a storage medium.
According to a first aspect, there is provided a method for generating a mapping, comprising: acquiring discrete points of a lane line in a high-precision map and projecting the discrete points onto an image captured by a roadside camera; generating a first mapping relation table indicating the mapping relation between the coordinates of the pixel points onto which the discrete points are projected and the world coordinates of the discrete points; taking the first mapping relation table as a target mapping relation table and, for each pixel point of the target mapping relation table, performing an interpolation step on the world coordinate corresponding to the pixel point using a preset radius: in response to determining that pixel points other than the given pixel point exist within the preset radius centered on it, interpolating the world coordinate corresponding to the pixel point with the world coordinates corresponding to those other pixel points to obtain a new world coordinate; and projecting the new world coordinate onto the image and generating a second mapping relation table indicating the mapping relation between the coordinate of the pixel point onto which the new world coordinate is projected and the new world coordinate.
According to a second aspect, there is provided an apparatus for generating a mapping, comprising: an acquisition unit configured to acquire discrete points of a lane line in a high-precision map and project the discrete points onto an image captured by a roadside camera; a first generation unit configured to generate a first mapping relation table indicating the mapping relation between the coordinates of the pixel points onto which the discrete points are projected and the world coordinates of the discrete points; a first interpolation unit configured to take the first mapping relation table as a target mapping relation table and, for each pixel point of the target mapping relation table, perform an interpolation step on the world coordinate corresponding to the pixel point using a preset radius: in response to determining that pixel points other than the given pixel point exist within the preset radius centered on it, interpolating the world coordinate corresponding to the pixel point with the world coordinates corresponding to those other pixel points to obtain a new world coordinate; and a second generation unit configured to project the new world coordinate onto the image and generate a second mapping relation table indicating the mapping relation between the coordinate of the pixel point onto which the new world coordinate is projected and the new world coordinate.
According to a third aspect, there is provided an electronic device comprising: one or more processors; a storage device for storing one or more programs which, when executed by one or more processors, cause the one or more processors to implement a method as in any embodiment of the method for generating a map.
According to a fourth aspect, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method as in any one of the embodiments of the method of generating a map.
According to a fifth aspect, there is provided a roadside apparatus comprising the electronic apparatus as previously provided.
According to the scheme of the application, neither a binocular camera nor a data-collection vehicle is needed to gather world coordinate data; the world coordinates corresponding to obstacles in the image can be determined quickly by establishing a mapping relation between pixel coordinates in the image and world coordinates. Moreover, interpolating with the pixel points within a preset radius ensures the accuracy of the interpolation result, and the interpolation step makes the data in the mapping relation table denser.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram to which some embodiments of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method of generating a map according to the present application;
FIG. 3 is a schematic diagram of an application scenario of a method of generating a map according to the present application;
FIG. 4a is a flow diagram of yet another embodiment of a method of generating a map according to the present application;
FIG. 4b is a schematic diagram of yet another application scenario of a generation method of a map according to the present application;
FIG. 5 is a schematic block diagram of one embodiment of a map generation apparatus according to the present application;
fig. 6 is a block diagram of an electronic device for implementing a generation method of a map according to an embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments to aid understanding; these details are to be considered exemplary only. Those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present application. Descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 shows an exemplary system architecture 100 to which an embodiment of the map generation method or map generation apparatus of the present application may be applied, which relates to vehicle-road coordination of intelligent transportation.
As shown in fig. 1, the system architecture 100 may include an in-vehicle system (i.e., an on-board brain) 101, a server 102, a roadside sensing device (e.g., a roadside camera) 103, a roadside computing device (e.g., a roadside computing unit, RSCU) 104, and a network 105. The network 105 provides a medium for communication links among the in-vehicle system 101, the server 102, the roadside sensing device 103, and the roadside computing device 104. The network 105 may include various connection types, such as wired links, wireless communication links, or fiber-optic cables.
The user may use the in-vehicle system 101 to interact with the server 102 over the network 105 to receive or send messages or the like. Various communication client applications, such as navigation applications, live applications, instant messaging tools, mailbox clients, social platform software, and the like, may be installed on the in-vehicle system 101.
The in-vehicle system 101 may be hardware or software. When it is hardware, it may be any of various electronic devices with a display screen, including but not limited to smartphones, tablet computers, e-book readers, laptop computers, and desktop computers. When it is software, it may be installed in the electronic devices listed above and implemented either as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. No specific limitation is imposed here.
The server 102 may be a server that provides various services, such as a backend server that provides support for the in-vehicle system 101, the roadside sensing devices 103, and/or the roadside computing devices 104. The background server may analyze and perform other processing on the received data such as coordinates of the target pixel point of the obstacle, and feed back a processing result (for example, a world coordinate point cloud of the obstacle) to the terminal device. The server 102 may be a cloud control platform, a vehicle-road cooperative management platform, a central subsystem, an edge computing platform, a cloud computing platform, and the like.
The roadside computing device 104 may be connected to the roadside sensing device 103 and acquire an image captured by the roadside sensing device 103. The roadside computing devices 104 may be connected with the server 102. In another system architecture, the roadside sensing device 103 itself includes a computing function, and the roadside sensing device 103 may be directly connected to the server 102. The connection to the server 102 here may be a wired or wireless connection.
It should be noted that the mapping generation method provided in the embodiments of the present application may be executed by various roadside devices, the server (or cloud control platform) 102, or the in-vehicle system 101; correspondingly, the mapping generation apparatus may be disposed in any of these roadside devices, the server 102, or the in-vehicle system 101. The roadside device here may be, for example, a roadside sensing device 103 with a computing function, a roadside computing device 104 connected to the roadside sensing device, a server 102 connected to the roadside computing device 104, or a server 102 directly connected to the roadside sensing device 103.
It should be understood that the number of on-board systems, roadside cameras, roadside computing devices, networks, and servers in FIG. 1 is merely illustrative. There may be any number of each, as required by the implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method of generating a map in accordance with the present application is shown. The generation method of the mapping comprises the following steps:
step 201, obtaining discrete points of a lane line in a high-precision map, and projecting the discrete points to an image shot by a roadside camera.
In the present embodiment, an execution subject of the mapping generation method (for example, a server or an in-vehicle system shown in fig. 1) may acquire discrete points of a lane line in a high-precision map and project the discrete points into an image captured by a roadside camera. Specifically, the discrete points of the lane lines in the high-precision map may include discrete points of lane lines drawn on the actual road surface, and may also include discrete points of lane lines not drawn on the road surface, such as those of a turn-around lane, that is, the lane a vehicle follows when making a U-turn. The roadside camera here may be one whose pose is fixed.
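The projection in this step can be illustrated with a standard pinhole camera model. The following Python sketch is for illustration only, since the patent does not specify the projection formula; `project_points`, `K` (camera intrinsics), `R` and `t` (the fixed extrinsic pose of the roadside camera) are hypothetical names:

```python
import numpy as np

def project_points(world_pts, K, R, t):
    """Project Nx3 world coordinates to pixel coordinates with a pinhole model."""
    cam = (R @ world_pts.T + t.reshape(3, 1)).T   # world frame -> camera frame
    uv = (K @ cam.T).T                            # homogeneous image coordinates
    return uv[:, :2] / uv[:, 2:3]                 # perspective divide
```

Because the roadside camera's pose is fixed, `R` and `t` need to be calibrated only once.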
Step 202, a first mapping relation table indicating the mapping relation between the coordinates of the pixel point projected by the discrete point and the world coordinates of the discrete point is generated.
In this embodiment, the execution subject may generate a first mapping relationship table, where the mapping relationship table may indicate a mapping relationship between coordinates of a pixel point onto which the discrete point is projected and world coordinates of the discrete point.
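A minimal realization of such a table is a dictionary keyed by integer pixel coordinates. The names here (`build_mapping_table` and its parameters) are assumptions, not from the patent; the sketch simply rounds projected coordinates and keeps only points that land inside the image:

```python
def build_mapping_table(world_pts, pixel_pts, width, height):
    """Map integer pixel coordinates to the world coordinates that project onto them."""
    table = {}
    for world, (u, v) in zip(world_pts, pixel_pts):
        ui, vi = int(round(u)), int(round(v))
        if 0 <= ui < width and 0 <= vi < height:   # discard projections outside the frame
            table[(ui, vi)] = tuple(world)
    return table
```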
Step 203, taking the first mapping relation table as a target mapping relation table and, for each pixel point of the target mapping relation table, performing an interpolation step on the world coordinate corresponding to the pixel point using a preset radius: in response to determining that pixel points other than the given pixel point exist within the preset radius centered on it, interpolating the world coordinate corresponding to the pixel point with the world coordinates corresponding to those other pixel points to obtain a new world coordinate.
In this embodiment, the executing body may take the first mapping relation table as the target mapping relation table and traverse its pixel points, performing the interpolation step on the world coordinate corresponding to each pixel point using the preset radius. The interpolation step here may comprise: in response to determining that pixel points other than the given pixel point exist within the preset radius centered on it, interpolating the world coordinate corresponding to the pixel point with the world coordinates corresponding to those other pixel points to obtain a new world coordinate.
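One possible reading of this interpolation step is sketched below. The patent does not give the exact interpolation formula, so the simple averaging used here is an assumption for illustration:

```python
def interpolate_step(table, radius):
    """For each mapped pixel, blend its world coordinate with neighbours within `radius`."""
    new_coords = []
    pixels = list(table.keys())
    for (u, v) in pixels:
        neighbours = [table[(nu, nv)] for (nu, nv) in pixels
                      if (nu, nv) != (u, v)
                      and (nu - u) ** 2 + (nv - v) ** 2 <= radius ** 2]
        if neighbours:  # other pixel points exist within the preset radius
            pts = [table[(u, v)]] + neighbours
            new_coords.append(tuple(sum(c) / len(pts) for c in zip(*pts)))
    return new_coords
```

The returned world coordinates would then be projected back onto the image to build the second mapping relation table.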
And step 204, projecting the new world coordinate to the image, and generating a second mapping relation table indicating the mapping relation between the coordinate of the pixel point projected by the new world coordinate and the new world coordinate.
In this embodiment, the executing entity may project a new world coordinate into the image, so as to generate a second mapping relationship table, where the second mapping relationship table may be used to represent a mapping relationship between a coordinate of a pixel point onto which the new world coordinate is projected and the new world coordinate.
In practice, the roadside camera may be a monocular camera. Using the second mapping relation table, the execution subject or another electronic device may determine the world coordinates of an obstacle in an image captured by that camera. For example, the execution subject may detect the position (coordinate position) of an obstacle in the image, determine the coordinates of a reference point such as the bottom-centre point, and look up the target world coordinate corresponding to that point in the second mapping relation table. The position of the obstacle and the target world coordinate are then used together to determine the world coordinates of the whole obstacle in the world coordinate system.
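The obstacle lookup described above might be sketched as follows, assuming an axis-aligned bounding box `(x, y, w, h)` from a detector; all names here are illustrative, not from the patent:

```python
def obstacle_world_coord(table, bbox):
    """Look up the world coordinate for a detected obstacle's bottom-centre point."""
    x, y, w, h = bbox
    anchor = (int(x + w / 2), int(y + h))   # bottom-centre pixel of the bounding box
    return table.get(anchor)                # None if the table has no entry there
```

In practice a nearest-neighbour fallback would likely be needed when the bottom-centre pixel has no exact entry in the table.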
According to the method provided by this embodiment of the application, neither a binocular camera nor a data-collection vehicle is needed to gather world coordinate data; the world coordinates corresponding to obstacles in the image can be determined quickly by establishing a mapping relation between pixel coordinates in the image and world coordinates. Moreover, interpolating with the pixel points within a preset radius ensures the accuracy of the interpolation result, and the interpolation step makes the data in the mapping relation table denser.
In some optional implementations of this embodiment, the method may further include: taking the second mapping relation table as the target mapping relation table and, for each pixel point of the target mapping relation table, performing the interpolation step on the world coordinate corresponding to the pixel point using the preset radius; and projecting the new world coordinates generated by this interpolation step onto the image, and adding the mapping relations between the coordinates of the pixel points onto which the new world coordinates are projected and the new world coordinates into the second mapping relation table.
In these alternative implementations, the execution subject may execute the interpolation step on the second mapping relation table to make the data in the second mapping relation table denser through interpolation processing. The interpolation step here is the interpolation step in step 203.
Optionally, the preset radius may comprise at least two preset radii. In that case, for each pixel point of the target mapping relation table, performing the interpolation step with the preset radius may comprise: applying each of the at least two preset radii to the world coordinate corresponding to the pixel point, in increasing order of radius.
Specifically, for each pixel point the executing body may apply each of the at least two radii in order from smallest to largest. For example, the at least two preset radii may be 2, 3, 4, and 5, where each number denotes a distance measured in pixels. For a given pixel point, the executing body first performs the interpolation step on the corresponding world coordinate with radius 2, then repeats it with radius 3, and then with radii 4 and 5 in turn.
The execution main body can perform interpolation processing on points in different radius ranges, so that more comprehensive and dense world coordinates can be obtained for pixel points in the image.
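The multi-radius traversal can be sketched as a loop over the sorted radii; the averaging inside `step` is again an assumed interpolation formula, and all names are illustrative:

```python
def multi_radius_pass(table, radii=(2, 3, 4, 5)):
    """Run the interpolation step once per preset radius, smallest radius first."""
    def step(radius):
        out = []
        pixels = list(table.keys())
        for (u, v) in pixels:
            group = [table[p] for p in pixels
                     if (p[0] - u) ** 2 + (p[1] - v) ** 2 <= radius ** 2]
            if len(group) > 1:  # the pixel itself plus at least one neighbour
                out.append(tuple(sum(c) / len(group) for c in zip(*group)))
        return out

    collected = []
    for r in sorted(radii):     # radii applied in increasing order, as in the embodiment
        collected.extend(step(r))
    return collected
```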
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the mapping generation method according to the present embodiment. In the application scenario of fig. 3, the execution subject 301 acquires discrete points 302 of a lane line in a high-precision map and projects them onto an image captured by a roadside camera. The execution subject 301 generates a first mapping relation table 303 indicating the mapping relation between the coordinates of the projected pixel points and the world coordinates of the discrete points. The execution subject 301 takes the first mapping relation table as the target mapping relation table and, for each pixel point of the target mapping relation table, performs an interpolation step 304 on the world coordinate corresponding to the pixel point using a preset radius: in response to determining that pixel points other than the given pixel point exist within the preset radius centered on it, the world coordinate corresponding to the pixel point and the world coordinates corresponding to the other pixel points are interpolated to obtain a new world coordinate 305. The execution subject 301 projects the new world coordinates onto the image and generates a second mapping relation table 306 indicating the mapping relation between the coordinates of the pixel points onto which the new world coordinates are projected and the new world coordinates 305.
With further reference to fig. 4a, a flow 400 of yet another embodiment of a method of generating a mapping is shown. The process 400 includes the following steps:
step 401, obtaining discrete points of a lane line in a high-precision map, and projecting the discrete points to an image shot by a road side camera.
In the present embodiment, an execution subject (for example, a server or a terminal device shown in fig. 1) on which the generation method of the map operates may acquire discrete points of a lane line in a high-precision map and project the discrete points into an image captured by a roadside camera. Specifically, the discrete points of the lane lines in the high-precision map may include discrete points of lane lines drawn on an actual road surface, and may also include discrete points of lane lines not drawn on the road surface, such as discrete points of lane lines of a vehicle turnaround lane.
Step 402, a first mapping relation table indicating the mapping relation between the coordinates of the pixel points projected by the discrete points and the world coordinates of the discrete points is generated.
In this embodiment, the execution subject may generate a first mapping relationship table, where the mapping relationship table may indicate a mapping relationship between coordinates of a pixel point onto which a discrete point is projected and world coordinates of the discrete point.
Step 403, taking the first mapping relation table as a target mapping relation table and, for each pixel point of the target mapping relation table, performing an interpolation step on the world coordinate corresponding to the pixel point using a preset radius: in response to determining that pixel points other than the given pixel point exist within the preset radius centered on it, interpolating the world coordinate corresponding to the pixel point with the world coordinates corresponding to those other pixel points to obtain a new world coordinate.
In this embodiment, the executing body may take the first mapping relation table as the target mapping relation table and perform the interpolation step on the world coordinate corresponding to each pixel point in the table. The interpolation step here may comprise: in response to determining that pixel points other than the given pixel point exist within the preset radius centered on it, interpolating the world coordinate corresponding to the pixel point with the world coordinates corresponding to those other pixel points to obtain a new world coordinate.
Step 404, projecting the new world coordinate to the image, and generating a second mapping relation table indicating the mapping relation between the coordinate of the pixel point projected by the new world coordinate and the new world coordinate.
In this embodiment, the executing entity may project a new world coordinate into the image, so as to generate a second mapping relationship table, where the second mapping relationship table may be used to represent a mapping relationship between a coordinate of a pixel point onto which the new world coordinate is projected and the new world coordinate.
Step 405, executing a horizontal interpolation processing step: interpolating the world coordinates corresponding to each row of pixel points in the second mapping relation table to obtain new world coordinates.
In this embodiment, the executing body may execute a horizontal interpolation processing step, that is, interpolate the world coordinates corresponding to each row of pixel points in the second mapping relation table to obtain new world coordinates. The executing body may then project the new world coordinates onto the image and add the resulting mapping relations to the second mapping relation table.
Step 406, projecting the new world coordinates onto the image, and adding the mapping relations between the coordinates of the pixel points onto which the new world coordinates are projected and the new world coordinates into the second mapping relation table.
In the present embodiment, the new world coordinates here refer to new world coordinates obtained by performing the horizontal interpolation processing step. The execution main body can project the new world coordinate, establish a mapping relation between the new world coordinate and the projected coordinate of the pixel point, and add the mapping relation into a second mapping relation table.
Through the horizontal interpolation processing, the present embodiment can increase the density of the world coordinates corresponding to the pixel points of each row.
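Row-wise (horizontal) interpolation can be sketched as linear interpolation between consecutive mapped pixels in each image row. Linear interpolation is an assumption here, as the patent does not name the formula:

```python
def horizontal_interpolate(table):
    """Fill gaps between consecutive mapped pixels in each row by linear interpolation."""
    new_coords = []
    rows = {}
    for (u, v) in table:
        rows.setdefault(v, []).append(u)       # group mapped pixels by image row
    for v, us in rows.items():
        us.sort()
        for left, right in zip(us, us[1:]):
            w0, w1 = table[(left, v)], table[(right, v)]
            for u in range(left + 1, right):   # unmapped pixels between the pair
                t = (u - left) / (right - left)
                new_coords.append(tuple(a + t * (b - a) for a, b in zip(w0, w1)))
    return new_coords
```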
In some optional implementations of this embodiment, the method may further include executing a longitudinal (vertical) interpolation processing step: interpolating the world coordinates corresponding to each column of pixel points in the second mapping relation table to obtain new world coordinates; and projecting the new world coordinates onto the image and adding the mapping relations between the coordinates of the pixel points onto which the new world coordinates are projected and the new world coordinates into the second mapping relation table. The horizontal interpolation processing step is executed before and/or after the vertical interpolation processing step.
In these alternative implementations, the execution body may execute a vertical interpolation processing step and obtain new world coordinates. And projecting the new world coordinate to obtain a mapping relation between the new world coordinate and the coordinate of the pixel point projected by the projection. The executing body may add the mapping relationship to a second mapping relationship table, so that the mapping relationship in the second mapping relationship table relates to more comprehensive image pixel points.
In practice, the executing body described above may execute the vertical interpolation processing step after executing the horizontal interpolation processing step. In addition, the executing body may further execute the horizontal interpolation processing step again to further refine the second mapping relation table.
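The horizontal, vertical, then second horizontal sequence can be written as one axis-parameterized pass applied three times. For brevity this sketch keys new entries directly by pixel coordinate instead of re-projecting the new world coordinates, which is a simplification of the patented flow; all names are illustrative:

```python
def axis_interpolate(table, axis):
    """Linearly interpolate along rows (axis=0) or columns (axis=1) of the table."""
    new = {}
    groups = {}
    for pix in table:
        groups.setdefault(pix[1 - axis], []).append(pix[axis])  # fix the other coordinate
    for key, coords in groups.items():
        coords.sort()
        for a, b in zip(coords, coords[1:]):
            pa = (a, key) if axis == 0 else (key, a)
            pb = (b, key) if axis == 0 else (key, b)
            wa, wb = table[pa], table[pb]
            for c in range(a + 1, b):          # fill the gap between mapped pixels
                t = (c - a) / (b - a)
                pix = (c, key) if axis == 0 else (key, c)
                new[pix] = tuple(x + t * (y - x) for x, y in zip(wa, wb))
    return new

def refine(table):
    """Horizontal pass, vertical pass, then a second horizontal pass."""
    for axis in (0, 1, 0):
        table = {**table, **axis_interpolate(table, axis)}
    return table
```

Starting from four mapped corner pixels, this fills the interior grid: the first horizontal pass bridges each row, the vertical pass bridges the columns (including the newly created ones), and the final horizontal pass catches any remaining row gaps.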
The implementation modes can enable the mapping relation in the second mapping relation table to relate to more comprehensive image pixel points, and the second mapping relation table is perfected through interpolation processing in different directions.
As shown in fig. 4b, the upper diagram shows the image before the horizontal interpolation step and the corresponding projection are performed. The middle diagram shows the result of projecting onto the image the new world coordinates obtained by applying the horizontal interpolation step to the upper diagram. The executing body then applies the vertical interpolation step to the world coordinates corresponding to each column of pixel points in the second mapping relation table (updated with that projection result) and projects the resulting new world coordinates onto the image; the lower diagram of fig. 4b shows this projection result.
With further reference to fig. 5, as an implementation of the method shown in the above figures, the present application provides an embodiment of a device for generating a map, where the embodiment of the device corresponds to the embodiment of the method shown in fig. 2, and besides the features described below, the embodiment of the device may further include the same or corresponding features or effects as the embodiment of the method shown in fig. 2. The device can be applied to various electronic equipment.
As shown in fig. 5, the map generation apparatus 500 of the present embodiment includes: an acquisition unit 501, a first generation unit 502, a first interpolation unit 503, and a second generation unit 504. The acquiring unit 501 is configured to acquire discrete points of a lane line in a high-precision map and project the discrete points to an image shot by a roadside camera; a first generation unit 502 configured to generate a first mapping relationship table indicating a mapping relationship of coordinates of a pixel point to which a discrete point is projected and world coordinates of the discrete point; a first interpolation unit 503, configured to use the first mapping relationship table as a target mapping relationship table, and for each pixel point of the target mapping relationship table, perform an interpolation step on the world coordinate corresponding to the pixel point by using a preset radius: in response to the fact that other pixel points except the pixel point exist in the preset radius by taking the pixel point as the circle center, carrying out interpolation processing on the world coordinate corresponding to the pixel point and the world coordinates corresponding to the other pixel points to obtain a new world coordinate; a second generating unit 504 configured to project the new world coordinates to the image, and generate a second mapping relationship table indicating a mapping relationship of coordinates of the pixel point to which the new world coordinates are projected and the new world coordinates.
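The per-pixel step of the first interpolation unit 503 can be illustrated with a short sketch. Averaging the world coordinates of the pixel and of all other pixels inside the preset radius is one plausible reading of "interpolation processing"; the averaging scheme and the names below are assumptions, not the patent's prescribed formula.

```python
# Sketch of the radius-based interpolation step: for each pixel in the
# mapping table, if other pixels fall inside a circle of the preset
# radius centered on it, average their world coordinates into a new one.
import math

def interpolate_with_radius(table, radius):
    new_world_coords = []
    pixels = list(table.items())
    for (u, v), world in pixels:
        neighbors = [w for (nu, nv), w in pixels
                     if (nu, nv) != (u, v)
                     and math.hypot(nu - u, nv - v) <= radius]
        if neighbors:  # other pixel points exist within the preset radius
            pts = [world] + neighbors
            new_world_coords.append(
                tuple(sum(c) / len(pts) for c in zip(*pts)))
    return new_world_coords

table = {(0, 0): (0.0, 0.0, 0.0), (2, 0): (2.0, 0.0, 0.0),
         (50, 50): (9.0, 9.0, 0.0)}
print(interpolate_with_radius(table, radius=3.0))
# -> [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0)]  (the isolated pixel yields nothing)
```

The new world coordinates returned here would then be projected back onto the image by the second generation unit to populate the second mapping relation table.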
In this embodiment, specific processes of the obtaining unit 501, the first generating unit 502, the first interpolating unit 503 and the second generating unit 504 of the mapping generating apparatus 500 and technical effects brought by the specific processes can refer to related descriptions of step 201, step 202, step 203 and step 204 in the corresponding embodiment of fig. 2, which are not repeated herein.
In some optional implementations of this embodiment, the apparatus further includes: the second interpolation unit is configured to use the second mapping relation table as a target mapping relation table, and for each pixel point of the target mapping relation table, an interpolation step is executed on world coordinates corresponding to the pixel point by adopting a preset radius; and the projection unit is configured to project the new world coordinate generated by the interpolation step executed this time to the image, and add the mapping relation between the coordinate of the pixel point projected by the new world coordinate and the new world coordinate into the second mapping relation table.
In some optional implementations of this embodiment, the preset radius includes at least two preset radii; the second interpolation unit is further configured to execute the interpolation step on the world coordinates corresponding to each pixel point of the target mapping relation table by adopting a preset radius according to the following mode: and for each pixel point of the target mapping relation table, sequentially adopting each preset radius of at least two preset radii according to the sequence of the preset radii from small to large, and performing an interpolation step on the world coordinate corresponding to the pixel point.
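The small-to-large ordering of preset radii can be sketched as a simple loop. `interpolate_step` below is a hypothetical placeholder for the per-pixel interpolation described above; the point of the sketch is only the ascending order, which combines nearby points before more distant ones.

```python
# Sketch: run the interpolation step once per preset radius, smallest first.

def interpolate_step(table, radius):
    # Placeholder: a real implementation would interpolate world
    # coordinates within this radius; here we just record the pass.
    return {f"pass_r{radius}": sorted(table)}

def multi_radius_interpolation(table, radii):
    results = {}
    for r in sorted(radii):  # ascending: small radii are applied first
        results.update(interpolate_step(table, r))
    return results

out = multi_radius_interpolation({(0, 0): (0.0, 0.0, 0.0)},
                                 radii=[30, 10, 20])
print(list(out))  # -> ['pass_r10', 'pass_r20', 'pass_r30']
```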
In some optional implementations of this embodiment, the apparatus further includes: a lateral interpolation unit configured to perform a lateral interpolation processing step: carrying out interpolation processing on world coordinates corresponding to each row of pixel points in the second mapping relation table to obtain new world coordinates; and the first adding unit is configured to project the new world coordinate to the image, and add the mapping relation between the coordinate of the pixel point to which the new world coordinate is projected and the new world coordinate into a second mapping relation table.
In some optional implementations of this embodiment, the apparatus further includes: a vertical interpolation unit configured to perform a vertical interpolation processing step: carrying out interpolation processing on world coordinates corresponding to each column of pixel points in the second mapping relation table to obtain new world coordinates; and a second adding unit configured to project the new world coordinate to the image, and add a mapping relationship between coordinates of the pixel point onto which the new world coordinate is projected and the new world coordinate to the second mapping relation table, wherein the horizontal interpolation processing step is performed before and/or after the vertical interpolation processing step.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 6 is a block diagram of an electronic device for the map generation method according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are by way of example only and are not intended to limit implementations of the present application described and/or claimed herein.
As shown in fig. 6, the electronic apparatus includes: one or more processors 601, memory 602, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories and multiple types of memory, as desired. Also, multiple electronic devices may be connected, with each device providing portions of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). In fig. 6, one processor 601 is taken as an example.
The memory 602 is a non-transitory computer readable storage medium as provided herein. The memory stores instructions executable by the at least one processor to cause the at least one processor to perform the map generation method provided herein. The non-transitory computer-readable storage medium of the present application stores computer instructions for causing a computer to perform the map generation method provided herein.
The memory 602, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as the program instructions/modules corresponding to the map generation method in the embodiment of the present application (for example, the acquisition unit 501, the first generation unit 502, the first interpolation unit 503, and the second generation unit 504 shown in fig. 5). The processor 601 executes various functional applications of the server and performs data processing, i.e., implements the map generation method in the above method embodiment, by running the non-transitory software programs, instructions, and modules stored in the memory 602.
The memory 602 may include a storage program area and a storage data area, wherein the storage program area may store an operating system and an application program required for at least one function; the storage data area may store data created according to the use of the map-generation electronic device, and the like. Further, the memory 602 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 602 optionally includes memory located remotely from the processor 601, and these remote memories may be connected to the map-generation electronic device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device for the map generation method may further include: an input device 603 and an output device 604. The processor 601, the memory 602, the input device 603, and the output device 604 may be connected by a bus or other means; in fig. 6, connection by a bus is taken as an example.
The input device 603 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the map-generation electronic device; examples include a touch screen, keypad, mouse, track pad, touch pad, pointing stick, one or more mouse buttons, track ball, joystick, or other input devices. The output device 604 may include a display device, auxiliary lighting devices (e.g., LEDs), and tactile feedback devices (e.g., vibrating motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, ASICs (application-specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship with each other.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes an acquisition unit, a first generation unit, a first interpolation unit, and a second generation unit. The names of these units do not in some cases constitute a limitation on the unit itself; for example, the acquisition unit may also be described as "a unit that acquires discrete points of a lane line in a high-precision map and projects the discrete points onto an image captured by a roadside camera".
The application also provides roadside equipment comprising the electronic equipment.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments; or may be present separately and not assembled into the device. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquiring discrete points of a lane line in a high-precision map, and projecting the discrete points to an image shot by a roadside camera; generating a first mapping relation table indicating the mapping relation between the coordinates of the pixel points projected by the discrete points and the world coordinates of the discrete points; taking the first mapping relation table as a target mapping relation table, and for each pixel point of the target mapping relation table, performing interpolation on world coordinates corresponding to the pixel point by adopting a preset radius: in response to the fact that other pixel points except the pixel point exist in the preset radius by taking the pixel point as the circle center, carrying out interpolation processing on the world coordinate corresponding to the pixel point and the world coordinates corresponding to the other pixel points to obtain a new world coordinate; and projecting the new world coordinate to the image, and generating a second mapping relation table indicating the mapping relation between the coordinate of the pixel point projected by the new world coordinate and the new world coordinate.
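The program steps above can be sketched end to end under a simplified model. The pinhole intrinsics, the sample lane-line points, the radius value, and the pairwise-midpoint interpolation below are all assumptions for illustration; a real roadside deployment would use the camera's calibrated intrinsic and extrinsic parameters and the high-precision map's actual discrete points.

```python
# End-to-end sketch: project discrete points, build the first mapping
# table, interpolate within a preset radius, project the new world
# coordinates, and build the second mapping table.
import math

FX, FY, CX, CY = 1000.0, 1000.0, 640.0, 360.0  # assumed pinhole intrinsics

def project(world):
    """Project a world point (camera frame, z > 0) to integer pixel coords."""
    x, y, z = world
    return int(round(FX * x / z + CX)), int(round(FY * y / z + CY))

# Steps 1-2: project lane-line discrete points, build the first table.
discrete_points = [(0.0, 1.5, 10.0), (0.5, 1.5, 10.0), (3.0, 1.5, 12.0)]
first_table = {project(p): p for p in discrete_points}

# Step 3: radius-based interpolation (pairwise midpoints as one choice).
radius = 60.0
new_points = []
pixels = list(first_table.items())
for i, ((u, v), w) in enumerate(pixels):
    for (nu, nv), nw in pixels[i + 1:]:
        if math.hypot(nu - u, nv - v) <= radius:
            new_points.append(tuple((a + b) / 2 for a, b in zip(w, nw)))

# Step 4: project the new world coordinates into the second table.
second_table = {project(p): p for p in new_points}
print(len(first_table), len(second_table))  # -> 3 1
```

Only the two pixels closer than the preset radius produce a new world coordinate here; the isolated third point contributes nothing, matching the "other pixel points exist within the preset radius" condition.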
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (13)

1. A method of generating a map, the method comprising:
acquiring discrete points of a lane line in a high-precision map, and projecting the discrete points to an image shot by a roadside camera;
generating a first mapping relation table indicating the mapping relation between the coordinates of the pixel points projected by the discrete points and the world coordinates of the discrete points;
taking the first mapping relation table as a target mapping relation table, and for each pixel point of the target mapping relation table, performing interpolation on world coordinates corresponding to the pixel point by adopting a preset radius: in response to determining that other pixel points except the pixel point exist in the preset radius by taking the pixel point as a circle center, carrying out interpolation processing on the world coordinate corresponding to the pixel point and the world coordinates corresponding to the other pixel points to obtain a new world coordinate;
and projecting the new world coordinate to the image, and generating a second mapping relation table indicating the mapping relation between the coordinate of the pixel point to which the new world coordinate is projected and the new world coordinate.
2. The method of claim 1, wherein the method further comprises:
taking the second mapping relation table as a target mapping relation table, and for each pixel point of the target mapping relation table, executing the interpolation step on the world coordinate corresponding to the pixel point by adopting a preset radius;
projecting the new world coordinate generated by executing the interpolation step to the image, and adding the mapping relation between the coordinate of the pixel point projected by the new world coordinate and the new world coordinate into the second mapping relation table.
3. The method of claim 1 or 2, wherein the preset radius comprises at least two preset radii;
for each pixel point of the target mapping relation table, performing interpolation on the world coordinate corresponding to the pixel point by adopting a preset radius, wherein the interpolation comprises the following steps:
and for each pixel point of the target mapping relation table, sequentially adopting each preset radius of the at least two preset radii according to the sequence of the preset radii from small to large, and executing an interpolation step on the world coordinate corresponding to the pixel point.
4. The method according to claim 1 or 2, wherein the method further comprises:
and executing a transverse interpolation processing step: carrying out interpolation processing on world coordinates corresponding to each row of pixel points in the second mapping relation table to obtain new world coordinates;
projecting the new world coordinate to the image, and adding the mapping relation between the coordinate of the pixel point projected by the new world coordinate and the new world coordinate into the second mapping relation table.
5. The method of claim 4, wherein the method further comprises:
executing a longitudinal interpolation processing step: carrying out interpolation processing on world coordinates corresponding to each column of pixel points in the second mapping relation table to obtain new world coordinates;
projecting the new world coordinate to the image, and adding the mapping relation between the coordinate of the pixel point projected by the new world coordinate and the new world coordinate into the second mapping relation table, wherein the horizontal interpolation processing step is executed before and/or after the vertical interpolation processing step.
6. An apparatus for generating a map, the apparatus comprising:
an acquisition unit configured to acquire discrete points of a lane line in a high-precision map, project the discrete points to an image captured by a roadside camera;
a first generation unit configured to generate a first mapping relationship table indicating a mapping relationship of coordinates of a pixel point to which a discrete point is projected and world coordinates of the discrete point;
a first interpolation unit, configured to use the first mapping relation table as a target mapping relation table, and for each pixel point of the target mapping relation table, perform an interpolation step on a world coordinate corresponding to the pixel point by using a preset radius: in response to determining that other pixel points except the pixel point exist in the preset radius by taking the pixel point as a circle center, carrying out interpolation processing on the world coordinate corresponding to the pixel point and the world coordinates corresponding to the other pixel points to obtain a new world coordinate;
a second generation unit configured to project the new world coordinate to the image, and generate a second mapping relationship table indicating a mapping relationship of coordinates of a pixel point to which the new world coordinate is projected and the new world coordinate.
7. The apparatus of claim 6, wherein the apparatus further comprises:
a second interpolation unit, configured to use the second mapping relation table as a target mapping relation table, and for each pixel point of the target mapping relation table, perform an interpolation step on a world coordinate corresponding to the pixel point by using a preset radius;
and the projection unit is configured to project the new world coordinate generated by executing the interpolation step at this time to the image, and add the mapping relation between the coordinate of the pixel point to which the new world coordinate is projected and the new world coordinate into the second mapping relation table.
8. The apparatus of claim 6 or 7, wherein the preset radius comprises at least two preset radii;
the second interpolation unit is further configured to execute the step of performing interpolation on the world coordinate corresponding to each pixel point of the target mapping relation table by adopting a preset radius according to the following manner:
and for each pixel point of the target mapping relation table, sequentially adopting each preset radius of the at least two preset radii according to the sequence of the preset radii from small to large, and executing an interpolation step on the world coordinate corresponding to the pixel point.
9. The apparatus of claim 6 or 7, wherein the apparatus further comprises:
a lateral interpolation unit configured to perform a lateral interpolation processing step: carrying out interpolation processing on world coordinates corresponding to each row of pixel points in the second mapping relation table to obtain new world coordinates;
and the first adding unit is configured to project the new world coordinate to the image, and add the mapping relation between the coordinate of the pixel point to which the new world coordinate is projected and the new world coordinate to the second mapping relation table.
10. The apparatus of claim 9, wherein the apparatus further comprises:
a vertical interpolation unit configured to perform a vertical interpolation processing step: carrying out interpolation processing on world coordinates corresponding to each column of pixel points in the second mapping relation table to obtain new world coordinates;
a second adding unit configured to project the new world coordinate to the image, and add a mapping relationship between coordinates of a pixel point to which the new world coordinate is projected and the new world coordinate to the second mapping relationship table, wherein the horizontal interpolation processing step is performed before and/or after the vertical interpolation processing step.
11. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-5.
12. A computer-readable storage medium, on which a computer program is stored, which program, when being executed by a processor, carries out the method according to any one of claims 1-5.
13. A roadside apparatus comprising the electronic apparatus of claim 11.
CN202011001881.0A 2020-09-22 2020-09-22 Mapping generation method and device and road side equipment Active CN112037316B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011001881.0A CN112037316B (en) 2020-09-22 2020-09-22 Mapping generation method and device and road side equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011001881.0A CN112037316B (en) 2020-09-22 2020-09-22 Mapping generation method and device and road side equipment

Publications (2)

Publication Number Publication Date
CN112037316A true CN112037316A (en) 2020-12-04
CN112037316B CN112037316B (en) 2024-04-16

Family

ID=73574876

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011001881.0A Active CN112037316B (en) 2020-09-22 2020-09-22 Mapping generation method and device and road side equipment

Country Status (1)

Country Link
CN (1) CN112037316B (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102141398A (en) * 2010-12-28 2011-08-03 北京航空航天大学 Monocular vision-based method for measuring positions and postures of multiple robots
US20120116675A1 (en) * 2010-11-10 2012-05-10 International Business Machines Corporation Navigation on Maps of Irregular Scales or Variable Scales
CN103247030A (en) * 2013-04-15 2013-08-14 丹阳科美汽车部件有限公司 Fisheye image correction method of vehicle panoramic display system based on spherical projection model and inverse transformation model
CN105224908A (en) * 2014-07-01 2016-01-06 北京四维图新科技股份有限公司 A kind of roadmarking acquisition method based on orthogonal projection and device
WO2017080280A1 (en) * 2015-11-13 2017-05-18 杭州海康威视数字技术股份有限公司 Depth image composition method and apparatus
DE102016226336A1 (en) * 2016-12-30 2018-07-05 Siemens Healthcare Gmbh Method and device for generating a two-dimensional projection image from a three-dimensional image data set
CN108830907A (en) * 2018-06-15 2018-11-16 深圳市易尚展示股份有限公司 Projection surveying method and system based on monocular system
CN108986161A (en) * 2018-06-19 2018-12-11 亮风台(上海)信息科技有限公司 A kind of three dimensional space coordinate estimation method, device, terminal and storage medium
CN109015630A (en) * 2018-06-21 2018-12-18 深圳辰视智能科技有限公司 Hand and eye calibrating method, system and the computer storage medium extracted based on calibration point
CN109146932A (en) * 2018-07-17 2019-01-04 北京旷视科技有限公司 Determine the methods, devices and systems of the world coordinates of target point in image
CN110163930A (en) * 2019-05-27 2019-08-23 北京百度网讯科技有限公司 Lane line generation method, device, equipment, system and readable storage medium storing program for executing
US20200082183A1 (en) * 2018-09-07 2020-03-12 Baidu Online Network Technology (Beijing) Co., Ltd. Method for position detection, device, and storage medium
WO2020069460A1 (en) * 2018-09-28 2020-04-02 Bounce Imaging, Inc. Panoramic camera and image processing systems and methods
CN111340864A (en) * 2020-02-26 2020-06-26 浙江大华技术股份有限公司 Monocular estimation-based three-dimensional scene fusion method and device
CN111578839A (en) * 2020-05-25 2020-08-25 北京百度网讯科技有限公司 Obstacle coordinate processing method and device, electronic equipment and readable storage medium

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120116675A1 (en) * 2010-11-10 2012-05-10 International Business Machines Corporation Navigation on Maps of Irregular Scales or Variable Scales
CN102141398A (en) * 2010-12-28 2011-08-03 北京航空航天大学 Monocular vision-based method for measuring positions and postures of multiple robots
CN103247030A (en) * 2013-04-15 2013-08-14 丹阳科美汽车部件有限公司 Fisheye image correction method of vehicle panoramic display system based on spherical projection model and inverse transformation model
CN105224908A (en) * 2014-07-01 2016-01-06 北京四维图新科技股份有限公司 A kind of roadmarking acquisition method based on orthogonal projection and device
WO2017080280A1 (en) * 2015-11-13 2017-05-18 杭州海康威视数字技术股份有限公司 Depth image composition method and apparatus
DE102016226336A1 (en) * 2016-12-30 2018-07-05 Siemens Healthcare Gmbh Method and device for generating a two-dimensional projection image from a three-dimensional image data set
CN108830907A (en) * 2018-06-15 2018-11-16 深圳市易尚展示股份有限公司 Projection surveying method and system based on monocular system
CN108986161A (en) * 2018-06-19 2018-12-11 亮风台(上海)信息科技有限公司 A kind of three dimensional space coordinate estimation method, device, terminal and storage medium
CN109015630A (en) * 2018-06-21 2018-12-18 深圳辰视智能科技有限公司 Hand and eye calibrating method, system and the computer storage medium extracted based on calibration point
CN109146932A (en) * 2018-07-17 2019-01-04 北京旷视科技有限公司 Determine the methods, devices and systems of the world coordinates of target point in image
US20200082183A1 (en) * 2018-09-07 2020-03-12 Baidu Online Network Technology (Beijing) Co., Ltd. Method for position detection, device, and storage medium
WO2020069460A1 (en) * 2018-09-28 2020-04-02 Bounce Imaging, Inc. Panoramic camera and image processing systems and methods
CN110163930A (en) * 2019-05-27 2019-08-23 北京百度网讯科技有限公司 Lane line generation method, device, equipment, system and readable storage medium storing program for executing
CN111340864A (en) * 2020-02-26 2020-06-26 浙江大华技术股份有限公司 Monocular estimation-based three-dimensional scene fusion method and device
CN111578839A (en) * 2020-05-25 2020-08-25 北京百度网讯科技有限公司 Obstacle coordinate processing method and device, electronic equipment and readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CAI Zewei: "Research on Digital Fringe Projection Three-Dimensional Measurement Technology", China Doctoral Dissertations Full-text Database, Information Science and Technology, no. 07, 15 July 2017 (2017-07-15), pages 138-31 *

Also Published As

Publication number Publication date
CN112037316B (en) 2024-04-16

Similar Documents

Publication Publication Date Title
US11615605B2 (en) Vehicle information detection method, electronic device and storage medium
CN110595494B (en) Map error determination method and device
CN112101209B (en) Method and apparatus for determining world coordinate point cloud for roadside computing device
CN111462029B (en) Visual point cloud and high-precision map fusion method and device and electronic equipment
CN111626206A (en) High-precision map construction method and device, electronic equipment and computer storage medium
CN111553844B (en) Method and device for updating point cloud
CN112652016A (en) Point cloud prediction model generation method, pose estimation method and device
CN111079079B (en) Data correction method, device, electronic equipment and computer readable storage medium
EP3904829B1 (en) Method and apparatus for generating information, device, medium and computer program product
CN111722245A (en) Positioning method, positioning device and electronic equipment
KR102643425B1 Method, apparatus, electronic device, storage device, roadside instrument, cloud control platform and program product for detecting a vehicle's lane change
CN111721281A (en) Position identification method and device and electronic equipment
CN112344855A (en) Obstacle detection method and device, storage medium and roadside equipment
CN111311743B (en) Three-dimensional reconstruction precision testing method and device and electronic equipment
CN112184914A (en) Method and device for determining three-dimensional position of target object and road side equipment
CN113483774A (en) Navigation method, navigation device, electronic equipment and readable storage medium
CN111597287A (en) Map generation method, device and equipment
CN111949816B (en) Positioning processing method, device, electronic equipment and storage medium
CN111400537B (en) Road element information acquisition method and device and electronic equipment
CN111612851B (en) Method, apparatus, device and storage medium for calibrating camera
CN112102417A (en) Method and device for determining world coordinates and external reference calibration method for vehicle-road cooperative roadside camera
CN111833391A (en) Method and device for estimating image depth information
CN111814651A (en) Method, device and equipment for generating lane line
CN111260722A (en) Vehicle positioning method, apparatus and storage medium
CN114266876B (en) Positioning method, visual map generation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20211011

Address after: 100176 101, floor 1, building 1, yard 7, Ruihe West 2nd Road, Beijing Economic and Technological Development Zone, Daxing District, Beijing

Applicant after: Apollo Zhilian (Beijing) Technology Co.,Ltd.

Address before: 2/F, Baidu Building, No. 10, Shangdi 10th Street, Haidian District, Beijing 100085

Applicant before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.

GR01 Patent grant