CN111524185A - Positioning method and device, electronic equipment and storage medium
- Publication number: CN111524185A
- Application number: CN202010316617.XA
- Authority: CN (China)
- Prior art keywords: position data, data, target, positioning, coordinate system
- Legal status: Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
Abstract
The disclosure relates to a positioning method and apparatus, an electronic device and a storage medium, wherein the method comprises: acquiring first position data of a reference object according to an electronic map; acquiring second position data of the reference object according to a target image shot by an image acquisition device; and positioning the target object according to the first position data and the second position data.
Description
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a positioning method and apparatus, an electronic device, and a storage medium.
Background
Positioning is an important technology in fields such as automatic driving; automatic driving in the true sense cannot be realized if the accurate position of a vehicle cannot be determined. Generally, the Global Positioning System (GPS) is the most widely used positioning technology, but its positioning error is large (usually about 10 m), which makes it difficult to meet the accuracy requirements of automatic driving. In the related art, laser radar can be used to improve positioning accuracy, but laser radar is costly, which is not conducive to the popularization of automatic driving technology.
Disclosure of Invention
The disclosure provides a positioning method and device, an electronic device and a storage medium.
According to an aspect of the present disclosure, there is provided a positioning method including:
acquiring first position data of a reference object according to an electronic map;
acquiring second position data of the reference object according to a target image shot by image acquisition equipment;
and positioning the target object according to the first position data and the second position data.
According to the positioning method of the embodiments of the present disclosure, the target object can be positioned using the first position data of the reference object in the electronic map and the second position data of the reference object in the target image; that is, the positioning accuracy of the target object can be improved by taking the position data of the reference object as a basis, so that the positioning accuracy meets the requirements of automatic driving. Moreover, since the positioning precision is improved using only a target image captured by an image acquisition device and an electronic map, the cost of use is low, which facilitates use and popularization.
In one possible implementation, the locating the target object according to the first position data and the second position data includes:
acquiring first positioning data of the target object;
and correcting the first positioning data according to the first position data and the second position data to obtain second positioning data, wherein the second positioning data is used for representing a positioning result of the target object.
In this way, the deviation between the first positioning data and the actual position can be represented by the deviation between the first position data of the reference object in the electronic map and the second position data of the reference object in the target image, so that the error of the first positioning data is corrected and the precision of the first positioning data can be improved.
In one possible implementation, the first positioning data and the second positioning data include geographic coordinates of the reference object in a geographic coordinate system;
said locating the target object according to the first location data and the second location data comprises:
determining a position correction parameter according to the position relation of the first position data and the second position data of the reference object in a target coordinate system, wherein the target coordinate system is a coordinate system established according to the target object;
and correcting the first positioning data according to the position correction parameter to obtain second positioning data, wherein the second positioning data is used for representing a positioning result of the target object.
In this way, the position correction parameters may be determined using the first position data and the second position data of the reference object, i.e., the error between the first positioning data and the actual geographic position may be obtained, which may be used to correct the first positioning data to improve the positioning accuracy.
In one possible implementation, the position correction parameters include a rotation matrix;
the determining a position correction parameter according to a position relationship of the first position data and the second position data of the reference object in a target coordinate system includes:
acquiring a rotation angle between the first position data and the second position data;
and determining the rotation matrix according to the rotation angle.
In this way, the rotation angle and the rotation matrix for correcting the angular deviation can be obtained from the first position data and the second position data, the direction of the first positioning data can be corrected, the positioning accuracy is improved, and the traveling direction and angle of the target object can be obtained.
In one possible implementation, the position correction parameter includes a translation component, the translation component includes a first translation component and a second translation component, the first translation component is perpendicular to the second translation component, and the reference object corresponding to the target position data is parallel to the first translation component;
determining a position correction parameter according to the first position data and the second position data of the reference object, including:
acquiring a rotation angle between the first position data and the second position data;
and determining the first translation component and the second translation component according to the rotation angle and the position data to be adjusted, wherein the target position data is the first position data or the second position data, and the position data to be adjusted is position data except the target position data in the first position data and the second position data.
In this way, the first translational component and the second translational component can be obtained by the rotation angle, the position error of the first positioning data can be corrected, and the positioning accuracy is improved.
In one possible implementation, the first translation component and the second translation component are unit vectors.
In this way, the storage of the first translational component and the second translational component can be facilitated, while the adjustment amplitude of the position data to be adjusted is determined by the unit vector.
In a possible implementation manner, the acquiring first position data of the reference object according to the electronic map includes:
acquiring the geographic coordinates of the reference object in the electronic map;
and performing coordinate transformation processing on the geographic coordinates of the reference object to obtain first position data of the reference object in a target coordinate system in the electronic map, wherein the target coordinate system is a coordinate system established according to the target object.
In this way, the geographic coordinates of the reference object can be transformed into the target coordinate system, so that the positioning error can be conveniently determined in the target coordinate system, and the correction of the first positioning data is facilitated.
In a possible implementation manner, the acquiring second position data of the reference object according to the target image captured by the image capturing device includes:
performing semantic segmentation processing on the target image to obtain the position coordinates of the reference object in the target image;
and performing coordinate transformation processing on the position coordinates to obtain second position data of the reference object in the target image in a target coordinate system, wherein the target coordinate system is a coordinate system established according to the target object.
In this way, the position coordinates of the reference object in the target image can be converted into the target coordinate system, and the second position data can be obtained, so that the position coordinates of the reference object in the target image and the coordinate system of the geographic coordinates of the reference object in the electronic map can be unified, the error between the first position data and the second position data can be conveniently obtained, and the first positioning data can be favorably corrected.
In a possible implementation manner, the performing coordinate transformation processing on the position coordinates to obtain second position data of the reference object in the target coordinate system in the target image includes:
determining a homography matrix according to the position coordinates of at least part of pixel points in the first image and the position coordinates of points corresponding to the at least part of pixel points in the target coordinate system;
and performing coordinate transformation processing on the position coordinates of the reference object in the target image according to the homography matrix to obtain the second position data.
In this way, the homography matrix can be obtained through the first image to obtain the corresponding relation between the target image and the target coordinate system, the position coordinates of the pixel points in the target image can be conveniently transformed into the position coordinates in the target coordinate system, and the coordinate transformation efficiency is improved.
In one possible implementation, the first location data and the second location data comprise point cloud data.
In this way, the expression of the first position data and the second position data in the target coordinate system is facilitated, and the calculation of the position correction parameter between the first position data and the second position data is also facilitated.
In one possible implementation, the reference object includes a lane line, the first positioning data includes satellite positioning data, and the image capturing device is disposed on the target object.
In one possible implementation, the method further includes:
and determining the relative position relationship between the target object and the reference object according to the second positioning data of the target object.
In this way, a more accurate position of the target object may be obtained.
According to an aspect of the present disclosure, there is provided a positioning apparatus including:
the first acquisition module is used for acquiring first position data of the reference object according to the electronic map;
the second acquisition module is used for acquiring second position data of the reference object according to a target image shot by the image acquisition equipment;
and the positioning module is used for positioning the target object according to the first position data and the second position data.
In one possible implementation, the location module is further configured to:
acquiring first positioning data of the target object;
and correcting the first positioning data according to the first position data and the second position data to obtain second positioning data, wherein the second positioning data is used for representing a positioning result of the target object.
In one possible implementation, the first positioning data and the second positioning data include geographic coordinates of the reference object in a geographic coordinate system;
the positioning module is further configured to:
determining a position correction parameter according to the position relation of the first position data and the second position data of the reference object in a target coordinate system, wherein the target coordinate system is a coordinate system established according to the target object;
and correcting the first positioning data according to the position correction parameter to obtain second positioning data, wherein the second positioning data is used for representing a positioning result of the target object.
In one possible implementation, the position correction parameters include a rotation matrix;
the positioning module is further configured to:
acquiring a rotation angle between the first position data and the second position data;
and determining the rotation matrix according to the rotation angle.
In one possible implementation, the position correction parameter includes a translation component, the translation component includes a first translation component and a second translation component, the first translation component is perpendicular to the second translation component, and the reference object corresponding to the target position data is parallel to the first translation component;
the positioning module is further configured to:
acquiring a rotation angle between the first position data and the second position data;
and determining the first translation component and the second translation component according to the rotation angle and the position data to be adjusted, wherein the target position data is the first position data or the second position data, and the position data to be adjusted is position data except the target position data in the first position data and the second position data.
In one possible implementation, the first translation component and the second translation component are unit vectors.
In one possible implementation, the first obtaining module is further configured to:
acquiring the geographic coordinates of the reference object in the electronic map;
and performing coordinate transformation processing on the geographic coordinates of the reference object to obtain first position data of the reference object in a target coordinate system in the electronic map, wherein the target coordinate system is a coordinate system established according to the target object.
In one possible implementation, the second obtaining module is further configured to:
performing semantic segmentation processing on the target image to obtain the position coordinates of the reference object in the target image;
and performing coordinate transformation processing on the position coordinates to obtain second position data of the reference object in the target image in a target coordinate system, wherein the target coordinate system is a coordinate system established according to the target object.
In one possible implementation, the second obtaining module is further configured to:
determining a homography matrix according to the position coordinates of at least part of pixel points in the first image and the position coordinates of points corresponding to the at least part of pixel points in the target coordinate system;
and performing coordinate transformation processing on the position coordinates of the reference object in the target image according to the homography matrix to obtain the second position data.
In one possible implementation, the first location data and the second location data comprise point cloud data.
In one possible implementation, the reference object includes a lane line, the first positioning data includes satellite positioning data, and the image capturing device is disposed on the target object.
In one possible implementation, the apparatus further includes:
and the relative position module is used for determining the relative position relationship between the target object and the reference object according to the second positioning data of the target object.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the above-described positioning method.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the above-described positioning method.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flow chart of a positioning method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an electronic map according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a target image according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of position correction parameters according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram illustrating an application of a positioning method according to an embodiment of the present disclosure;
FIG. 6 is a block diagram of a positioning device according to an embodiment of the present disclosure;
FIG. 7 is a block diagram of an electronic device according to an embodiment of the present disclosure;
fig. 8 is a block diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the present disclosure will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used exclusively herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
The term "and/or" herein is merely an association describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present disclosure.
Fig. 1 is a flowchart of a positioning method according to an embodiment of the present disclosure, as shown in fig. 1, the method includes:
in step S11, first position data of the reference object is acquired from the electronic map;
in step S12, acquiring second position data of the reference object from a target image captured by an image capturing device;
in step S13, the target object is located according to the first position data and the second position data.
According to the positioning method of the embodiments of the present disclosure, the target object can be positioned using the first position data of the reference object in the electronic map and the second position data of the reference object in the target image; that is, the positioning accuracy of the target object can be improved by taking the position data of the reference object as a basis, so that the positioning accuracy meets the requirements of automatic driving. Moreover, since the positioning precision is improved using only a target image captured by an image acquisition device and an electronic map, the cost of use is low, which facilitates use and popularization.
In one possible implementation, the positioning method may be performed by a terminal device or another processing device, where the terminal device may be user equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, or the like. The other processing device may be a server, a cloud server, etc. In some possible implementations, the positioning method may be implemented by a processor calling computer-readable instructions stored in a memory.
In one possible implementation, the electronic map may be a map that is stored and consulted digitally, and may include position data of a plurality of places, roads, objects, and the like, such as a Baidu map or a Google map. In an example, the electronic map may be a high-precision electronic map (hereinafter referred to as a high-precision map). In contrast to an ordinary map, which is intended to be read by a driver, a high-precision map is generally intended for devices such as vehicles; it contains richer, lane-level road information, and that lane information is identified with higher precision. A high-precision map includes rich information; for example, it may include information on reference objects, which may include lane lines, road signs, traffic lights, and the like (the present disclosure does not limit the reference objects). The information on a reference object may include lane line information (e.g., the position of a lane line), lane restriction information (e.g., restrictions indicated by road signs or traffic lights, such as traffic restriction information), and the like. In addition, a high-precision map may include other rich information, such as traffic flow information and traffic control information. The present disclosure does not limit the information included in the high-precision map.
In an example, a lane line may be a marking line indicating a traffic lane. The lane line information may include the category of the lane line, such as a one-way lane, a straight lane, a right-turn lane, or a left-turn lane, and may further include the center line dividing opposite lanes. The lane line information may also include attributes of the lane line, such as solid line, dotted line, double solid line, double dotted line, stop line, or turn guide line. In an example, the lane line information may include geographic coordinate information of the lane line, e.g., its longitude and latitude coordinates. The present disclosure does not limit the kind of information included in the lane line information.
Fig. 2 is a schematic diagram of an electronic map according to an embodiment of the present disclosure. As shown in Fig. 2, the lane line information may indicate, for example, a straight lane, a right-turn lane, or a left-turn lane. In an example, the number of lanes in a road segment (for example, two or three lanes) may be determined from the lane line information, and the lane in which the target object (e.g., a vehicle) is located may then be determined from its position information; for example, the position information may indicate that the target object is in the second lane. This helps the driver determine a driving strategy, or serves as the basis of the driving strategy in automatic driving (for example, deciding to drive straight or to turn). Further, the electronic map may include lane line positions, e.g., the geographic coordinates of a lane line. In an example, a lane line may be a line composed of a plurality of points (e.g., a point cloud), and its geographic coordinates may be represented by the geographic coordinates of each point; that is, the geographic coordinates of the lane line may include the geographic coordinates (e.g., longitude and latitude coordinates) of the plurality of points.
In an example, the geographic coordinates of the lane line may be shifted longitude and latitude coordinates. Generally, in an electronic map, the longitude and latitude coordinates are shifted in order to encrypt geographic positions and improve the safety of using the map. In addition, a shift may also occur when longitude and latitude coordinates on the earth (coordinates in a global geodetic coordinate system) are converted into coordinates in a planar electronic map.
In one possible implementation, the target object may include a vehicle (e.g., an unmanned vehicle) traveling on a road, a robot or other device capable of traveling, or the like. The target object can travel in a lane and perform operations such as driving straight, turning, or reversing according to the lane and/or driving instructions. Further, the image acquisition device (e.g., the camera of a driving recorder) is disposed on the target object and may be used to capture images while the target object is driving; for example, the camera may be oriented in the travel direction of the target object and capture a target image of the scene directly in front of the target object.
In one possible implementation, first position data of a reference object (e.g., a lane line) in the electronic map and second position data of the reference object in the target image may be acquired, and the target object may be located using the first position data and the second position data. For example, the first position data and the second position data may be fused to locate the target object. In an example, the first position data may represent the position coordinates of lane lines near the target object; for example, the geographic coordinates of a plurality of lane lines near the target object may be located in the electronic map, and an approximate geographic position of the target object may be determined based on those coordinates (e.g., the target object is at the intersection of road A and road B). The lane in which the target object is located may then be determined from the second position data of the lane lines in the target image (e.g., one lane line on the right side of the target image and three on the left may indicate that the target object is in the first lane), thereby improving the positioning accuracy to the lane level.
In one possible implementation, a coarse position of the target object may be determined first, and then the coarse position may be corrected using the first position data and the second position data of the reference object to obtain a precise location of the target object, e.g., a lane-level location. Said locating the target object according to the first location data and the second location data comprises: acquiring first positioning data of the target object; and correcting the first positioning data according to the first position data and the second position data to obtain second positioning data, wherein the second positioning data is used for representing a positioning result of the target object.
In a possible implementation manner, the first positioning data of the target object includes satellite positioning data, for example, GPS positioning data or BeiDou positioning data; the present disclosure does not limit the first positioning data. The first positioning data may determine the geographic position of the target object, but its accuracy may not be high enough; for example, its error may reach about 10 m. In an example, the road on which the target object is located may be queried using the first positioning data, but the first positioning data may not accurately determine in which lane of the road the target object is located. For example, the first positioning data may determine that the target object is on a road segment composed of three lanes and may indicate that the target object is in the second lane, but there may be an error, i.e., the lane where the target object actually is may not coincide with the lane determined by the first positioning data.
In a possible implementation manner, the first positioning data may be corrected using the first position data and the second position data to obtain second positioning data with higher accuracy. For example, the first position data and the second position data describe the same lane line: the first position data is the position of the lane line in the electronic map, and the second position data is the position of the lane line in the target image. Since both represent the position of the same lane line, a deviation between the positions they represent indicates that the positioning of the target object is inaccurate, and the first positioning data of the target object can be corrected using this deviation.
In an example, the first position data is position data queried in the electronic map according to the first positioning data of the target object; for example, if the first positioning data is east longitude X1 degrees and north latitude X2 degrees, the first position data is the position data of a lane line near that longitude and latitude. The second position data is the position data of the lane line in the target image captured by the image acquisition device disposed on the target object. Since the first position data and the second position data are position data of the same lane line, and the second position data comes from a captured image of the actual scene, the position deviation between the first position data and the second position data is the same as the deviation between the first positioning data and the actual position of the target object. Therefore, the first positioning data can be corrected using the position deviation between the first position data and the second position data to obtain the second positioning data, that is, positioning data with higher positioning accuracy.
In this way, the deviation between the first positioning data and the actual position can be represented by the deviation between the first position data of the reference object in the electronic map and the second position data of the reference object in the target image, so that the error of the first positioning data is corrected and the precision of the first positioning data can be improved.
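As a minimal illustration of this correction, the sketch below assumes that the first position data and the second position data are matched two-dimensional point clouds of the same lane line in the target coordinate system, and that the deviation is estimated as a simple mean offset; the function and variable names are illustrative assumptions, not the patent's prescribed method.

```python
import numpy as np

def correct_positioning(first_positioning, first_position_data, second_position_data):
    """Correct coarse positioning data using the lane-line deviation.

    first_positioning: (2,) coarse position of the target object (e.g., from GPS).
    first_position_data: (N, 2) lane-line points queried from the electronic map.
    second_position_data: (N, 2) the same lane-line points observed in the target
        image, both expressed in the target coordinate system.
    """
    # The offset between the map-based and image-based lane line approximates
    # the error between the first positioning data and the actual position.
    deviation = first_position_data.mean(axis=0) - second_position_data.mean(axis=0)
    # Removing that deviation yields the second (refined) positioning data.
    return first_positioning - deviation
```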
In one possible implementation, step S11 may include: acquiring the geographic coordinates of the reference object in the electronic map; and performing coordinate transformation processing on the geographic coordinates of the reference object to obtain first position data of the reference object in a target coordinate system in the electronic map, wherein the target coordinate system is a coordinate system established according to the target object.
In one possible implementation, the electronic map may include the geographic coordinates of a reference object (e.g., a lane line), and the geographic coordinate information of the lane line may be queried from the electronic map. In an example, when the electronic map starts to be used (for example, when the vehicle, i.e., the target object, starts up, or when a navigation or positioning operation is started) or when the electronic map is updated, the geographic coordinate information of the lane line may be queried and stored in a cache or on a hard disk. When the electronic map stops being used (for example, when the vehicle is turned off, or the navigation or positioning operation ends), the geographic coordinates of the lane line may be deleted from the cache or hard disk, so that the occupied storage space is released when the map is not in use and can be used to store other data, improving the utilization efficiency of storage resources. Alternatively, the geographic coordinate information may be retained so that it need not be queried every time a navigation or positioning operation is started, which simplifies operation; the information may then be queried again when the electronic map is updated, so as to refresh the geographic coordinates in the cache or on the hard disk. In an example, the geographic coordinates of the reference object may be queried according to the first positioning data; for example, the geographic coordinates of lane lines near the coordinates given by the first positioning data may be queried using its longitude and latitude.
In one possible implementation, a target coordinate system may be established based on the target object. For example, the target coordinate system may be established with the position of the target object determined by the first positioning data as the origin, the traveling direction of the target object as one coordinate axis direction, and the direction orthogonal to the traveling direction as the other coordinate axis direction.
In one possible implementation manner, coordinate transformation may be performed on the geographic coordinate information of the lane line, that is, coordinate transformation processing from the geographic coordinate system to the target coordinate system is performed, and first position data of the lane line in the electronic map in the target coordinate system is obtained. For example, the longitude and latitude coordinates of each point of the lane line may be subjected to coordinate transformation from the geographic coordinate system to the target coordinate system, and the coordinates of each point in the target coordinate system are obtained, that is, the first position data of the lane line in the electronic map in the target coordinate system may be obtained.
In an example, a coordinate transformation matrix between the geographic coordinate system and the target coordinate system may be first determined, e.g., the target coordinate system is a coordinate system having an origin at the position of the target object determined by the first positioning data, and a relative positional relationship of the target coordinate system and the geographic coordinate system may be determined. In addition, the target coordinate system takes the traveling direction of the target object as the coordinate axis direction, and the relative angle between the target coordinate system and the geographic coordinate system can be determined. Further, the coordinate transformation matrix between the geographic coordinate system and the target coordinate system can be determined through the relative position relationship between the target coordinate system and the geographic coordinate system and the relative angle between the target coordinate system and the geographic coordinate system. Therefore, the longitude and latitude coordinates of each point of the lane line in the electronic map can be subjected to coordinate transformation processing according to the coordinate transformation matrix, and the first position data can be obtained. In an example, the first location data may include point cloud data, for example, the location of the lane line in the electronic map may be represented by geographic coordinates of a plurality of points (e.g., point clouds) on the lane line, and after the coordinate transformation process is performed, the geographic coordinates of the plurality of points on the lane line may be converted into coordinates of the plurality of points in the target coordinate system, and the point cloud data including the coordinates of the plurality of points, that is, the first location data, may be obtained.
In one possible implementation, since the first positioning data may have an error and the geographic coordinates of the lane line are queried according to the first positioning data, the first position data of the lane line may carry the same error; that is, the error of the first position data of the lane line matches the error of the first positioning data. Therefore, by determining the error of the first position data of the lane line, the error of the first positioning data can be corrected.
In this way, the geographic coordinates of the reference object can be transformed into the target coordinate system, so that the positioning error can be conveniently determined in the target coordinate system, and the correction of the first positioning data is facilitated.
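As an illustration of such a transformation, the sketch below converts lane-line longitude and latitude coordinates into the target coordinate system using a local planar approximation around the position given by the first positioning data; the approximation and all names are assumptions, since the patent only requires a coordinate transformation matrix built from the relative position and angle of the two coordinate systems.

```python
import numpy as np

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius

def geographic_to_target(lane_lon_lat, origin_lon_lat, heading_rad):
    """Transform lane-line (longitude, latitude) points into the target frame.

    The target coordinate system has its origin at the position given by the
    first positioning data and its x axis along the travel direction. The
    local equirectangular approximation used here is an assumption made for
    illustration.
    """
    lon0, lat0 = np.radians(origin_lon_lat)
    lon, lat = np.radians(np.asarray(lane_lon_lat)).T
    # Local planar offsets in metres (east, north) relative to the origin.
    east = (lon - lon0) * np.cos(lat0) * EARTH_RADIUS_M
    north = (lat - lat0) * EARTH_RADIUS_M
    # Rotate so that the x axis points along the travel direction
    # (heading measured counter-clockwise from east).
    c, s = np.cos(heading_rad), np.sin(heading_rad)
    rot = np.array([[c, s], [-s, c]])
    return np.stack([east, north], axis=1) @ rot.T
```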
In one possible implementation, a target object (e.g., a vehicle or a robot, etc.) may be traveling and a target image directly in front of the target object is captured by a camera of a tachograph, a camera, or the like, and a reference object (e.g., a lane line) may be included in the target image.
In one possible implementation, step S12 may include: determining image coordinate information of the lane line in the target image; and carrying out coordinate transformation processing on the image coordinate information to obtain second position data of the lane line in the target image in the target coordinate system.
Fig. 3 is a schematic diagram of a target image according to an embodiment of the present disclosure. As shown in Fig. 3, the target image is captured by an image acquisition device disposed on the target object; the third lane from the left is directly in front of the target object, and the target image may include a plurality of lane lines.
In a possible implementation manner, the target image may be subjected to semantic segmentation processing to obtain the position coordinates of the reference object in the target image. The position of the lane line in the target image may be determined by pixel detection, edge detection, and the like, or may be segmented by a neural network or the like; for example, the target image is used as the input of the neural network, and the position coordinates of the lane line in the target image are output through the semantic segmentation processing of the neural network. In an example, the position of the lane line detected by the neural network may include a plurality of pixel points (e.g., a point cloud), and the position coordinates of each pixel point on the lane line in the target image may be detected.
In one possible implementation manner, the coordinate transformation processing may be performed on the position coordinates of the lane line in the target image, and second position data of the lane line in the target image in the target coordinate system is obtained. For example, the coordinates of each pixel point of the lane line in the target image may be subjected to coordinate transformation from the image coordinates to the target coordinate system, and the coordinates of each pixel point in the target coordinate system may be obtained, that is, the second position data of the lane line in the target image in the target coordinate system may be obtained. In an example, the second position data may include point cloud data, for example, the position of the lane line in the target image may be represented by position coordinates of a plurality of points (e.g., point clouds) on the lane line, and after the coordinate transformation process is performed, the position coordinates of the plurality of points on the lane line in the target image may be converted into coordinates of the plurality of points in the target coordinate system, and the point cloud data including the coordinates of the plurality of points, that is, the second position data, may be obtained.
In this way, the position coordinates of the reference object in the target image can be converted into the target coordinate system, and the second position data can be obtained, so that the position coordinates of the reference object in the target image and the coordinate system of the geographic coordinates of the reference object in the electronic map can be unified, the error between the first position data and the second position data can be conveniently obtained, and the first positioning data can be favorably corrected.
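For illustration, the short sketch below turns a segmentation output into lane-line pixel coordinates; the segmentation network itself and the class-id convention are assumptions.

```python
import numpy as np

def lane_pixel_coords(segmentation_mask, lane_class_id=1):
    """Collect the (u, v) pixel coordinates labelled as lane line.

    segmentation_mask: (H, W) integer class map produced by a semantic
    segmentation network (the network itself is not specified here);
    lane_class_id is an assumed class convention.
    Returns an (N, 2) array in (u, v) = (column, row) order.
    """
    rows, cols = np.nonzero(segmentation_mask == lane_class_id)
    return np.stack([cols, rows], axis=1)
```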
In a possible implementation manner, the performing coordinate transformation processing on the position coordinates to obtain second position data of the reference object in the target coordinate system in the target image includes: determining a homography matrix according to the position coordinates of at least part of pixel points in the first image and the position coordinates of points corresponding to the at least part of pixel points in the target coordinate system; and performing coordinate transformation processing on the position coordinates of the reference object in the target image according to the homography matrix to obtain the second position data.
In one possible implementation, a homography matrix between the target image and the target coordinate system, that is, a coordinate transformation matrix between the position coordinates of the pixel points in the target image and the position coordinates of the corresponding points in the target coordinate system, may be first determined, and may be used to transform the position coordinates of the pixel points in the target image into the target coordinate system.
In a possible implementation manner, a homography matrix may be determined according to the position coordinates of at least part of the pixel points in the first image and the position coordinates of the points corresponding to the at least part of the pixel points in the target coordinate system, that is, each pixel point of the lane line in the target image may be converted into a coordinate in the target coordinate system through the homography matrix, and then second position data of the lane line in the target image in the target coordinate system may be obtained.
In a possible implementation manner, the homography matrix may be determined from a first image acquired by the image acquisition device disposed on the target object, where the first image may be any image captured by that device. At least some pixel points in the first image may be labeled, for example, by labeling the position coordinates of the pixel points in the first image and determining the position coordinates of the corresponding points in the target coordinate system. According to the position coordinates of the pixel points in the image and their position coordinates in the target coordinate system, a homography matrix H corresponding to the target object can be determined, that is, a transformation matrix that transforms position coordinates in the image into position coordinates in the target coordinate system. The homography matrix H can be used to convert the position coordinates of an arbitrary point in a captured image into position coordinates in the target coordinate system.
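As one possible realization of this step, the sketch below estimates H from hand-labelled correspondences using OpenCV's findHomography; the library choice and the placeholder coordinates are assumptions, since the patent does not name a tool.

```python
import numpy as np
import cv2  # OpenCV; the library choice is an assumption, not from the patent

# Hand-labelled correspondences from one "first image": pixel coordinates
# (u, v) and the measured coordinates (x, y) of the same points in the
# target coordinate system. All values below are illustrative placeholders.
pixel_pts = np.array([[320., 700.], [420., 700.], [300., 500.], [440., 500.]])
target_pts = np.array([[-1.8, 5.0], [1.8, 5.0], [-1.8, 15.0], [1.8, 15.0]])

# Four or more correspondences determine the 3x3 homography matrix H.
H, _ = cv2.findHomography(pixel_pts, target_pts)
```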
In a possible implementation manner, the second position data may be obtained by performing coordinate transformation processing on the position coordinates of the reference object in the target image according to the homography matrix. In an example, the position coordinates of each pixel point of the lane line in the target image may be multiplied by the homography matrix to obtain the position coordinates of that pixel point in the target coordinate system. For example, the coordinates of each pixel point in the target coordinate system may be determined according to the following formula (1):

$$s\begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = H\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \tag{1}$$

wherein (u, v) is the position coordinate of any pixel point of the lane line in the target image, (x, y) is the position coordinate of that pixel point in the target coordinate system, and s is the homogeneous scale factor. The coordinates of each pixel point of the lane line in the target image can be subjected to coordinate transformation processing by formula (1) to obtain the second position data.
In this way, the homography matrix can be obtained through the first image to obtain the corresponding relation between the target image and the target coordinate system, the position coordinates of the pixel points in the target image can be conveniently transformed into the position coordinates in the target coordinate system, and the coordinate transformation efficiency is improved.
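A minimal sketch of applying formula (1) to all lane-line pixels at once, assuming a homography H estimated as above; the division by the homogeneous scale is the standard normalization.

```python
import numpy as np

def pixels_to_target(pixel_pts, H):
    """Apply formula (1): map lane-line pixels (u, v) to target-frame (x, y).

    pixel_pts: (N, 2) array of (u, v) coordinates; H: 3x3 homography matrix.
    """
    uv1 = np.hstack([pixel_pts, np.ones((len(pixel_pts), 1))])  # homogeneous
    xyw = uv1 @ H.T
    # Divide by the homogeneous scale s to recover (x, y), as in formula (1).
    return xyw[:, :2] / xyw[:, 2:3]
```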
In a possible implementation manner, the second position data is position information determined from a real image captured at the target object, so it can be regarded as position information with higher accuracy. The error between the first position data and the second position data can therefore be taken as the error between the first positioning data and the actual position of the target object, and this error can be used to determine a position correction parameter for correcting the first positioning data.
In one possible implementation, the execution sequence of the step S11 of acquiring the first position data and the step S12 of acquiring the second position data is not limited. The case where step S11 is performed first and then step S12 is performed in fig. 1 is merely an example. Alternatively, step S12 may be executed first, and then step S11 may be executed; step S11 and step S12 may also be performed simultaneously.
In one possible implementation, step S13 may include: determining a position correction parameter according to the position relation of the first position data and the second position data of the reference object in a target coordinate system, wherein the target coordinate system is a coordinate system established according to the target object; and correcting the first positioning data according to the position correction parameter to obtain second positioning data, wherein the second positioning data is used for representing a positioning result of the target object. Wherein the first positioning data and the second positioning data comprise geographic coordinates of the reference object in a geographic coordinate system.
In this way, the position correction parameters may be determined using the first position data and the second position data of the reference object, i.e., the error between the first positioning data and the actual geographic position may be obtained, which may be used to correct the first positioning data to improve the positioning accuracy.
In one possible implementation, the position correction parameters include a rotation matrix; the determining a position correction parameter according to a position relationship of the first position data and the second position data of the reference object in a target coordinate system includes: acquiring a rotation angle between the first position data and the second position data; and determining the rotation matrix according to the rotation angle.
In one possible implementation, the first location data and the second location data comprise point cloud data. For example, the first position data and the second position data may each be a line composed of a plurality of coordinate points, the first position data and the second position data may be regarded as a vector, and an angle between the first position data and the second position data, that is, the rotation angle, may be determined.
In a possible implementation manner, the rotation matrix may be determined according to the rotation angle. Multiplying the vector corresponding to the second position data by the rotation matrix rotates that vector so that the rotated vector is parallel to the first position data; alternatively, multiplying the vector corresponding to the first position data by the rotation matrix rotates that vector so that it is parallel to the second position data. The rotation matrix may be determined according to the following formula (2):

$$R = \begin{bmatrix} \cos\alpha & -\sin\alpha \\ \sin\alpha & \cos\alpha \end{bmatrix} \tag{2}$$

where R is the rotation matrix and α is the rotation angle. The vector l2 corresponding to the second position data can be multiplied by the rotation matrix R to perform the rotation processing, so that the rotated vector is parallel to l1, where l2 is the vector corresponding to the second position data and l1 is the vector corresponding to the first position data.
In a possible implementation manner, the rotation matrix may be determined step by using an approximation algorithm or the like, so as to determine the rotation angle. In an example, the second location data may be passed through a vector l that corresponds to the second location data2Multiplication by a rotation matrix corresponding to an angle such that l2Rotate and make l1And l2If l is reduced2After rotation with l1Non-parallel, then rotated2Multiplication continues with the rotation matrix corresponding to an angle such that l2Continue to rotate until l2And l1Parallel, e.g. at the beginning of the approximation algorithm, the angle may be chosen to be larger, with l2And l1Is reduced, said angle may be selected to be a smaller angle, e.g./, of2And l1Is 52 deg., the angle may be chosen to be 20 deg. at the beginning of the approximation algorithm, the angle may be reduced to 12 deg. after two iterations of the approximation algorithm, the angle may be chosen to be 5 deg., the angle may be reduced to 12 deg. after two further iterations of the approximation algorithm2 deg., said angle can be chosen to be 1 deg., after the approximation algorithm is executed twice again, l2And l1Parallel. Further, can be based on2All matrices rotated, the rotation matrix being obtained, e.g. l may be made2And multiplying all the rotated matrixes to obtain the rotation matrix. And the rotation angle may be determined from the values of the elements of the rotation matrix, e.g. by an inverse trigonometric function. The present disclosure does not limit the method of obtaining the rotation angle and the rotation matrix.
In one possible implementation, the rotation angle may be used to determine the traveling direction of the target object (e.g., a vehicle). For example, the vehicle may travel in the lane along the direction of the lane line, but only the direction of the lane line in the image can be determined from the target image, not its direction in the geographic coordinate system. Also, the first positioning data may only provide the position of the vehicle in the geographic coordinate system, and it may contain an error, so the actual traveling direction of the vehicle may not coincide with the coordinate axis direction of the target coordinate system. The angle error between the actual driving direction and the coordinate axis direction can be corrected through the rotation angle: the angle error between the first position data and the second position data can be taken as the angle error between the actual driving direction and the coordinate axis direction, and the actual driving direction of the vehicle in the geographic coordinate system can be obtained by correcting with the rotation angle.
In this way, the rotation angle and the rotation matrix for correcting the angular deviation can be obtained from the first position data and the second position data, the direction of the first positioning data can be corrected, the positioning accuracy is improved, and the traveling direction and angle of the target object can be obtained.
In one possible implementation, the position correction parameters include a translation component, the translation component includes a first translation component and a second translation component, the first translation component is perpendicular to the second translation component, and the reference object corresponding to the target position data is parallel to the first translation component; determining a position correction parameter according to the first position data and the second position data of the target object includes: acquiring a rotation angle between the first position data and the second position data; and determining the first translation component and the second translation component according to the rotation angle and the position data to be adjusted, wherein the target position data is the first position data or the second position data, and the position data to be adjusted is the position data other than the target position data among the first position data and the second position data.
In one possible implementation, the error between the first position data and the second position data may include, in addition to the angular error, a position error. That is, after the vector corresponding to the second position data is rotated to be parallel to the vector corresponding to the first position data, the two vectors may still not coincide, i.e., there is a position error. The position error may be corrected by translation, that is, by a first translation component parallel to the lane line corresponding to the position data to be adjusted and a second translation component perpendicular to that lane line. The position data to be adjusted may be the first position data or the second position data. When the position data to be adjusted is the first position data, the vector corresponding to the rotated first position data (parallel to the vector corresponding to the second position data) may be translated by the first translation component and the second translation component so as to coincide with the vector corresponding to the second position data. When the position data to be adjusted is the second position data, the vector corresponding to the rotated second position data (parallel to the vector corresponding to the first position data) may be translated by the first translation component and the second translation component so as to coincide with the vector corresponding to the first position data.
In one possible implementation, the first translation component may be determined as a function of the rotation angle α and the position data to be adjusted. In an example, the second position data is the position of a lane line in the target image in the target coordinate system; an angle ε between the second position data and the driving direction (i.e., the direction of the coordinate axis, which is the same as the driving direction) may be determined according to the vector corresponding to the second position data, and the first translation component may be determined according to the rotation angle α and the angle ε. For example, the first translation component may be determined according to the following formula (3):
ve = (cos(α+ε), sin(α+ε))  (3)
where ve is the first translation component. In an example, if the target object is traveling along a lane line, ε may be approximately 0.
In one possible implementation, the second translational component pe may be determined from the first translational component ve, and in an example, a vector orthogonal to ve may be chosen as the second translational component pe.
In one possible implementation, the first translation component and the second translation component are unit vectors, that is, vectors of length 1. This makes it convenient to specify the adjustment amplitude of the position data to be adjusted, for example, an adjusted distance of 3 in the direction of the first translation component and an adjusted distance of 2 in the direction of the second translation component; the present disclosure does not limit the adjustment amplitude.
In this way, the first translation component and the second translation component can be stored conveniently, while the adjustment amplitude of the position data to be adjusted is expressed directly through the unit vectors.
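A minimal sketch of how unit translation components turn amplitudes into distances; ve, pe, the amplitudes 3 and 2 (echoing the example above), and the sample point are all assumed values:

```python
import numpy as np

ve = np.array([1.0, 0.0])      # unit component parallel to the lane line (assumed)
pe = np.array([0.0, -1.0])     # unit component perpendicular to the lane line (assumed)

d1, d2 = 3.0, 2.0              # adjustment amplitudes from the example above
point = np.array([10.0, 5.0])  # a point of the position data to be adjusted (made up)

# Because ve and pe have length 1, d1 and d2 are plain distances along each direction.
adjusted = point + d1 * ve + d2 * pe
```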
In a possible implementation manner, if the lane line is a straight line, movement in the direction parallel to the lane line corresponding to the target position data is meaningless; that is, after the rotation and the movement in the direction perpendicular to the lane line corresponding to the target position data, the lane line corresponding to the target position data and the lane line corresponding to the position data to be adjusted can be made to coincide without moving in the direction parallel to the lane line corresponding to the target position data. Therefore, the adjustment amplitude in the direction of the first translation component can be set to 0. If the lane line is a curve, movement in the direction parallel to the lane line corresponding to the target position data is meaningful.
In one possible implementation, after the position correction parameters are determined, only the rotation angle α, the first translation component and the second translation component may be stored, to save storage space.
Fig. 4 is a schematic diagram of the position correction parameters according to an embodiment of the present disclosure. As shown in Fig. 4, the position data to be adjusted is the second position data, l2 is the vector corresponding to the second position data, and l1 is the vector corresponding to the first position data. Through the rotation matrix R corresponding to the rotation angle α, l2 can be rotated to be parallel to l1, obtaining R·l2. Then R·l2 can be adjusted through the first translation component ve and the second translation component pe, so that after the position adjustment R·l2 coincides with l1.
In one possible implementation, the angle and position of l2 may also be adjusted step by step by an approximation algorithm or the like, so that the adjusted l2 coincides with l1, thereby determining the position correction parameters. In an example, the position data to be adjusted is the second position data; the vector l2 corresponding to the second position data may be multiplied by a rotation matrix corresponding to some angle, so that l2 rotates and the angle between l1 and l2 decreases; if the rotated l2 is not parallel to l1, the rotated l2 continues to be multiplied by a rotation matrix corresponding to some angle, so that l2 continues to rotate until l2 and l1 are parallel. Further, the rotation matrix may be obtained from all the matrices by which l2 was rotated, e.g., by multiplying together all the matrices by which l2 was rotated, and the rotation angle may be determined from the values of the elements of the rotation matrix, e.g., by an inverse trigonometric function. After the rotation angle is obtained, the first translation component ve and the second translation component pe may be determined from the rotation angle and the second position data. The present disclosure does not limit the method of determining the position correction parameters.
In one possible implementation, the first translational component ve and the second translational component pe may be determined simultaneously, for example, the first translational component ve may be determined by formula (3), and the second translational component pe may be directly determined by formula (4):
pe = (sin(α+ε), -cos(α+ε))  (4)
In this way, the first translation component and the second translation component can be obtained from the rotation angle, the position error of the first positioning data can be corrected, and the positioning accuracy is improved.
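Formulas (3) and (4) can be sketched directly; the values of α and ε below are assumptions for illustration:

```python
import numpy as np

alpha = np.deg2rad(3.0)  # rotation angle between the position data (assumed value)
eps = 0.0                # angle to the driving direction; ~0 when driving along the lane

ve = np.array([np.cos(alpha + eps), np.sin(alpha + eps)])    # formula (3)
pe = np.array([np.sin(alpha + eps), -np.cos(alpha + eps)])   # formula (4)

# ve and pe are orthogonal unit vectors, as required of the translation components.
assert abs(ve @ pe) < 1e-12
assert abs(np.linalg.norm(ve) - 1.0) < 1e-12 and abs(np.linalg.norm(pe) - 1.0) < 1e-12
```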
In one possible implementation, the first positioning data may be corrected according to the position correction parameters, i.e., the error of the first positioning data is corrected. For example, the direction error of the first positioning data may be corrected according to the rotation angle in the position correction parameters, and the position error may be corrected according to the first translation component and the second translation component, thereby obtaining more accurate second positioning data. The second positioning data has higher precision, so the lane in which the target object is located can be accurately identified.
In an example, the position correction parameters may be used to correct an error in the lane in which the target object is located. For example, on a section of road composed of three lanes, the vehicle may be determined to be in the second lane according to the first positioning data; after the geographic coordinate information of the three lane lines is converted into the target coordinate system, the second lane line is located on the left side of the origin and the third lane line on the right side of the origin. However, according to the target image acquired by the vehicle, the vehicle can be determined to be in the third lane, and after the image coordinate information of the second lane line and the third lane line is converted into the target coordinate system, both the second lane line and the third lane line are located on the left side of the origin. Errors thus exist between the first position data and the second position data of the second lane line and the third lane line, and the position correction parameters, such as the rotation matrix, the first translation component and the second translation component, may be determined based on these errors. Further, when the error of the first positioning data is corrected according to the position correction parameters, the error in the lane in which the vehicle is located can be corrected, for example, in the target coordinate system, the second lane line and the third lane line are both adjusted to the right of the origin. The present disclosure does not limit the lane in which the target object is located and the correction method.
According to the positioning method of the embodiments of the present disclosure, the position correction parameters can be determined using the first position data and the second position data, and the error of the first positioning data can be corrected using the position correction parameters to obtain the second positioning data, improving the positioning precision of the target object. Moreover, the position correction parameters may include a rotation angle or rotation matrix, with which the direction error can be corrected and the traveling direction and angle of the target object obtained, and may include a first translation component and a second translation component, with which the position error can be corrected. Furthermore, the second position data is obtained through the target image acquired at the target object, so the use cost is low and the method is convenient to use and popularize.
Fig. 5 is a schematic diagram illustrating an application of the positioning method according to the embodiment of the disclosure, as shown in fig. 5, the high-precision map may include information of a reference object, the reference object includes a lane line, the first positioning data includes satellite positioning data, and the image capturing device is disposed on the target object. The geographical coordinate information of the lane lines in the high-precision map can be queried. And establishes a target coordinate system based on the first positioning data (e.g., GPS positioning information). The establishment of the target coordinate system can be referred to the above description, and is not described herein again.
In one possible implementation, the geographical coordinates of the lane line may be queried in the high-precision map, and coordinate transformation processing may be performed on a plurality of points (e.g., a point cloud) of the lane line to obtain a point cloud Q = {q1, q2, …, qm} in the target coordinate system, where m is a positive integer, i.e., the first position data of the lane line in the target coordinate system. For the coordinate transformation processing, refer to the foregoing description, which is not repeated here.
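One plausible form of this map-to-target transformation is sketched below, assuming a planar geographic frame and a target frame centered at the vehicle's first positioning data; the function name, the heading convention, and all sample coordinates are assumptions:

```python
import numpy as np

def geo_to_target(points_geo: np.ndarray, origin_geo: np.ndarray,
                  heading: float) -> np.ndarray:
    """Map lane-line points from a planar geographic frame into the target frame.

    The target frame is assumed to be centered at the vehicle's first positioning
    data, with axes rotated by the vehicle heading (radians).
    """
    c, s = np.cos(heading), np.sin(heading)
    R = np.array([[c, s], [-s, c]])   # world-to-vehicle rotation
    return (points_geo - origin_geo) @ R.T

# Hypothetical map points of a lane line and a GPS fix:
Q = geo_to_target(np.array([[100.0, 200.0], [100.0, 210.0]]),
                  origin_geo=np.array([101.5, 205.0]),
                  heading=0.0)        # point cloud {q1, ..., qm}
```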
In a possible implementation manner, the target image may be an image of the area in front of the vehicle captured by the image capture device during driving, and may include the lane lines on the road. Semantic segmentation processing may be performed through a neural network to obtain the position coordinates of the lane line in the target image; the position where the lane line is located may include a plurality of pixel points (e.g., a point cloud), the coordinates of each pixel point on the lane line may be detected, and coordinate transformation processing may be performed on the plurality of pixel points through a homography matrix between the target image and the target coordinate system to obtain a point cloud P = {p1, p2, …, pn} in the target coordinate system, where n is a positive integer, i.e., the second position data of the lane line in the target coordinate system. For the coordinate transformation processing, refer to the foregoing description, which is not repeated here.
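The homography step can be sketched with plain numpy; the matrix H below is an assumed placeholder for the calibrated homography described earlier, and the pixel coordinates are sample values:

```python
import numpy as np

def pixels_to_target(pixels: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Apply a 3x3 homography H to (n, 2) pixel coordinates of the lane line."""
    pts = np.hstack([pixels, np.ones((len(pixels), 1))])  # homogeneous coordinates
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]                 # divide out the scale term

H = np.array([[0.02,  0.00, -6.4],    # assumed homography from a prior calibration
              [0.00, -0.05, 24.0],
              [0.00,  0.00,  1.0]])
P = pixels_to_target(np.array([[320.0, 400.0], [322.0, 380.0]]), H)  # {p1, ..., pn}
```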
In one possible implementation, the point cloud Q = {q1, q2, …, qm} and the point cloud P = {p1, p2, …, pn} may be matched, i.e., the lane lines made up of corresponding points in the first position data and the second position data are determined, e.g., the points qi-qj correspond to the points pk-pl (i and j are positive integers less than m; k and l are positive integers less than n). qi-qj are a plurality of points in a lane line, pk-pl are a plurality of points in the corresponding lane line, and the position correction parameters are determined based on the corresponding lane lines.
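As a hedged sketch of how the position correction parameters might be estimated from the matched point clouds (the PCA-based line fitting is one possible choice, not necessarily the disclosed one):

```python
import numpy as np

def principal_direction(points: np.ndarray) -> np.ndarray:
    """Unit direction of a roughly linear point cloud via PCA (one possible choice)."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    d = vt[0]
    return d if d[1] >= 0 else -d   # orient "forward" consistently (an assumption)

def correction_parameters(Q: np.ndarray, P: np.ndarray):
    """Rotation angle/matrix and residual offset that move cloud P onto cloud Q."""
    dq, dp = principal_direction(Q), principal_direction(P)
    alpha = np.arctan2(dp[0] * dq[1] - dp[1] * dq[0], dp.dot(dq))
    R = np.array([[np.cos(alpha), -np.sin(alpha)],
                  [np.sin(alpha),  np.cos(alpha)]])
    t = Q.mean(axis=0) - (P @ R.T).mean(axis=0)   # offset left after the rotation
    return alpha, R, t
```

The residual offset t can then be decomposed along the unit components ve and pe to obtain the two translation amplitudes.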
In one possible implementation, the position correction parameters may include the rotation angle α, i.e., the vector l2 corresponding to the second position data may be rotated so that l2 is parallel to the vector l1 corresponding to the first position data, the rotation angle being α.
In one possible implementation, the position correction parameters may include the first translation component and the second translation component. Translation processing may be performed on the rotated l2, i.e., R·l2: for example, R·l2 may be translated in the direction parallel to l1, the first translation component of that direction being ve, and R·l2 may be translated in the direction perpendicular to l1, the second translation component of that direction being pe. After the translation processing, the translated R·l2 coincides with l1; that is, the position error is corrected through the first translation component and the second translation component.
In one possible implementation, the rotation matrix R, the first translational component ve and the second translational component pe may be used as the position correction parameters. The first positioning data can be corrected through the position correction parameters so as to correct the error of the first positioning data and improve the positioning accuracy. Through correction processing, second positioning data with high precision can be obtained, and the lane where the vehicle is located can be accurately identified through the second positioning data.
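Putting the pieces together, applying the stored parameters to the first positioning data might look like the following sketch; the function and its signature are assumptions, not the disclosed implementation:

```python
import numpy as np

def correct_positioning(pos: np.ndarray, R: np.ndarray,
                        ve: np.ndarray, pe: np.ndarray,
                        d1: float, d2: float) -> np.ndarray:
    """Apply the stored correction parameters to a point of the first positioning data.

    pos: 2D point in the target coordinate system; R: rotation matrix;
    ve, pe: unit translation components; d1, d2: adjustment amplitudes.
    """
    return R @ pos + d1 * ve + d2 * pe
```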
In one possible implementation, the method further includes: and determining the relative position relationship between the target object and the reference object according to the second positioning data of the target object. The relative positional relationship between the target object and the reference object may include at least one of a lane in which the target object is located, a distance between the target object and a lane line, and an angle between the target object and the lane line.
In an example, the positioning method may be used in the field of autonomous driving to determine the accurate position of a vehicle. It can be used in a driving assistance system to determine the lane in which the vehicle is located and the distance between the vehicle and a lane line, or in the navigation of unmanned devices or robots to determine the precise position of the device and its attitude (e.g., the angle of the device relative to a lane line). The present disclosure does not limit the application field of the positioning method.
Fig. 6 is a block diagram of a positioning device according to an embodiment of the present disclosure. As shown in Fig. 6, the device includes:
the first acquisition module 11 is used for acquiring first position data of the reference object according to the electronic map;
the second obtaining module 12 is configured to obtain second position data of the reference object according to a target image captured by an image capturing device;
a positioning module 13, configured to position the target object according to the first position data and the second position data.
In one possible implementation, the location module is further configured to:
acquiring first positioning data of the target object;
and correcting the first positioning data according to the first position data and the second position data to obtain second positioning data, wherein the second positioning data is used for representing a positioning result of the target object.
In one possible implementation, the first positioning data and the second positioning data include geographic coordinates of the reference object in a geographic coordinate system;
the positioning module is further configured to:
determining a position correction parameter according to the position relation of the first position data and the second position data of the reference object in a target coordinate system, wherein the target coordinate system is a coordinate system established according to the target object;
and correcting the first positioning data according to the position correction parameter to obtain second positioning data, wherein the second positioning data is used for representing a positioning result of the target object.
In one possible implementation, the position correction parameters include a rotation matrix;
the positioning module is further configured to:
acquiring a rotation angle between the first position data and the second position data;
and determining the rotation matrix according to the rotation angle.
In one possible implementation, the position correction parameter includes a translation component, the translation component includes a first translation component and a second translation component, the first translation component is perpendicular to the second translation component, and the reference object corresponding to the target position data is parallel to the first translation component;
the positioning module is further configured to:
acquiring a rotation angle between the first position data and the second position data;
and determining the first translation component and the second translation component according to the rotation angle and the position data to be adjusted, wherein the target position data is the first position data or the second position data, and the position data to be adjusted is position data except the target position data in the first position data and the second position data.
In one possible implementation, the first translation component and the second translation component are unit vectors.
In one possible implementation, the first obtaining module is further configured to:
acquiring the geographic coordinates of the reference object in the electronic map;
and performing coordinate transformation processing on the geographic coordinates of the reference object to obtain first position data of the reference object in a target coordinate system in the electronic map, wherein the target coordinate system is a coordinate system established according to the target object.
In one possible implementation, the second obtaining module is further configured to:
performing semantic segmentation processing on the target image to obtain the position coordinates of the reference object in the target image;
and performing coordinate transformation processing on the position coordinates to obtain second position data of the reference object in the target image in a target coordinate system, wherein the target coordinate system is a coordinate system established according to the target object.
In one possible implementation, the second obtaining module is further configured to:
determining a homography matrix according to the position coordinates of at least part of pixel points in the first image and the position coordinates of points corresponding to the at least part of pixel points in the target coordinate system;
and performing coordinate transformation processing on the position coordinates of the reference object in the target image according to the homography matrix to obtain the second position data.
In one possible implementation, the first location data and the second location data comprise point cloud data.
In one possible implementation, the reference object includes a lane line, the first positioning data includes satellite positioning data, and the image capturing device is disposed on the target object.
In one possible implementation, the apparatus further includes:
and the relative position module is used for determining the relative position relationship between the target object and the reference object according to the second positioning data of the target object.
It is understood that the above-mentioned method embodiments of the present disclosure can be combined with each other to form combined embodiments without departing from the principle and logic; due to space limitations, the details are not described again in the present disclosure.
In addition, the present disclosure also provides a positioning apparatus, an electronic device, a computer-readable storage medium, and a program, all of which can be used to implement any of the positioning methods provided by the present disclosure; for the corresponding technical solutions, refer to the descriptions in the method section, which are not repeated here.
It will be understood by those skilled in the art that, in the methods of the present disclosure, the order in which the steps are written does not imply a strict execution order or any limitation on the implementation; the specific execution order of the steps should be determined by their functions and possible internal logic.
In some embodiments, the functions of, or the modules included in, the apparatus provided in the embodiments of the present disclosure may be used to execute the methods described in the above method embodiments; for specific implementations, reference may be made to the descriptions of the above method embodiments, which, for brevity, are not repeated here.
Embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the above-mentioned method. The computer readable storage medium may be a non-volatile computer readable storage medium.
An embodiment of the present disclosure further provides an electronic device, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to perform the above positioning method.
The electronic device may be provided as a terminal, server, or other form of device.
Fig. 7 is a block diagram illustrating an electronic device 800 according to an exemplary embodiment. For example, the electronic device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or another such terminal.
Referring to fig. 7, electronic device 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the electronic device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the electronic device 800 is in an operation mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the electronic device 800. For example, the sensor assembly 814 may detect an open/closed state of the electronic device 800, the relative positioning of components, such as a display and keypad of the electronic device 800, the sensor assembly 814 may also detect a change in the position of the electronic device 800 or a component of the electronic device 800, the presence or absence of user contact with the electronic device 800, orientation or acceleration/deceleration of the electronic device 800, and a change in the temperature of the electronic device 800. Sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium, such as the memory 804, is also provided that includes computer program instructions executable by the processor 820 of the electronic device 800 to perform the above-described methods.
Fig. 8 is a block diagram illustrating an electronic device 1900 in accordance with an example embodiment. For example, the electronic device 1900 may be provided as a server. Referring to fig. 8, electronic device 1900 includes a processing component 1922 further including one or more processors and memory resources, represented by memory 1932, for storing instructions, e.g., applications, executable by processing component 1922. The application programs stored in memory 1932 may include one or more modules that each correspond to a set of instructions. Further, the processing component 1922 is configured to execute instructions to perform the above-described method.
The electronic device 1900 may also include a power component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium, such as the memory 1932, is also provided that includes computer program instructions executable by the processing component 1922 of the electronic device 1900 to perform the above-described methods.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may include, for example, but is not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a field programmable gate array (FPGA), or a programmable logic array (PLA), can execute the computer-readable program instructions by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry, thereby implementing aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terms used herein were chosen in order to best explain the principles of the embodiments, the practical application, or technical improvements to the techniques in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
Claims (19)
1. A method of positioning, comprising:
acquiring first position data of a reference object according to an electronic map;
acquiring second position data of the reference object according to a target image shot by image acquisition equipment;
and positioning the target object according to the first position data and the second position data.
2. The method of claim 1, wherein said locating the target object based on the first location data and the second location data comprises:
acquiring first positioning data of the target object;
and correcting the first positioning data according to the first position data and the second position data to obtain second positioning data, wherein the second positioning data is used for representing a positioning result of the target object.
3. The method of claim 2, wherein the first positioning data and the second positioning data comprise geographic coordinates of the reference object in a geographic coordinate system;
said locating the target object according to the first location data and the second location data comprises:
determining a position correction parameter according to the position relation of the first position data and the second position data of the reference object in a target coordinate system, wherein the target coordinate system is a coordinate system established according to the target object;
and correcting the first positioning data according to the position correction parameter to obtain second positioning data, wherein the second positioning data is used for representing a positioning result of the target object.
4. The method of claim 3, wherein the position correction parameters include a rotation matrix;
the determining a position correction parameter according to a position relationship of the first position data and the second position data of the target object in a target coordinate system includes:
acquiring a rotation angle between the first position data and the second position data;
and determining the rotation matrix according to the rotation angle.
5. The method according to claim 3 or 4, wherein the position correction parameters include a translation component, the translation component includes a first translation component and a second translation component, the first translation component is perpendicular to the second translation component, and the reference object corresponding to the target position data is parallel to the first translation component;
determining a position correction parameter according to the first position data and the second position data of the target object, including:
acquiring a rotation angle between the first position data and the second position data;
and determining the first translation component and the second translation component according to the rotation angle and the position data to be adjusted, wherein the target position data is the first position data or the second position data, and the position data to be adjusted is position data except the target position data in the first position data and the second position data.
6. The method of claim 5, wherein the first translation component and the second translation component are unit vectors.
7. The method according to any one of claims 1-6, wherein said obtaining first position data of a reference object from an electronic map comprises:
acquiring the geographic coordinates of the reference object in the electronic map;
and performing coordinate transformation processing on the geographic coordinates of the reference object to obtain first position data of the reference object in a target coordinate system in the electronic map, wherein the target coordinate system is a coordinate system established according to the target object.
8. The method according to any one of claims 1-7, wherein the obtaining second position data of the reference object according to the target image captured by the image capture device comprises:
performing semantic segmentation processing on the target image to obtain the position coordinates of the reference object in the target image;
and performing coordinate transformation processing on the position coordinates to obtain second position data of the reference object in the target image in a target coordinate system, wherein the target coordinate system is a coordinate system established according to the target object.
9. The method according to claim 8, wherein the performing coordinate transformation processing on the position coordinates to obtain second position data of the reference object in the target coordinate system in the target image comprises:
determining a homography matrix according to the position coordinates of at least part of pixel points in the first image and the position coordinates of points corresponding to the at least part of pixel points in the target coordinate system;
and performing coordinate transformation processing on the position coordinates of the reference object in the target image according to the homography matrix to obtain the second position data.
10. The method of any of claims 1-9, wherein the first location data and the second location data comprise point cloud data.
11. The method according to any of claims 1-10, wherein the reference object comprises a lane line, the first positioning data comprises satellite positioning data, and the image acquisition device is disposed on the target object.
12. The method according to any one of claims 1-11, further comprising:
and determining the relative position relationship between the target object and the reference object according to the second positioning data of the target object.
13. A positioning device, comprising:
the first acquisition module is used for acquiring first position data of the reference object according to the electronic map;
the second acquisition module is used for acquiring second position data of the reference object according to a target image shot by the image acquisition equipment;
and the positioning module is used for positioning the target object according to the first position data and the second position data.
14. The apparatus of claim 13, wherein the positioning module is further configured to:
acquiring first positioning data of the target object;
and correcting the first positioning data according to the first position data and the second position data to obtain second positioning data, wherein the second positioning data is used for representing a positioning result of the target object.
15. The apparatus of claim 14, wherein the first positioning data and the second positioning data comprise geographic coordinates of the reference object in a geographic coordinate system;
the positioning module is further configured to:
determining a position correction parameter according to the position relation of the first position data and the second position data of the reference object in a target coordinate system, wherein the target coordinate system is a coordinate system established according to the target object;
and correcting the first positioning data according to the position correction parameter to obtain second positioning data, wherein the second positioning data is used for representing a positioning result of the target object.
16. The apparatus of claim 15, wherein the position correction parameters comprise a rotation matrix;
the positioning module is further configured to:
acquiring a rotation angle between the first position data and the second position data;
and determining the rotation matrix according to the rotation angle.
17. The apparatus according to claim 15 or 16, wherein the position correction parameters comprise a translation component, the translation component comprising a first translation component and a second translation component, the first translation component being perpendicular to the second translation component, and the reference object corresponding to the target position data being parallel to the first translation component;
the positioning module is further configured to:
acquiring a rotation angle between the first position data and the second position data;
and determining the first translation component and the second translation component according to the rotation angle and the position data to be adjusted, wherein the target position data is the first position data or the second position data, and the position data to be adjusted is position data except the target position data in the first position data and the second position data.
18. An electronic device, comprising:
a processor; a memory for storing processor-executable instructions; wherein the processor is configured to: performing the method of any one of claims 1 to 12.
19. A computer readable storage medium having computer program instructions stored thereon, which when executed by a processor implement the method of any one of claims 1 to 12.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010316617.XA CN111524185A (en) | 2020-04-21 | 2020-04-21 | Positioning method and device, electronic equipment and storage medium |
PCT/CN2021/075436 WO2021212964A1 (en) | 2020-04-21 | 2021-02-05 | Positioning method and apparatus, and electronic device and storage medium |
JP2022520031A JP2022550188A (en) | 2020-04-21 | 2021-02-05 | Positioning method, device, electronic device, storage medium and computer program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010316617.XA CN111524185A (en) | 2020-04-21 | 2020-04-21 | Positioning method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111524185A true CN111524185A (en) | 2020-08-11 |
Family
ID=71901252
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010316617.XA Pending CN111524185A (en) | 2020-04-21 | 2020-04-21 | Positioning method and device, electronic equipment and storage medium |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP2022550188A (en) |
CN (1) | CN111524185A (en) |
WO (1) | WO2021212964A1 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111931217A (en) * | 2020-09-18 | 2020-11-13 | 蘑菇车联信息科技有限公司 | Map data processing method and electronic equipment |
CN112164138A (en) * | 2020-10-30 | 2021-01-01 | 上海商汤临港智能科技有限公司 | Point cloud data screening method and device |
CN112800159A (en) * | 2021-01-25 | 2021-05-14 | 北京百度网讯科技有限公司 | Map data processing method and device |
CN112950712A (en) * | 2021-02-25 | 2021-06-11 | 深圳市慧鲤科技有限公司 | Positioning method and device, electronic equipment and storage medium |
CN113038372A (en) * | 2021-03-11 | 2021-06-25 | 华高数字科技有限公司 | Wearable auxiliary positioning early warning linkage method based on block chain |
CN113052904A (en) * | 2021-03-19 | 2021-06-29 | 上海商汤临港智能科技有限公司 | Positioning method, positioning device, electronic equipment and storage medium |
CN113192139A (en) * | 2021-05-14 | 2021-07-30 | 浙江商汤科技开发有限公司 | Positioning method and device, electronic equipment and storage medium |
WO2021212964A1 (en) * | 2020-04-21 | 2021-10-28 | 上海商汤临港智能科技有限公司 | Positioning method and apparatus, and electronic device and storage medium |
CN113569800A (en) * | 2021-08-09 | 2021-10-29 | 北京地平线机器人技术研发有限公司 | Lane recognition and verification method and device, readable storage medium and electronic equipment |
CN114088061A (en) * | 2021-02-24 | 2022-02-25 | 上海商汤临港智能科技有限公司 | Target positioning method and device, electronic equipment and storage medium |
WO2022052567A1 (en) * | 2020-09-08 | 2022-03-17 | 广州小鹏自动驾驶科技有限公司 | Vehicle positioning method and apparatus, vehicle, and storage medium |
CN113569800B (en) * | 2021-08-09 | 2024-10-29 | 北京地平线机器人技术研发有限公司 | Lane recognition verification method and device, readable storage medium and electronic equipment |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114047535A (en) * | 2021-11-15 | 2022-02-15 | 中国电信股份有限公司 | Positioning method, positioning device and computer readable storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120035844A1 (en) * | 2010-08-06 | 2012-02-09 | Hitachi, Ltd. | Cruise assist system |
CN108453701A (en) * | 2017-02-09 | 2018-08-28 | 佳能株式会社 | Control method, the method for teaching robot and the robot system of robot |
CN109300159A (en) * | 2018-09-07 | 2019-02-01 | 百度在线网络技术(北京)有限公司 | Method for detecting position, device, equipment, storage medium and vehicle |
CN110530372A (en) * | 2019-09-26 | 2019-12-03 | 上海商汤智能科技有限公司 | Localization method, determining method of path, device, robot and storage medium |
CN110595494A (en) * | 2019-09-17 | 2019-12-20 | 百度在线网络技术(北京)有限公司 | Map error determination method and device |
CN110869978A (en) * | 2017-07-11 | 2020-03-06 | 佳能株式会社 | Information processing apparatus, information processing method, and computer program |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6939615B2 (en) * | 2018-02-02 | 2021-09-22 | トヨタ自動車株式会社 | Traffic light recognition device |
CN110738200A (en) * | 2019-12-23 | 2020-01-31 | 广州赛特智能科技有限公司 | Lane line 3D point cloud map construction method, electronic device and storage medium |
CN111524185A (en) * | 2020-04-21 | 2020-08-11 | 上海商汤临港智能科技有限公司 | Positioning method and device, electronic equipment and storage medium |
CN112001456B (en) * | 2020-10-28 | 2021-07-30 | 北京三快在线科技有限公司 | Vehicle positioning method and device, storage medium and electronic equipment |
2020
- 2020-04-21 CN CN202010316617.XA patent/CN111524185A/en active Pending
2021
- 2021-02-05 WO PCT/CN2021/075436 patent/WO2021212964A1/en active Application Filing
- 2021-02-05 JP JP2022520031A patent/JP2022550188A/en not_active Withdrawn
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120035844A1 (en) * | 2010-08-06 | 2012-02-09 | Hitachi, Ltd. | Cruise assist system |
CN108453701A (en) * | 2017-02-09 | 2018-08-28 | 佳能株式会社 | Control method, the method for teaching robot and the robot system of robot |
CN110869978A (en) * | 2017-07-11 | 2020-03-06 | 佳能株式会社 | Information processing apparatus, information processing method, and computer program |
CN109300159A (en) * | 2018-09-07 | 2019-02-01 | 百度在线网络技术(北京)有限公司 | Method for detecting position, device, equipment, storage medium and vehicle |
CN110595494A (en) * | 2019-09-17 | 2019-12-20 | 百度在线网络技术(北京)有限公司 | Map error determination method and device |
CN110530372A (en) * | 2019-09-26 | 2019-12-03 | 上海商汤智能科技有限公司 | Localization method, determining method of path, device, robot and storage medium |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021212964A1 (en) * | 2020-04-21 | 2021-10-28 | 上海商汤临港智能科技有限公司 | Positioning method and apparatus, and electronic device and storage medium |
WO2022052567A1 (en) * | 2020-09-08 | 2022-03-17 | 广州小鹏自动驾驶科技有限公司 | Vehicle positioning method and apparatus, vehicle, and storage medium |
CN111931217A (en) * | 2020-09-18 | 2020-11-13 | 蘑菇车联信息科技有限公司 | Map data processing method and electronic equipment |
CN112164138A (en) * | 2020-10-30 | 2021-01-01 | 上海商汤临港智能科技有限公司 | Point cloud data screening method and device |
CN112800159A (en) * | 2021-01-25 | 2021-05-14 | 北京百度网讯科技有限公司 | Map data processing method and device |
CN112800159B (en) * | 2021-01-25 | 2023-10-31 | 北京百度网讯科技有限公司 | Map data processing method and device |
US11866064B2 (en) | 2021-01-25 | 2024-01-09 | Beijing Baidu Netcom Science Technology Co., Ltd. | Method and apparatus for processing map data |
CN114088061A (en) * | 2021-02-24 | 2022-02-25 | 上海商汤临港智能科技有限公司 | Target positioning method and device, electronic equipment and storage medium |
CN114088062B (en) * | 2021-02-24 | 2024-03-22 | 上海商汤临港智能科技有限公司 | Target positioning method and device, electronic equipment and storage medium |
CN114088061B (en) * | 2021-02-24 | 2024-03-22 | 上海商汤临港智能科技有限公司 | Target positioning method and device, electronic equipment and storage medium |
CN114088062A (en) * | 2021-02-24 | 2022-02-25 | 上海商汤临港智能科技有限公司 | Target positioning method and device, electronic equipment and storage medium |
CN112950712A (en) * | 2021-02-25 | 2021-06-11 | 深圳市慧鲤科技有限公司 | Positioning method and device, electronic equipment and storage medium |
WO2022179080A1 (en) * | 2021-02-25 | 2022-09-01 | 深圳市慧鲤科技有限公司 | Positioning method and apparatus, electronic device, storage medium, program and product |
CN113038372A (en) * | 2021-03-11 | 2021-06-25 | 华高数字科技有限公司 | Wearable auxiliary positioning and early-warning linkage method based on blockchain |
CN113052904B (en) * | 2021-03-19 | 2022-12-13 | 上海商汤临港智能科技有限公司 | Positioning method, positioning device, electronic equipment and storage medium |
CN113052904A (en) * | 2021-03-19 | 2021-06-29 | 上海商汤临港智能科技有限公司 | Positioning method, positioning device, electronic equipment and storage medium |
CN113192139A (en) * | 2021-05-14 | 2021-07-30 | 浙江商汤科技开发有限公司 | Positioning method and device, electronic equipment and storage medium |
CN113569800A (en) * | 2021-08-09 | 2021-10-29 | 北京地平线机器人技术研发有限公司 | Lane recognition and verification method and device, readable storage medium and electronic equipment |
CN113569800B (en) * | 2021-08-09 | 2024-10-29 | 北京地平线机器人技术研发有限公司 | Lane recognition verification method and device, readable storage medium and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
WO2021212964A1 (en) | 2021-10-28 |
JP2022550188A (en) | 2022-11-30 |
Similar Documents
Publication | Title |
---|---|
CN111524185A | Positioning method and device, electronic equipment and storage medium |
CN107957266B | Positioning method, positioning device and storage medium |
US10959049B2 | Scene sharing-based navigation assistance method and terminal |
CN110473259A | Pose determination method and device, electronic equipment and storage medium |
CN112284400B | Vehicle positioning method and device, electronic equipment and computer readable storage medium |
CN111664866A | Positioning display method and device, positioning method and device and electronic equipment |
CN111289006A | Lane navigation path generation method and device and driving control method and device |
CN112433211B | Pose determination method and device, electronic equipment and storage medium |
CN112432637B | Positioning method and device, electronic equipment and storage medium |
US12020463B2 | Positioning method, electronic device and storage medium |
CN110865405A | Fusion positioning method and device, mobile equipment control method and electronic equipment |
CN114578329A | Multi-sensor joint calibration method, device, storage medium and program product |
CN113066135A | Calibration method and device of image acquisition equipment, electronic equipment and storage medium |
CN112541971A | Point cloud map construction method and device, electronic equipment and storage medium |
CN114563005A | Road positioning method, device, equipment, vehicle and storage medium |
CN112432636B | Positioning method and device, electronic equipment and storage medium |
CN116740158B | Image depth determination method, device and storage medium |
CN111860074B | Target object detection method and device, and driving control method and device |
CN111859003A | Visual positioning method and device, electronic equipment and storage medium |
CN114608591B | Vehicle positioning method and device, storage medium, electronic equipment, vehicle and chip |
CN109961646B | Road condition information error correction method and device |
KR20220155421A | Positioning method and device, electronic device, storage medium and computer program |
CN111832338A | Object detection method and device, electronic equipment and storage medium |
WO2022110777A1 | Positioning method and apparatus, electronic device, storage medium, computer program product, and computer program |
WO2022110801A1 | Data processing method and apparatus, electronic device, and storage medium |
Legal Events
Code | Title | Description |
---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20200811 |