CN114216471A - Electronic map determination method and device, electronic equipment and storage medium - Google Patents
- Publication number: CN114216471A
- Application number: CN202111454730.5A
- Authority
- CN
- China
- Prior art keywords
- lane
- type
- determining
- position information
- information
- Legal status: Pending (an assumption, not a legal conclusion)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
- G01C21/32—Structuring or formatting of map data
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3453—Special cost functions, i.e. other than distance or default speed limit of road segments
- G01C21/3492—Special cost functions, i.e. other than distance or default speed limit of road segments employing speed data or traffic data, e.g. real-time or historical
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3658—Lane guidance
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Traffic Control Systems (AREA)
- Navigation (AREA)
Abstract
The disclosure provides an electronic map determination method and device, an electronic device, and a storage medium, relating to the field of data processing technology and, in particular, to the field of intelligent transportation. The specific implementation scheme is as follows: determining the type of a lane and the driving attribute of the lane based on text information on the ground corresponding to the lane; determining the type of the lane based on the lane marker corresponding to the lane; in response to the type of the lane determined based on the text information being the same as the type determined based on the lane marker, determining position information of the lane based on the lane marker; and identifying the type of the lane and the driving attribute on an electronic map based on the position information of the lane.
Description
Technical Field
The present disclosure relates to the field of data processing technologies, and in particular to an electronic map determination method and apparatus, an electronic device, and a storage medium in the field of intelligent transportation.
Background
In recent years, with the rapid development and popularization of terminal positioning technology and automatic driving technology, electronic maps have become a key research object in the field of intelligent transportation. Because traditional road-level electronic maps cannot finely represent the structure of complex urban roads, intelligent transportation cannot bring its advantages into full play on such roads.
Disclosure of Invention
The disclosure provides an electronic map determination method, an electronic map determination device, electronic equipment and a storage medium.
According to an aspect of the present disclosure, there is provided an electronic map determination method including:
determining the type of the lane and the driving attribute of the lane based on text information on the ground corresponding to the lane;
determining the type of the lane based on the lane mark corresponding to the lane;
determining position information of the lane based on the lane identification in response to the type of the lane determined based on the text information being the same as the type of the lane determined based on the lane identification;
identifying the type of the lane and the driving attribute on an electronic map based on the position information of the lane.
According to a second aspect of the present disclosure, there is provided an electronic map determination apparatus including:
a first determination unit, configured to determine a type of a lane and a driving property of the lane based on text information on a ground corresponding to the lane;
a second determining unit, configured to determine a type of the lane based on a lane identifier corresponding to the lane;
a response unit for determining position information of the lane based on the lane marker in response to the type of the lane determined based on the text information being the same as the type of the lane determined based on the lane marker;
an identification unit for identifying the type of the lane and the driving attribute on an electronic map based on the position information of the lane.
A third aspect of the present disclosure provides an electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the electronic map determination method described above.
A fourth aspect of the present disclosure provides a non-transitory computer-readable storage medium storing computer instructions for causing the computer to execute the electronic map determination method described above.
A fifth aspect of the present disclosure provides a computer program product comprising a computer program/instructions which, when executed by a processor, implements the electronic map determination method described above.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 shows a schematic view of textual information on the ground and an aerial sign of a bus lane;
fig. 2 shows an architectural schematic diagram of an electronic map determination system provided by an embodiment of the present disclosure;
fig. 3 is a schematic flow chart illustrating an alternative electronic map determining method provided by the embodiment of the present disclosure;
fig. 4 is a schematic flow chart illustrating another alternative electronic map confirmation method provided by the embodiment of the present disclosure;
fig. 5 is a schematic flow chart illustrating a further alternative electronic map determining method provided by the embodiment of the present disclosure;
fig. 6(a) shows a front view image of a lane;
FIG. 6(b) is a diagram illustrating semantic segmentation results provided by an embodiment of the present disclosure;
FIG. 6(c) is a schematic diagram illustrating obtaining at least one sub recognition box provided by an embodiment of the present disclosure;
FIG. 6(d) is a schematic diagram illustrating an obtained text recognition box provided by an embodiment of the present disclosure;
FIG. 7 illustrates an alternative schematic diagram of an electronic map identification provided by an embodiment of the present disclosure;
FIG. 8 illustrates an alternative schematic diagram of planning a navigation path provided by an embodiment of the present disclosure;
fig. 9 is a schematic diagram illustrating an alternative structure of an electronic map determining apparatus provided by an embodiment of the present disclosure;
FIG. 10 shows a schematic block diagram of an example electronic device that may be used to implement embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
In recent years, with the rapid development and popularization of terminal positioning technology and automatic driving technology, map data at the ordinary road level can no longer meet users' needs. Lane-level navigation services spare the user the awkwardness of "being unable to tell the main road from the side road" or "taking the wrong lane at a ramp junction". Navigation maps are therefore being upgraded from the conventional road level to the lane level. Lane-level navigation map data covers many elements, among which some special lane attributes, such as bus lanes and tidal lanes, are very important geographic attribute elements and play an important role in lane-level driving navigation, safe driving, and the like.
When navigating with a map, some road sections have inconspicuous signs, and the user wants to know which lanes are special lanes, for example: whether the current lane is a bus-only lane, whether it is fully or partially restricted, and whether driving in it during the current time interval complies with traffic regulations; whether the current lane is a tidal lane and whether it is currently open; whether the current lane is a variable lane and what its driving direction is at the current time, and so on.
In addition, in lane-level navigation the object being guided is the "vehicle" rather than the "person". In the path planning stage, if the electronic map attributes show which lanes are special lanes, for example a bus lane that is closed to other traffic during a certain time interval, such road sections can be avoided during route calculation. In the navigation stage, if the vehicle behaves irregularly, for example by occupying a bus lane, the navigation can prompt the driver about the irregular behaviour and re-plan the driving route, so that normal bus operation is not affected.
In the related art, the navigation engine that handles bus lanes is designed around a road-level navigation strategy: it only prompts that a bus lane exists on the road and gives a simple reminder, so it can neither determine whether the vehicle is actually driving in the bus-only lane nor judge whether a violation has occurred. When the electronic map is verified, map operators mainly use collected panoramic images to check whether lane attributes have changed. Bus lanes are special in that a single bus lane extending through several intersections can be several kilometres long, so verifying whether its attributes have changed requires manually reviewing many images, which brings a large verification cost and lowers the update efficiency of the base map.
In addition, from the navigation perspective, the traditional navigation electronic map is oriented to the road level rather than the lane level and cannot finely represent the structure of complex urban roads. During navigation, prompts can only be given according to the position of the vehicle on the road; for a bus-only lane, the map can merely indicate that such a lane exists on the road section, but cannot precisely locate its spatial position and extent. When the driver drives irregularly or in violation of regulations, the navigation cannot correct the behaviour in time, and because there is no lane-level data support, some detailed navigation functions cannot be provided.
From the perspective of map confirmation, traffic signs in the real world are added, modified, or deleted as time passes. Compared with traditional standard-precision base map data, lane-level base map data has a more complex structure and is harder to operate on; manual operation has a long processing cycle and poor timeliness. Moreover, because a special lane requires checking changes in both the ground text attribute and the sign attribute, the cost of manual verification is high.
The present disclosure provides an electronic map confirmation method that addresses at least the above defects of electronic maps in the prior art.
In the path planning stage, a lane-level navigation product needs to compute a lane-level route and its ETA according to the vehicle's driving position and its localization in time and space: which lane has a low congestion rate, which lanes are better for turning left or making a U-turn, which lanes are tidal bidirectional lanes, which lanes serve multiple turning movements at an intersection, and so on. In the navigation stage, lane-level data makes it possible to judge in real time whether the vehicle is driving reasonably and whether traffic violations or other irregular behaviours occur.
Artificial intelligence technology is widely used across the industry, including speech recognition, image recognition, natural language processing, and knowledge graph inference. Images carry rich information, and a large volume of self-collected high-definition panoramic image data is well suited to image-based road modelling and traffic sign recognition and detection, providing a natural basis for assisting road mapping, automatically updating geographic elements, mining road attributes, and so on.
The embodiments of the disclosure mainly comprise two parts: confirming the electronic map, and performing lane-level navigation based on the electronic map.
Fig. 1 shows a schematic representation of the ground text information and the aerial sign of a bus lane.
As shown in fig. 1, a bus lane may be determined by text information on the ground of the lane (a in fig. 1) and an air sign at the entrance or exit of the lane (b in fig. 1).
In the disclosed embodiments, confirmation of the electronic map includes confirmation based on the text information on the ground and confirmation based on the lane markers of the aerial signs.
For confirmation of the text information on the ground of the lane, a panoramic image (such as a front-view image) can be used. A deep learning image segmentation technique segments the pixel connected domain of the ground characters, and the located pixel region is converted into the corresponding overhead-view image. Ground traffic markings, such as turning information, lane line information, and lane ground text, are easier to recognize in the overhead view than in the front view, and the ground information in the overhead view is complete and clear. The position of the text information (the first position information) is located in the overhead view, the characters in the image are clustered using their positions, and the text information in the overhead view is cut into small pictures (sub recognition boxes). The text content of these small pictures can then be recognized by an Optical Character Recognition (OCR) engine. Based on the OCR result, it is judged whether the text includes the first information, and the result is compared with the corresponding lane attribute in the electronic map to be confirmed: if the corresponding lane attribute in the map is not a bus lane, the attribute is updated to a bus lane; if the corresponding lane attribute is already a bus lane, the change points of the bus lane attribute (i.e. the start position and end position of the bus lane) are further confirmed.
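For illustration only, the following is a minimal sketch of the decision step of this flow, assuming the segmentation, top-view projection, clustering, and OCR stages have already produced a recognized text string; the keyword set, lane-type strings, and return codes are assumptions introduced here, not values fixed by the disclosure.

```python
# Minimal sketch of the ground-text confirmation decision; keywords and codes are assumed.

BUS_KEYWORDS = ("公交", "BUS")  # assumed "first information" keywords indicating a bus lane

def confirm_ground_text(ocr_text: str, map_lane_type: str) -> str:
    """Compare the OCR result of the ground text with the lane type stored
    in the electronic map to be confirmed."""
    is_bus_lane = any(k in ocr_text.upper() for k in BUS_KEYWORDS)
    if not is_bus_lane:
        return "no_bus_lane_text"
    if map_lane_type != "bus_lane":
        return "update_map_to_bus_lane"   # attribute in the map is stale: update it
    return "confirm_change_points"        # already a bus lane: verify start/end positions

# Example: the ground text indicates a bus lane, but the map still records an ordinary lane
print(confirm_ground_text("公交专用道 7:30-9:30 17:30-19:30", "ordinary"))
```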
In the embodiments of the disclosure, the bus-lane attribute can also be determined from the aerial sign: when an aerial sign exists and its lane marker indicates the lane, the attribute of the lane is confirmed to be a bus lane. Aerial signs are generally erected at the start and end positions of a bus lane to mark its extent. By recognizing and localizing the aerial signs, positioning them in space, and density-clustering the detections, the actual spatial positions of the signs are located; the signs are then finely classified using deep learning, so that the special-lane attribute indicated by the bus-type sign and the position of the aerial sign are obtained and compared with the electronic map to be confirmed. Optionally, the attribute, start position, end position, and driving attribute of the lane can be determined in combination with the text information on the lane ground.
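As an illustration of the density-clustering step for aerial signs, the sketch below groups per-frame sign detections into physical sign positions with DBSCAN; it assumes the detections have already been projected to planar metric coordinates, and the eps and min_samples values are arbitrary examples rather than parameters given by the disclosure.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def locate_sign_positions(detections, eps_m=15.0, min_pts=2):
    """Density-cluster per-frame detections of bus-lane signs (x, y in metres)
    so that each cluster centre corresponds to one physical sign, e.g. the
    signs at the start and end of the bus lane."""
    pts = np.asarray(detections, dtype=float)
    labels = DBSCAN(eps=eps_m, min_samples=min_pts).fit_predict(pts)
    # One centre per cluster; label -1 is DBSCAN noise and is discarded.
    return [pts[labels == k].mean(axis=0) for k in sorted(set(labels)) if k != -1]

# Noisy observations of two signs roughly one kilometre apart along a road
observations = [(0.0, 0.0), (1.2, 0.4), (0.5, -0.3), (1001.0, 2.0), (999.4, 1.1)]
print(locate_sign_positions(observations))
```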
Referring to fig. 2, fig. 2 is an architectural diagram of an electronic map determining system 100 provided by the embodiment of the present disclosure, in order to support an exemplary application, an electronic map determining apparatus 400 is connected to a server 200 through a network 300, where the network 300 may be a wide area network or a local area network, or a combination of the two, and data transmission is implemented using a wireless or wired link.
In some embodiments, the electronic map determination method provided by the embodiments of the present disclosure may be implemented by an electronic map determination apparatus. For example, the electronic map determination apparatus 400 runs a client, and the client 410 may be a client for determining an electronic map. The client may capture a front view image and/or a panoramic image of the lane and transmit the images to the server 200 through the network 300.
When the electronic map is required to be determined, the client acquires a front view image and/or a panoramic image of the lane, wherein the client can shoot the lane through a camera inside the electronic map determination device 400; a front view image and/or a panoramic image captured independently of the camera of the electronic map determination device 400 may also be received. Wherein the panoramic image is used to determine an overhead image of the lane.
In some embodiments, taking the electronic device as a server as an example, the electronic map determination method provided by the embodiments of the present disclosure may be cooperatively implemented by the server and the electronic map determination device.
When the electronic map needs to be determined, the client acquires a front-view image and/or a panoramic image of the lane; the client can photograph the lane through a camera inside the electronic map determination device 400, or can receive a front-view image and/or a panoramic image captured independently of the camera of the electronic map determination device 400. The panoramic image is used to determine an overhead image of the lane. Then, the server 200 determines the type of the lane and the driving attribute of the lane based on the text information on the ground in the front-view image; the server 200 determines the type of the lane based on the lane mark corresponding to the lane in the front-view image; determines position information of the lane based on the lane identification in response to the type of the lane determined based on the text information being the same as the type determined based on the lane identification; and identifies the type of the lane and the driving attribute on an electronic map based on the position information of the lane, so as to determine the electronic map. The electronic map may be stored in the database 500.
In some embodiments, the electronic map determination apparatus 400 or the server 200 may implement the electronic map determination method provided by the embodiments of the present disclosure by executing a computer program, for example, the computer program may be a native program or a software module in an operating system; may be a local (Native) Application (APP), i.e. a program that needs to be installed in the operating system to run; or may be an applet, i.e. a program that can be run only by downloading it to the browser environment; but also an applet that can be embedded into any APP. In general, the computer programs described above may be any form of application, module or plug-in.
In practical applications, the server 200 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a Cloud server providing basic Cloud computing services such as a Cloud service, a Cloud database, Cloud computing, a Cloud function, Cloud storage, a network service, Cloud communication, a middleware service, a domain name service, a security service, a CDN, and a big data and artificial intelligence platform, where Cloud Technology (Cloud Technology) refers to a hosting Technology for unifying series resources such as hardware, software, and a network in a wide area network or a local area network to implement computing, storage, processing, and sharing of data. The electronic map determination apparatus 400 and the server 200 may be directly or indirectly connected through wired or wireless communication, and the disclosure is not limited thereto.
Based on the above description of the electronic map determination system provided by the embodiments of the present disclosure, the electronic map determination method is described below. In some embodiments, the method may be implemented by a server or an electronic map determination device alone, or by the server and the electronic map determination device in cooperation; it is described below by taking an embodiment of the electronic map determination device as an example.
Fig. 3 is a schematic flow chart illustrating an alternative electronic map determining method provided by the embodiment of the present disclosure, which will be described according to various steps.
Step S301, determining the type of the lane and the driving attribute of the lane based on the text information on the ground corresponding to the lane.
In some embodiments, an electronic map determination device (hereinafter simply referred to as the device) determines the type of a lane and the driving attribute of the lane based on text information on the ground corresponding to the lane. The type of lane comprises at least one of: a bus lane, a bus rapid transit lane, a tidal lane, a variable lane, and an emergency lane. The driving attribute information of the lane includes the time during which non-special vehicles cannot travel in the lane (or, equivalently, the time during which only special vehicles are allowed to travel in it); for example, if the ground text information of a bus lane is "bus lane, 17:30-19:30, 7:30-9:30", the driving attribute information of the bus lane is that non-bus vehicles may not drive in it during 7:30-9:30 and 17:30-19:30.
In some optional embodiments, the device acquires a front-view image corresponding to the lane; acquiring text information on the ground corresponding to the lane based on the front-view image; determining that the lane is a first type lane in response to the text information including first information; determining a driving property of the lane based on second information included in the text information.
Wherein the front-view image comprises an image captured by the vehicle in the direction of travel. The first information includes at least one keyword indicating a bus lane, such as "bus", "public transport", or "bus only"; the first type of lane comprises a bus lane. The second information includes the time information in the text, such as "17:30-19:30, 7:30-9:30".
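As a small illustration of extracting the second information, the following assumed helper parses time intervals of the form "HH:MM-HH:MM" from the recognized ground text; the function name, pattern, and sample text are examples only.

```python
import re
from datetime import time

TIME_RANGE = re.compile(r"(\d{1,2}):(\d{2})\s*-\s*(\d{1,2}):(\d{2})")

def parse_driving_attribute(ground_text: str):
    """Extract the restricted time windows (the 'second information') from
    recognized ground text such as '公交专用道 17:30-19:30, 7:30-9:30'."""
    windows = [(time(int(h1), int(m1)), time(int(h2), int(m2)))
               for h1, m1, h2, m2 in TIME_RANGE.findall(ground_text)]
    return sorted(windows)

print(parse_driving_attribute("公交专用道 17:30-19:30, 7:30-9:30"))
# -> [(datetime.time(7, 30), datetime.time(9, 30)), (datetime.time(17, 30), datetime.time(19, 30))]
```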
In specific implementation, the device acquires first position information of a pixel communication area in the front-view image; acquiring a pixel corresponding to the first position information in an overhead view image corresponding to the lane; performing cluster analysis on pixels corresponding to the first position information to obtain at least one sub-recognition frame; acquiring a text recognition box based on the at least one sub-recognition box; and recognizing the text in the text recognition box, wherein the recognition result is the text information on the ground corresponding to the lane.
For example, the device performs semantic segmentation based on the front-view image, and determines a pixel connected region corresponding to the text information on the ground and first position information of the pixel connected region based on the result of the semantic segmentation; and acquiring first position information of the pixel connected region in the front-view image. Due to the fact that text information in the front view is seriously deformed, in order to guarantee the accuracy of recognition, a top view image corresponding to the front view image is obtained, clustering analysis is conducted on pixels corresponding to the first position information in the top view image, and at least one sub-recognition frame is obtained; acquiring a text recognition box based on the at least one sub-recognition box; and recognizing the text in the text recognition box, wherein the recognition result is the text information on the ground corresponding to the lane.
In some optional embodiments, the at least one sub recognition box may be obtained using Density-Based Spatial Clustering of Applications with Noise (DBSCAN), and the text in the text recognition box may be recognized using OCR character recognition techniques. Other clustering methods and character recognition methods embodying the same idea can also be used to achieve the same purpose as the embodiments of the disclosure; the present disclosure does not limit the specific clustering method or character recognition method.
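The sketch below illustrates this clustering step under the assumption that the overhead-view segmentation result is available as a boolean pixel mask; the DBSCAN parameters are illustrative, and the merged box would subsequently be cropped from the top view and passed to an OCR engine.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def text_recognition_boxes(top_view_mask, eps=25, min_samples=40):
    """Cluster ground-text pixels of an overhead-view mask into sub recognition
    boxes and merge them into a single text recognition box.

    top_view_mask: HxW boolean array, True where the segmentation marked ground text.
    Returns (sub_boxes, merged_box) with boxes given as (x0, y0, x1, y1)."""
    ys, xs = np.nonzero(top_view_mask)
    if len(xs) == 0:
        return [], None
    pts = np.stack([xs, ys], axis=1)
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(pts)

    sub_boxes = []
    for k in set(labels) - {-1}:                      # label -1 is DBSCAN noise
        cluster = pts[labels == k]
        x0, y0 = cluster.min(axis=0)
        x1, y1 = cluster.max(axis=0)
        sub_boxes.append((int(x0), int(y0), int(x1), int(y1)))
    if not sub_boxes:
        return [], None
    merged = (min(b[0] for b in sub_boxes), min(b[1] for b in sub_boxes),
              max(b[2] for b in sub_boxes), max(b[3] for b in sub_boxes))
    return sub_boxes, merged   # crop `merged` from the top view and feed it to OCR
```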
Step S302, determining the type of the lane based on the lane mark corresponding to the lane.
In some embodiments, the apparatus determines the type of the lane based on a lane identification corresponding to the lane. The lane mark can be a traffic sign which is erected on two sides or above the lane and displayed in the air.
In specific implementation, the device acquires a front-view image corresponding to the lane; acquiring at least two lane marks corresponding to the lanes based on the front-view image; determining that the lane is a first type of lane in response to any of the at least two lane markings including the first information.
In some optional embodiments, lane markers of the bus lane are set at the start position and the end position of the bus lane, and are used for marking the range of the bus lane; the device identifies lane marks in the front-view image, performs fine classification on the lane marks based on a deep learning technology, and determines the type of the lane.
Step S303, in response to the type of the lane determined based on the text information being the same as the type of the lane determined based on the lane marking, determining position information of the lane based on the lane marking.
In some embodiments, the apparatus determines a type of a lane based on text information on the ground corresponding to the lane, and confirms whether the two are the same after determining the type of the lane based on the lane identification; determining position information of the lane based on the lane marker in response to the type of the lane determined based on the text information being the same as the type of the lane determined based on the lane marker.
Or, in response to the type of the lane determined based on the text information being different from the type determined based on the lane identification, the lane is marked, and its type may then be confirmed manually. For example, the device determines from the text information on the ground that the lane is a bus lane; if the device finds no lane mark in the front view corresponding to the lane, it determines that the lane is not a bus lane. Since the lane types determined from the ground text and from the lane mark differ, the lane is marked and its type is confirmed manually.
In some embodiments, the apparatus determines position information of at least two lane markers corresponding to the lane, respectively entry position information of the lane and exit position information of the lane.
In specific implementation, the device determines second position information of the electronic equipment for acquiring the front-view image and the number of pixels between the at least two lane markers and the electronic equipment respectively; determining position information of a lane mark corresponding to the lane exit based on second position information of the electronic equipment and the number of pixels between the electronic equipment and the lane mark corresponding to the lane exit; and/or determining the position information of the lane mark corresponding to the lane entrance based on the second position information of the electronic equipment and the number of pixels between the electronic equipment and the lane mark corresponding to the lane entrance.
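For illustration, the following sketch converts the pixel offset between the capture device and a lane marker into an approximate marker position; the metres-per-pixel scale, the small-distance offset formula, and all names are assumptions, since a real system would calibrate the scale from the camera and projection parameters.

```python
import math

def marker_position(device_lat, device_lon, heading_deg, pixel_offset, metres_per_pixel=0.05):
    """Rough position of a lane marker given the capture device's position (second
    position information), its heading, and the pixel distance to the marker."""
    dist_m = pixel_offset * metres_per_pixel
    # Offset the device position along its heading (small-distance approximation).
    dlat = (dist_m * math.cos(math.radians(heading_deg))) / 111_320.0
    dlon = (dist_m * math.sin(math.radians(heading_deg))) / (111_320.0 * math.cos(math.radians(device_lat)))
    return device_lat + dlat, device_lon + dlon

# Entrance marker seen 600 px ahead of a device at (39.9042 N, 116.4074 E) heading due north
print(marker_position(39.9042, 116.4074, 0.0, 600))
```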
Step S304, identifying the type of the lane and the driving attribute on an electronic map based on the position information of the lane.
In some embodiments, the apparatus identifies a type of the lane and driving attributes of the lane on an electronic map based on entrance position information of the lane and exit position information of the lane to facilitate subsequent planning of a navigation path.
Therefore, with the electronic map determination method provided by the embodiments of the disclosure, the type of the lane is determined jointly from the text information on the ground of the lane and the lane marker corresponding to the lane; when the two determinations agree, the lane type and driving attribute information are further marked on the electronic map. On the one hand, the electronic map is refined from the existing road level to the lane level; on the other hand, the lane's type and driving attribute information can be identified more accurately in the electronic map, which facilitates subsequent lane-level navigation based on the map and improves user experience.
Fig. 4 is a schematic flow chart illustrating another alternative electronic map confirmation method provided in the embodiment of the present disclosure, which will be described according to various parts.
In some embodiments, after the step S301 to the step S304 are performed to identify the type of the lane and the driving attribute on the electronic map, the method may further include:
and step S401, planning a navigation path based on the types of the lanes in the electronic map and the driving attributes.
In some embodiments, the device determines whether the current time is within the time interval corresponding to the driving attribute. If the current time is within the time interval corresponding to the driving attribute, the navigation path is planned without using the lane; or, if the current time is not within the time interval corresponding to the driving attribute, the lane is considered when planning the navigation path.
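A minimal sketch of this time-window check follows; the function name and data layout are assumptions introduced for illustration.

```python
from datetime import time

def lane_usable_for_planning(restricted_windows, now: time) -> bool:
    """A lane whose driving attribute restricts it during certain windows is
    excluded from path planning while the current time falls inside any window."""
    return not any(start <= now <= end for start, end in restricted_windows)

bus_lane_windows = [(time(7, 30), time(9, 30)), (time(17, 30), time(19, 30))]
print(lane_usable_for_planning(bus_lane_windows, time(8, 30)))   # False: plan around the lane
print(lane_usable_for_planning(bus_lane_windows, time(11, 30)))  # True: the lane may be used
```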
In some optional embodiments, during the running of the vehicle, the method further comprises:
step S402, if the current time is in the time interval corresponding to the driving attribute, responding to the driving of the vehicle on the lane, and sending first warning information.
In some embodiments, if the current time is within the time interval corresponding to the driving attribute and the vehicle is driving in the lane, sending a first warning message. The first warning information is used for representing that the current vehicle occupies a bus lane.
Therefore, when the user does not drive in the prescribed lane and illegally occupies a restricted lane, the violation can be detected in time based on the lane-level electronic map, and a warning is given prompting the vehicle to change lanes promptly, so that buses can travel normally and are not affected.
And step S403, if the current time is within the time interval corresponding to the driving attribute, sending second warning information in response to the vehicle entering or exiting the lane.
In some embodiments, if the current time is within the time interval corresponding to the driving attribute, a second warning message is sent in response to the vehicle entering the lane or exiting the lane. Wherein the second warning information is used for representing that the vehicle enters the lane or exits the lane.
Therefore, when a non-bus vehicle enters or leaves the lane during bus-lane-only hours (the time during which only buses are allowed to use the lane), a warning is given to promptly point out the violation, encouraging compliant driving and avoiding interference with the normal operation of buses.
And step S404, if the current time is not in the time interval corresponding to the driving attribute, responding to the driving of the vehicle on the lane, and sending third warning information.
In some optional embodiments, if the current time is not within the time interval corresponding to the driving attribute, the device responds to the vehicle driving in the lane and sends third warning information; and the third warning information is used for representing and prompting avoidance of the public transport vehicle.
A bus stop is generally located along a bus lane for buses to stop and let passengers on and off. If the current time is not within the time interval corresponding to the driving attribute and the vehicle is driving in the lane, then, when it is confirmed that the vehicle is driving within a certain range of a bus stop, the third warning information is sent, prompting the driver to watch for the bus stop ahead, slow down to yield to buses, or watch for buses on the right, so that the vehicle drives in a compliant manner and normal bus operation is not affected.
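The sketch below ties the three warnings of steps S402-S404 together; the boolean inputs and string codes are placeholders assumed for illustration, not names defined by the disclosure.

```python
from datetime import time

def navigation_warning(in_bus_lane, crossing_lane_boundary, near_bus_stop,
                       restricted_windows, now: time):
    """Decide which warning (if any) lane-level navigation should raise,
    following steps S402-S404; the string codes are placeholders."""
    restricted = any(start <= now <= end for start, end in restricted_windows)
    if restricted and in_bus_lane:
        return "first_warning_occupying_bus_lane"          # S402
    if restricted and crossing_lane_boundary:
        return "second_warning_entering_or_exiting_lane"   # S403
    if not restricted and in_bus_lane and near_bus_stop:
        return "third_warning_yield_to_buses_at_stop"      # S404
    return None

print(navigation_warning(True, False, False,
                         [(time(7, 30), time(9, 30))], time(8, 0)))
# -> "first_warning_occupying_bus_lane"
```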
Therefore, compared with a traditional navigation engine, the lane-level navigation engine can, during navigation, judge from the relation between the vehicle's driving position and the bus lane whether the driver is driving in a compliant manner and whether potential safety hazards exist, and can provide intelligent navigation prompts such as intelligent route planning, lane-occupation warnings, prompts when entering or exiting a bus lane, and parking prompts.
Fig. 5 shows a schematic flow chart of yet another alternative electronic map determining method provided by the embodiment of the present disclosure, which will be described according to various steps.
In the embodiment of the disclosure, the electronic map confirmation method may include identification of ground text information and identification of air sign information; no precedence order exists between the two.
Step S501, determining the type of the lane and the driving attribute of the lane based on the text information on the ground corresponding to the lane.
In some embodiments, the electronic map confirmation apparatus determining the type of the lane and the driving property of the lane based on text information on the ground corresponding to the lane may include: acquiring first position information of a pixel communication area in the front-view image; acquiring a pixel corresponding to the first position information in an overhead view image corresponding to the lane; performing cluster analysis on pixels corresponding to the first position information to obtain at least one sub-recognition frame; acquiring a text recognition box based on the at least one sub-recognition box; recognizing the text in the text recognition box, wherein the recognition result is the text information on the ground corresponding to the lane; determining that the lane is a first type of lane in response to the text information including first information.
In specific implementation, the device can perform semantic segmentation on the front-view image based on a deep neural network segmentation model (such as U-Net), and extract first position information of a pixel connected region containing ground characters in a semantic segmentation result.
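As an illustration of extracting the first position information from a segmentation result, the following sketch finds the connected regions labelled as ground text in a per-pixel class-id mask (for example the argmax output of a U-Net-style model); the class id, area threshold, and function name are assumed values.

```python
import cv2
import numpy as np

def ground_text_regions(seg_mask, text_class_id=1, min_area=200):
    """Extract bounding boxes (the 'first position information') of the pixel
    connected regions labelled as ground text in a semantic segmentation mask."""
    binary = (np.asarray(seg_mask) == text_class_id).astype(np.uint8)
    n, _, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    boxes = []
    for i in range(1, n):                      # label 0 is the background
        x, y, w, h, area = stats[i]
        if area >= min_area:
            boxes.append((int(x), int(y), int(x + w), int(y + h)))
    return boxes
```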
Wherein fig. 6(a) shows a front view image of a lane;
FIG. 6(b) is a diagram illustrating semantic segmentation results provided by an embodiment of the present disclosure; as shown in fig. 6(b), different gray scales characterize different categories. Determining pixels corresponding to the first position information in the overhead view image corresponding to the lane, and performing cluster analysis on the pixels;
FIG. 6(c) is a schematic diagram illustrating obtaining at least one sub recognition box provided by an embodiment of the present disclosure; as shown in fig. 6(c), for a plurality of terrestrial text connected regions in one image, at least one sub-recognition box is obtained by dense clustering with DBSCAN.
FIG. 6(d) is a schematic diagram illustrating an obtained text recognition box provided by an embodiment of the present disclosure; as shown in fig. 6(d), the at least one sub recognition box is merged to obtain the text recognition box. The text recognition box is then cropped from the top view and fed into an OCR character recognition engine for recognition; if the recognition result includes the first information, the lane is a bus lane.
Further, attribute information of the lane is added to the electronic map.
And step S502, determining the type of the lane based on the lane mark corresponding to the lane.
In some embodiments, the apparatus determining the type of the lane based on the lane identification corresponding to the lane may include: performing semantic segmentation on the front-view image of the lane, and determining the position of an aerial sign (lane mark) in a semantic segmentation result; clustering the air signs, and determining text information included in the air signs based on the clustering result; and if the text information included by the air sign comprises first information, determining that the lane is a bus lane.
In some optional embodiments, after determining that the lane is a bus lane, the apparatus may further obtain at least one front view image corresponding to the lane, and obtain at least two lane markers corresponding to the lane based on the at least one front view image; and determining the position information of at least two lane marks corresponding to the lane, namely the entrance position information of the lane and the exit position information of the lane.
In some embodiments, the apparatus may further identify the type of the lane, entrance position information of the lane, and exit position information of the lane on the electronic map. The device may also update the electronic map based on the type of the lane determined by the lane identifier corresponding to the lane, if the type of the lane in the electronic map is different from the type of the lane determined based on the lane identifier corresponding to the lane.
In some optional embodiments, the apparatus compares the confirmation results of the types of the lanes of step S501 and step S502; if the type of the lane determined based on the text information is the same as the type of the lane determined based on the lane mark, confirming that the lane is a bus lane, and identifying the type of the lane, the driving attribute of the lane, the entrance position information of the lane and the exit position information of the lane in the electronic map; if the type of the lane determined based on the text information is different from the type of the lane determined based on the lane mark, giving an alarm; to indicate the need to further verify the type of lane.
Fig. 7 shows an alternative schematic diagram of an electronic map identifier provided by an embodiment of the present disclosure.
In fig. 7, the type of the lane, the driving property of the lane, the entrance position information of the lane, and the exit position information of the lane are identified in the electronic map based on the text information on the ground and the lane identification.
And step S503, planning a navigation path based on the types of the lanes in the electronic map and the driving attributes.
In some embodiments, the device determines whether the current time is within the time interval corresponding to the driving attribute; if it is, the navigation path is planned without using the lane; if it is not, the lane is considered when planning the navigation path. Planning the navigation path without using the lane means that the lane is not considered at all in path planning and is never included in the final navigation result; planning the navigation path based on the lane means that the lane is considered in path planning, and the final navigation result may or may not include it.
Fig. 8 shows an alternative schematic diagram of planning a navigation path provided by an embodiment of the present disclosure.
As shown in fig. 8, if the driving attributes of the lanes are 7:30-9:30 and 17:30-19:30 and the navigation time is 8:30, the lanes are not considered in the navigation path; and if the driving attributes of the lanes are 7:30-9:30 and 17:30-19:30 and the navigation time is 11:30, the lanes are considered in the navigation path.
Therefore, with the electronic map determination method provided by the embodiments of the disclosure, the lane type is determined jointly from the ground text information and the lane marker corresponding to the lane; when the two determinations agree, the lane type and driving attribute information are further marked on the electronic map. On the one hand, the electronic map is refined from the existing road level to the lane level; on the other hand, the lane type and driving attribute information can be identified more accurately in the map. In subsequent navigation, compared with a traditional navigation engine, the lane-level navigation engine can judge from the relation between the vehicle's driving position and the bus lane whether the driver is driving in a compliant manner and whether potential safety hazards exist, providing intelligent navigation prompts such as intelligent route planning, lane-occupation warnings, prompts when entering or exiting a bus lane, and parking prompts, thereby improving user experience.
Fig. 9 is a schematic diagram illustrating an alternative structure of an electronic map determining apparatus provided in an embodiment of the present disclosure, which will be described according to various parts.
In some embodiments, the electronic map determination apparatus 600 includes: a first determining unit 601, a second determining unit 602, a responding unit 603 and an identifying unit 604.
The first determining unit 601 is configured to determine a type of a lane and a driving property of the lane based on text information on a ground corresponding to the lane;
the second determining unit 602 is configured to determine a type of the lane based on a lane identifier corresponding to the lane;
the response unit 603 is configured to determine position information of the lane based on the lane marker in response to that the type of the lane determined based on the text information is the same as the type of the lane determined based on the lane marker;
the identifying unit 604 is configured to identify a type of the lane and the driving attribute on an electronic map based on the position information of the lane.
The first determining unit 601 is specifically configured to acquire a front view image corresponding to the lane; acquiring text information on the ground corresponding to the lane based on the front-view image; determining that the lane is a first type lane in response to the text information including first information; determining a driving property of the lane based on second information included in the text information.
The first determining unit 601 is specifically configured to acquire first position information of a pixel connected region in the front view image; acquiring a pixel corresponding to the first position information in an overhead view image corresponding to the lane; performing cluster analysis on pixels corresponding to the first position information to obtain at least one sub-recognition frame; acquiring a text recognition box based on the at least one sub-recognition box; and recognizing the text in the text recognition box, wherein the recognition result is the text information on the ground corresponding to the lane.
The second determining unit 602 is specifically configured to obtain a front view image corresponding to the lane; acquiring at least two lane marks corresponding to the lanes based on the front-view image; determining that the lane is a first type of lane in response to any of the at least two lane markings including the first information.
The second determining unit 602 is specifically configured to determine position information of at least two lane marks corresponding to the lane, where the position information is entry position information of the lane and exit position information of the lane.
The second determining unit 602 is specifically configured to determine second position information of the electronic device that acquires the front-view image, and a number of pixels between each of the at least two lane markers and the electronic device; determining position information of a lane mark corresponding to the lane exit based on second position information of the electronic equipment and the number of pixels between the electronic equipment and the lane mark corresponding to the lane exit; and/or determining the position information of the lane mark corresponding to the lane entrance based on the second position information of the electronic equipment and the number of pixels between the electronic equipment and the lane mark corresponding to the lane entrance.
In some embodiments, the electronic map determining apparatus 600 may further include: a planning unit 605.
The planning unit 605 is configured to plan a navigation path based on the type of the lane in the electronic map and the driving attribute.
The planning unit 605 is specifically configured to determine whether the current time is within a time interval corresponding to the driving attribute; if the current time is within the time interval corresponding to the driving attribute, planning the navigation path without planning the navigation path based on the lane; or if the current time is not in the time interval corresponding to the driving attribute, planning a navigation path based on the lane when planning the navigation path.
The planning unit 605 is further configured to at least one of:
transmitting first warning information in response to the vehicle traveling in the lane;
and sending second warning information in response to the vehicle entering the lane or exiting the lane.
In the technical scheme of the disclosure, the acquisition, storage, application and the like of the personal information of the related user all accord with the regulations of related laws and regulations, and do not violate the good customs of the public order.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
FIG. 10 shows a schematic block diagram of an example electronic device 800 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 10, the apparatus 800 includes a computing unit 801 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM)802 or a computer program loaded from a storage unit 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the device 800 can also be stored. The calculation unit 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to bus 804.
A number of components in the device 800 are connected to the I/O interface 805, including: an input unit 806, such as a keyboard, a mouse, or the like; an output unit 807 such as various types of displays, speakers, and the like; a storage unit 808, such as a magnetic disk, optical disk, or the like; and a communication unit 809 such as a network card, modem, wireless communication transceiver, etc. The communication unit 809 allows the device 800 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowcharts and/or block diagrams to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server combined with a blockchain.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved, and the present disclosure is not limited herein.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.
Claims (13)
1. An electronic map determination method, comprising:
determining a type of a lane and a driving attribute of the lane based on text information on the ground corresponding to the lane;
determining the type of the lane based on a lane marking corresponding to the lane;
determining position information of the lane based on the lane marking in response to the type of the lane determined based on the text information being the same as the type of the lane determined based on the lane marking; and
identifying the type of the lane and the driving attribute on an electronic map based on the position information of the lane.
2. The method of claim 1, wherein determining the type of the lane and the driving attribute of the lane based on the text information on the ground corresponding to the lane comprises:
acquiring a front-view image corresponding to the lane;
acquiring text information on the ground corresponding to the lane based on the front-view image;
determining that the lane is a first type of lane in response to the text information including first information;
determining the driving attribute of the lane based on second information included in the text information.
3. The method of claim 2, wherein acquiring the text information on the ground corresponding to the lane based on the front-view image comprises:
acquiring first position information of a connected pixel region in the front-view image;
acquiring pixels corresponding to the first position information in an overhead-view image corresponding to the lane;
performing cluster analysis on the pixels corresponding to the first position information to obtain at least one sub-recognition box;
acquiring a text recognition box based on the at least one sub-recognition box;
and recognizing the text in the text recognition box, wherein the recognition result is the text information on the ground corresponding to the lane.
4. The method of claim 1, wherein determining the type of the lane based on the lane marking corresponding to the lane comprises:
acquiring a front-view image corresponding to the lane;
acquiring at least two lane markings corresponding to the lane based on the front-view image;
determining that the lane is a first type of lane in response to any of the at least two lane markings including the first information.
5. The method of claim 1, wherein determining the position information of the lane based on the lane marking comprises:
determining position information of at least two lane markings corresponding to the lane, namely entrance position information of the lane and exit position information of the lane.
6. The method of claim 5, wherein the position information of the at least two lane markings is determined by:
determining second position information of an electronic device that acquires the front-view image, and the number of pixels between each of the at least two lane markings and the electronic device;
determining position information of the lane marking corresponding to the lane exit based on the second position information of the electronic device and the number of pixels between the electronic device and the lane marking corresponding to the lane exit;
and/or determining position information of the lane marking corresponding to the lane entrance based on the second position information of the electronic device and the number of pixels between the electronic device and the lane marking corresponding to the lane entrance.
7. The method of claim 1, wherein after identifying the type of the lane and the driving attribute on the electronic map, the method further comprises:
planning a navigation path based on the type of the lane and the driving attribute in the electronic map.
8. The method of claim 7, wherein planning the navigation path based on the type of the lane and the driving attribute in the electronic map comprises:
determining whether the current time is within a time interval corresponding to the driving attribute;
if the current time is within the time interval corresponding to the driving attribute, planning the navigation path without basing it on the lane;
or, if the current time is not within the time interval corresponding to the driving attribute, planning the navigation path based on the lane.
9. The method of claim 8, wherein if the current time is within the time interval corresponding to the driving attribute, the method further comprises at least one of:
sending first warning information in response to a vehicle traveling in the lane;
and sending second warning information in response to the vehicle entering the lane or exiting the lane.
10. An electronic map determination apparatus, comprising:
a first determination unit configured to determine a type of a lane and a driving attribute of the lane based on text information on the ground corresponding to the lane;
a second determination unit configured to determine the type of the lane based on a lane marking corresponding to the lane;
a response unit configured to determine position information of the lane based on the lane marking in response to the type of the lane determined based on the text information being the same as the type of the lane determined based on the lane marking; and
an identification unit configured to identify the type of the lane and the driving attribute on an electronic map based on the position information of the lane.
11. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-9.
12. A non-transitory computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-9.
13. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111454730.5A CN114216471A (en) | 2021-12-01 | 2021-12-01 | Electronic map determination method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114216471A true CN114216471A (en) | 2022-03-22 |
Family
ID=80699387
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111454730.5A Pending CN114216471A (en) | 2021-12-01 | 2021-12-01 | Electronic map determination method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114216471A (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009223817A (en) * | 2008-03-18 | 2009-10-01 | Zenrin Co Ltd | Method for generating road surface marked map |
US20200041284A1 (en) * | 2017-02-22 | 2020-02-06 | Wuhan Jimu Intelligent Technology Co., Ltd. | Map road marking and road quality collecting apparatus and method based on adas system |
CN111238498A (en) * | 2018-11-29 | 2020-06-05 | 沈阳美行科技有限公司 | Lane-level display road map generation method and device and related system |
CN111460861A (en) * | 2019-01-21 | 2020-07-28 | 阿里巴巴集团控股有限公司 | Road traffic sign identification method, device and identification equipment |
US20210312195A1 (en) * | 2020-12-16 | 2021-10-07 | Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. | Lane marking detecting method, apparatus, electronic device, storage medium, and vehicle |
CN112733793A (en) * | 2021-01-22 | 2021-04-30 | 北京嘀嘀无限科技发展有限公司 | Method and device for detecting bus lane, electronic equipment and storage medium |
CN113029187A (en) * | 2021-03-30 | 2021-06-25 | 武汉理工大学 | Lane-level navigation method and system fusing ADAS fine perception data |
Non-Patent Citations (2)
Title |
---|
ZHOU, Zhihong: "Lane-level road electronic map production for driverless driving" (面向无人驾驶的车道级道路电子地图制作), Surveying and Mapping and Spatial Geographic Information (测绘与空间地理信息), no. 02, 25 February 2018 (2018-02-25) *
XU, Xiujian: "Design and implementation of a video image monitoring system for bus-only lanes" (公交专用车道视频图像监控系统的设计与实现), 15 May 2015 (2015-05-15) *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114719878A (en) * | 2022-04-06 | 2022-07-08 | 北京百度网讯科技有限公司 | Vehicle navigation method and device, system, electronic equipment and computer medium |
CN114937253A (en) * | 2022-06-15 | 2022-08-23 | 北京百度网讯科技有限公司 | Vehicle type information processing method and device, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220215667A1 (en) | Method and apparatus for monitoring vehicle, cloud control platform and system for vehicle-road collaboration | |
CN111611955B (en) | Method, device, equipment and storage medium for identifying passable construction road | |
CN112634611B (en) | Method, device, equipment and storage medium for identifying road conditions | |
CN114216471A (en) | Electronic map determination method and device, electronic equipment and storage medium | |
CN112598192B (en) | Method and device for predicting vehicle entering logistics park, storage medium and terminal | |
CN112665606A (en) | Walking navigation method, device, equipment and storage medium | |
US20220237529A1 (en) | Method, electronic device and storage medium for determining status of trajectory point | |
CN112069279A (en) | Map data updating method, device, equipment and readable storage medium | |
US20230159052A1 (en) | Method for processing behavior data, method for controlling autonomous vehicle, and autonomous vehicle | |
CN115880928A (en) | Real-time updating method, device and equipment for automatic driving high-precision map and storage medium | |
CN115060249A (en) | Electronic map construction method, device, equipment and medium | |
CN115273477A (en) | Crossing driving suggestion pushing method, device and system and electronic equipment | |
CN112926630B (en) | Route planning method, route planning device, electronic equipment and computer readable medium | |
CN113420692A (en) | Method, apparatus, device, medium, and program product for generating direction recognition model | |
CN114677848A (en) | Perception early warning system, method, device and computer program product | |
CN114998863B (en) | Target road identification method, device, electronic equipment and storage medium | |
CN114724113B (en) | Road sign recognition method, automatic driving method, device and equipment | |
US20220390249A1 (en) | Method and apparatus for generating direction identifying model, device, medium, and program product | |
US20230126172A1 (en) | Method of outputting prompt information, device, medium, and vehicle | |
CN114582125B (en) | Method, device, equipment and storage medium for identifying road traffic direction | |
CN115114312A (en) | Map data updating method and device and electronic equipment | |
CN114179805A (en) | Driving direction determining method, device, equipment and storage medium | |
CN114218344A (en) | Map data updating method, apparatus, device, storage medium, and program product | |
CN114689061A (en) | Navigation route processing method and device of automatic driving equipment and electronic equipment | |
CN112861701A (en) | Illegal parking identification method and device, electronic equipment and computer readable medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||