CN109115242B - Navigation evaluation method, device, terminal, server and storage medium - Google Patents

Navigation evaluation method, device, terminal, server and storage medium

Info

Publication number
CN109115242B
Authority
CN
China
Prior art keywords
navigation
arrow
screenshot
image
guidance
Prior art date
Legal status
Active
Application number
CN201810985464.0A
Other languages
Chinese (zh)
Other versions
CN109115242A (en)
Inventor
贾康
袁辉
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201810985464.0A priority Critical patent/CN109115242B/en
Publication of CN109115242A publication Critical patent/CN109115242A/en
Application granted granted Critical
Publication of CN109115242B publication Critical patent/CN109115242B/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass

Abstract

The embodiments of the invention disclose a navigation evaluation method, device, terminal, server and storage medium. The method comprises the following steps: performing a navigation test on the terminal based on a plurality of different pre-constructed navigation tracks; during the navigation test, taking a screenshot of the navigation interface when an enlarged intersection image is generated; and sending the screenshot to a server, so that the server uses an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction. With this technical solution, the embodiments of the invention test with navigation tracks, generate and capture enlarged intersection images, and use an image recognition algorithm to evaluate their accuracy, which improves the efficiency of evaluating enlarged intersection images and enables an overall evaluation of how well they are produced.

Description

Navigation evaluation method, device, terminal, server and storage medium
Technical Field
The embodiments of the invention relate to the field of computer technology, and in particular to a navigation evaluation method, device, terminal, server and storage medium.
Background
With the development of technology, people increasingly use navigation software when travelling. During navigation, when the user drives toward an intersection, the navigation software usually generates an enlarged intersection image to guide the user, because voice prompts are long and have difficulty expressing the driving direction accurately.
The enlarged intersection image usually includes the lane information of the intersection being driven and a turning arrow pointing in the direction of travel to indicate the correct way forward. Its accuracy directly affects the navigation effect, so it is necessary to evaluate the production quality and guidance effect of enlarged intersection images across a large number of intersections of different complexity and in different regions.
However, most existing evaluation methods rely on human inspection to check the correctness of the vector enlarged view of an intersection. This is inefficient, collecting a sample set takes a long time, the amount of data is small, and an overall evaluation of how well enlarged intersection images are produced is difficult.
Disclosure of Invention
The embodiments of the invention provide a navigation evaluation method, device, terminal, server and storage medium, which improve the efficiency of evaluating enlarged intersection images and enable an overall evaluation of how well they are produced.
In a first aspect, an embodiment of the present invention provides a navigation evaluation method, which is applied to a terminal, and the method includes:
performing a navigation test on the terminal based on a plurality of different pre-constructed navigation tracks;
during the navigation test, taking a screenshot of the navigation interface when an enlarged intersection image is generated;
and sending the screenshot to a server, so that the server uses an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
In a second aspect, an embodiment of the present invention further provides a navigation evaluation method, applied to a server, where the method includes:
acquiring the screenshot, sent by a terminal, of the navigation interface corresponding to the moment when the enlarged intersection image was generated;
and using an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluating the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
In a third aspect, an embodiment of the present invention further provides a navigation evaluation apparatus configured in a terminal, where the navigation evaluation apparatus includes:
the test module is used for performing a navigation test on the terminal based on a plurality of different pre-constructed navigation tracks;
the screenshot module is used for taking a screenshot of the navigation interface when an enlarged intersection image is generated during the navigation test;
and the sending module is used for sending the screenshot to a server, so that the server uses an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
In a fourth aspect, an embodiment of the present invention provides a navigation evaluation apparatus configured in a server, where the apparatus includes:
the acquisition module is used for acquiring the screenshot, sent by the terminal, of the navigation interface corresponding to the moment when the enlarged intersection image was generated;
and the evaluation module is used for identifying, with an image recognition algorithm, the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and for evaluating the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
In a fifth aspect, the present invention further provides a terminal, including:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement any of the navigation evaluation methods applied to the terminal in the embodiments of the present invention.
In a sixth aspect, the present invention further provides a server, including:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the navigation evaluation method applied to the server in any of the embodiments of the present invention.
In a seventh aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the navigation evaluation method applied to the terminal in any one of the embodiments of the present invention.
In an eighth aspect, the embodiment of the present invention further provides another computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the navigation evaluation method applied to a server in any one of the embodiments of the present invention.
In the embodiments of the invention, the terminal performs a navigation test based on a plurality of different pre-constructed navigation tracks, takes a screenshot of the navigation interface when an enlarged intersection image is generated during the test, and sends the screenshot to the server, so that the server uses an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction. With this technical solution, the embodiments of the invention use an image recognition algorithm to evaluate the accuracy of enlarged intersection images while testing on navigation tracks, which improves the efficiency of evaluating enlarged intersection images and enables an overall evaluation of how well they are produced.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings used in the description of the embodiments of the present invention will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the contents of the embodiments of the present invention and the drawings without creative efforts.
Fig. 1 is a schematic flow chart of a navigation evaluation method according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a navigation evaluation method according to a second embodiment of the present invention;
fig. 3 is a schematic flow chart of a navigation evaluation method according to a third embodiment of the present invention;
fig. 4 is a schematic flow chart of a navigation evaluation method according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of a navigation evaluation apparatus according to a fifth embodiment of the present invention;
fig. 6 is a schematic structural diagram of a navigation evaluation apparatus according to a sixth embodiment of the present invention;
fig. 7 is a schematic structural diagram of a terminal according to a seventh embodiment of the present invention.
Detailed Description
The technical scheme of the invention is further explained by the specific implementation mode in combination with the attached drawings. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a navigation evaluation method according to an embodiment of the present invention, which is applicable to a case of evaluating a navigation application, especially a case of evaluating the correctness of an enlarged intersection image generated in the navigation application, and the method can be executed by a navigation evaluation device, which can be implemented by software and/or hardware, and which can be integrated in any terminal having a network communication function and a display function, such as a typical user terminal device, for example, a mobile phone, a tablet computer, or a vehicle-mounted navigator. Referring to fig. 1, the method of the present embodiment specifically includes:
and S110, performing navigation test on the terminal based on a plurality of different pre-constructed navigation tracks.
The terminal receives a plurality of different navigation tracks issued by the server and uses them to run navigation tests. In this embodiment, the navigation test may be a software simulation test: during the test, the whole on-road navigation process of the navigation application while the vehicle is driving is simulated based on the navigation track.
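As an illustrative sketch only (not part of the original disclosure), the terminal-side loop over the tracks might look as follows in Python; `nav_app`, `load_track()` and `run_simulated_drive()` are hypothetical hooks into the navigation application's simulation mode, not interfaces named by the patent:

```python
# Minimal sketch of the terminal-side test loop, assuming the navigation application
# under test exposes a software-simulation interface; the hooks below are hypothetical.
from typing import Iterable

def run_navigation_tests(nav_app, track_sample_set: Iterable[dict]) -> None:
    """Simulate the whole on-road navigation process once per pre-constructed track."""
    for track in track_sample_set:
        nav_app.load_track(track)        # feed one navigation track (path + road info)
        nav_app.run_simulated_drive()    # simulate driving the full route
```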
And S120, during the navigation test, taking a screenshot of the navigation interface when an enlarged intersection image is generated.
Usually, when the vehicle drives toward an intersection, the navigation system generates an enlarged intersection image and displays the navigation information on the navigation interface in picture form. The enlarged intersection image generally shows the lane information at the intersection and a turning arrow, whose direction guides the vehicle forward. In addition, a guidance panel is also displayed on the navigation interface, usually showing the distance from the current position to the intersection and a steering guidance indicator. Illustratively, the steering guidance indicator is an arrow representing a steering direction, and can indicate a left turn, a right turn, going straight, a U-turn, and the like.
In this embodiment, the navigation test simulates the navigation process of the real navigation application, so an enlarged intersection image is likewise generated on the navigation interface whenever one needs to be generated. At that moment a screenshot of the navigation interface can be taken, so that whether the enlarged intersection image is accurate can be evaluated later from the screenshot.
S130, sending the screenshot to a server, so that the server uses an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
Specifically, the captured screenshot shows the enlarged intersection image for the current intersection and the guidance panel. For example, if the current driving position is 154 meters from the intersection and the turn entering it is a left turn, the guidance panel displays "turn left after 154 meters" together with a left-turn arrow as the steering guidance indicator; correspondingly, the enlarged intersection image displays the lane information of that intersection and a left-turning arrow on the corresponding lane.
Since the accuracy of the generated enlarged intersection image is critical to driving correctly and safely, the guidance panel can be used to evaluate it: the accuracy of the enlarged intersection image in the navigation application is judged by whether the guidance direction of the turning arrow in the enlarged intersection image is consistent with the arrow direction of the steering guidance indicator on the guidance panel.
For example, the terminal and the server may be connected by wired or wireless communication; more specifically, the wireless connection may be established based on Wireless Fidelity (Wi-Fi) technology. After the terminal obtains the screenshot of the navigation interface, it automatically sends the screenshot to the server through the pre-established communication channel.
Illustratively, the enlarged intersection image may be a vector enlarged view, a pattern view, a live-action view or a street view. The vector enlarged view is an enlarged view of the intersection drawn with line segments and curves according to the intersection's geometric characteristics; the lines abstractly represent the road information, and a turning arrow indicates the forward direction. The pattern view is an enlarged intersection image obtained through three-dimensional modeling that indicates the forward direction with a turning arrow; unlike the vector enlarged view, it can represent the intersection information in a more three-dimensional manner. The live-action view and the street view indicate the forward direction with a turning arrow overlaid on a real intersection image, which is usually collected on site by a collection vehicle and represents the intersection information more realistically.
In the technical solution of this embodiment, the terminal performs a navigation test based on a plurality of different pre-constructed navigation tracks; during the test, when an enlarged intersection image is generated, a screenshot of the navigation interface is taken and sent to the server, so that the server uses an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction. With this technical solution, the embodiment of the invention uses an image recognition algorithm to evaluate the accuracy of enlarged intersection images while testing on navigation tracks, which improves the efficiency of evaluating enlarged intersection images and enables an overall evaluation of how well they are produced.
Example two
Fig. 2 is a flowchart of a navigation evaluation method according to a second embodiment of the present invention, which is further optimized based on the above-mentioned embodiments. As shown in fig. 2, the method may include:
and S210, performing navigation test on the terminal based on a plurality of different pre-constructed navigation tracks.
And S220, monitoring an enlarged intersection image generation instruction in the navigation test process.
And S230, responding to the enlarged intersection image to generate an instruction, and counting down according to preset time length.
And S240, when the countdown is finished, screenshot is carried out on the current navigation interface.
The navigation test simulates the whole real on-road navigation process, so when the navigation application on the terminal judges that the vehicle has reached a position where an enlarged intersection image needs to be generated, it automatically issues an enlarged intersection image generation instruction and generates the enlarged intersection image. In this embodiment, the terminal monitors for this instruction in order to capture the navigation interface. Specifically, when the generation instruction is detected, a countdown of a preset duration is started, and when the countdown ends, a screenshot of the current navigation interface is taken.
Generally, after the navigation application issues an enlarged intersection image generation instruction, it performs some processing, then generates the enlarged intersection image and displays it on the navigation interface; after a further period, that is, after the vehicle has passed the current position, the enlarged intersection image is removed from display. Therefore, to capture the enlarged intersection image, the screenshot must be taken within the window between when the enlarged intersection image appears on the navigation interface and when it disappears. The preset duration is thus set in advance according to the display time of the enlarged intersection image; in other words, the screenshot is taken a preset duration after the generation instruction is detected. For example, if the enlarged intersection image is generated and displayed on the navigation interface 0.5 seconds after the navigation application issues the generation instruction, and is removed from display 5 seconds later, the preset duration can be set to, say, 1 second: when the generation instruction is detected, a 1-second countdown is started, after which the screenshot contains both the enlarged intersection image and the guidance panel on the navigation interface.
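A minimal sketch of this monitor, countdown and screenshot flow is shown below (illustrative only); the two hooks on `nav_app` are hypothetical stand-ins for whatever interface the navigation application actually exposes, and the 1-second delay simply mirrors the example timing above:

```python
# Sketch of the countdown-then-capture step; the nav_app hooks are hypothetical.
import time

PRESET_DELAY_S = 1.0  # preset duration, chosen to fall inside the image's display window

def capture_enlarged_intersection(nav_app) -> bytes:
    """Wait for an enlarged-intersection-image generation instruction, count down the
    preset duration, then screenshot the current navigation interface."""
    nav_app.wait_for_enlarged_image_instruction()   # hypothetical monitoring hook
    time.sleep(PRESET_DELAY_S)                      # countdown by the preset duration
    return nav_app.capture_navigation_interface()   # hypothetical screenshot hook, PNG bytes
```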
And S250, sending the screenshot to a server, so that the server uses an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
Illustratively, the enlarged intersection image may be a vector enlarged view, a pattern view, a live-action view or a street view.
Further, as a preference, the method may further include:
and correspondingly sending the road data on the map base map corresponding to the screenshot and the timestamp to the server for storage.
It should be noted that the road data on the map base map corresponding to the screenshot, and the timestamp, are sent to the server for storage so that, if the enlarged intersection image on some screenshot is later found to be incorrect during evaluation, the corresponding road data can be located via the timestamp and the enlarged intersection image corrected. Specifically, the road data and the timestamp may be sent together with the screenshot, or they may be sent to the server in a batch after the screenshots are sent or after the navigation test is completed.
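For illustration only, a terminal-side upload could look like the sketch below; the endpoint URL and field names are assumptions, since the text only requires that the server can later pair road data with a screenshot by timestamp:

```python
# Sketch of uploading a screenshot with its base-map road data and capture timestamp.
import json
import time
import requests

SERVER_URL = "http://evaluation-server.example/api/screenshots"  # hypothetical endpoint

def upload_capture(png_bytes: bytes, road_data: dict, timestamp: float | None = None) -> None:
    """Send one navigation-interface screenshot plus its road data and timestamp."""
    payload = {
        "timestamp": str(timestamp if timestamp is not None else time.time()),
        "road_data": json.dumps(road_data),
    }
    resp = requests.post(
        SERVER_URL,
        files={"screenshot": ("nav.png", png_bytes, "image/png")},
        data=payload,
        timeout=10,
    )
    resp.raise_for_status()
```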
In the technical solution of this embodiment, the terminal performs a navigation test based on a plurality of different pre-constructed navigation tracks; during the test, it monitors for an enlarged intersection image generation instruction, counts down for a preset duration in response to that instruction, and takes a screenshot of the current navigation interface when the countdown ends, so that the enlarged intersection image is captured in the screenshot. The screenshot is sent to the server, so that the server uses an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction. This improves the efficiency of evaluating enlarged intersection images and enables an overall evaluation of how well they are produced.
EXAMPLE III
Fig. 3 is a schematic flow chart of a navigation evaluation method according to a third embodiment of the present invention, which is applicable to a case of evaluating a navigation application, especially a case of evaluating the correctness of an enlarged intersection view generated in the navigation application, where the method may be implemented by a navigation evaluation device, the navigation evaluation device may be implemented by software and/or hardware, and the navigation evaluation device may be integrated in a server. Referring to fig. 3, the method of the present embodiment specifically includes:
and S310, acquiring a screenshot of a corresponding navigation interface when the intersection enlarged image is generated, which is sent by the terminal.
The screenshot comprises an enlarged intersection image generated by the navigation application and an induction panel on a navigation interface.
S320, identifying the direction of the guidance arrow in the enlarged intersection image on the screenshot and the direction of the arrow of the steering guidance target on the guidance panel by using an image identification algorithm, and evaluating the enlarged intersection image according to the consistency of the guidance direction and the direction of the arrow.
Any image recognition algorithm in the prior art can be used here. Illustratively, the above operation may include an image extraction process, an image recognition process, and a similarity evaluation process.
The image extraction process extracts the turning arrow in the enlarged intersection image and the arrow of the steering guidance indicator on the guidance panel; the extraction may be based, for example, on contours, regions, or colors.
For example, the turning arrow of the enlarged intersection image and the arrow of the steering guidance indicator on the guidance panel may be extracted based on regions. Specifically, a region of fixed size at a fixed position in the screenshot of the navigation interface is selected so that it contains the arrow of the steering guidance indicator on the guidance panel; this serves as the region of the recognized object. A region of fixed size centered on the center of the enlarged intersection image is selected so that it contains the turning arrow in the enlarged intersection image; this serves as the region to be recognized. For example, the guidance panel is located at the top of the navigation interface and the arrow of the steering guidance indicator is at the upper-left corner of the panel, so a fixed-size region at that fixed position is selected to contain the indicator's arrow. The enlarged intersection image is located in the lower part of the navigation interface, and the turning arrow occupies a relatively large area within it and stays away from its edges, so a region centered on the center of the enlarged intersection image and expanded outward until it covers 80% of the image's area is selected.
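As a minimal sketch of this region-based cropping (assuming Pillow; the panel coordinates and the 80% area ratio placement are illustrative values, not values specified by the patent):

```python
# Region-based extraction sketch using Pillow; coordinates depend on the actual layout.
from PIL import Image

def first_target_region(screenshot: Image.Image) -> Image.Image:
    """Fixed-size crop at a fixed position covering the steering guidance indicator at
    the upper-left corner of the guidance panel (the panel sits at the top)."""
    return screenshot.crop((20, 20, 120, 120))  # (left, upper, right, lower), assumed layout

def second_target_region(enlarged_image: Image.Image, area_ratio: float = 0.8) -> Image.Image:
    """Crop centred on the enlarged intersection image, expanded outward until it covers
    `area_ratio` of the image, so that the turning arrow is fully contained."""
    w, h = enlarged_image.size
    side = area_ratio ** 0.5                       # per-side scale that yields the target area
    cw, ch = int(w * side), int(h * side)
    left, upper = (w - cw) // 2, (h - ch) // 2
    return enlarged_image.crop((left, upper, left + cw, upper + ch))
```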
Illustratively, after the region containing the turning arrow of the enlarged intersection image and the region containing the arrow of the steering guidance indicator have been extracted, the guidance direction of the turning arrow and the arrow direction of the steering guidance indicator are further extracted. In this embodiment this may be done based on color: the color of the turning arrow in the enlarged intersection image and the color of the indicator's arrow can be preset in the navigation application; the preset color need not be a single fixed color, but the arrows should be clearly distinguishable from the background color. For example, since the background color of the guidance panel is usually black, the arrow of the steering guidance indicator may be set to white; considering that the enlarged intersection image may be a vector enlarged view, a pattern view, a live-action view or a street view, in which the lanes are usually black, the surrounding trees are green and the sky is blue, the turning arrow may be set to yellow. The extraction process itself extracts characteristic information such as the type and direction of the turning arrow in the enlarged intersection image and of the steering guidance indicator on the guidance panel.
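A small colour-thresholding sketch is given below (illustrative only); it assumes RGB uint8 input, and the white and yellow thresholds follow the example colours above rather than any values from the patent:

```python
# Colour-based extraction sketch with NumPy; thresholds are illustrative assumptions.
import numpy as np

def white_arrow_mask(region_rgb: np.ndarray) -> np.ndarray:
    """Boolean mask of near-white pixels (steering guidance indicator on the dark panel)."""
    return np.all(region_rgb > 200, axis=-1)

def yellow_arrow_mask(region_rgb: np.ndarray) -> np.ndarray:
    """Boolean mask of yellow pixels (turning arrow in the enlarged intersection image):
    high red and green channels, low blue channel."""
    r, g, b = region_rgb[..., 0], region_rgb[..., 1], region_rgb[..., 2]
    return (r > 180) & (g > 150) & (b < 100)
```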
The similarity evaluation process judges whether the extracted guidance direction of the turning arrow in the enlarged intersection image is consistent with the arrow direction of the steering guidance indicator on the guidance panel. If they are inconsistent, the enlarged intersection image is considered problematic; this provides a preliminary evaluation and screening, and the problematic enlarged intersection images are then confirmed manually.
And S330, receiving and storing the road data on the map base map and the timestamp, sent by the terminal, that correspond to the screenshot.
In the process of evaluating the enlarged intersection images, if the enlarged intersection image on a certain screenshot is found to be incorrect, the corresponding road data can be located via the timestamp and the enlarged intersection image corrected.
In the technical solution of this embodiment, the server acquires the screenshot, sent by the terminal, of the navigation interface corresponding to the moment when the enlarged intersection image was generated, uses an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction. With this technical solution, the embodiment of the invention uses an image recognition algorithm to evaluate the accuracy of enlarged intersection images while testing on navigation tracks, which improves the efficiency of evaluating enlarged intersection images and enables an overall evaluation of how well they are produced.
Example four
Fig. 4 is a flowchart of a navigation evaluation method according to a fourth embodiment of the present invention, and the present embodiment is further optimized based on the foregoing embodiments. As shown in fig. 4, the navigation evaluation method provided by this embodiment includes:
and S410, extracting road data of different urban areas and different road structures.
Generally, intersections with different road structures correspond to different enlarged intersection images. In this embodiment, road data of different urban areas and different road structures are therefore extracted to provide more varied sample data for the navigation test, so that enlarged intersection images for more types of road structures can be evaluated.
S420, constructing a navigation track sample set from the road data, wherein the navigation track sample set comprises a plurality of different navigation tracks.
A navigation track includes navigation path information, road information and the like, and the navigation track sample set is a collection of samples, each comprising navigation path information and the corresponding road information.
And S430, issuing the navigation track sample set to the terminal so that the terminal can perform a navigation test based on a plurality of different navigation tracks in the navigation track sample set.
The server and the terminal can be connected by wired or wireless communication, and the server sends the constructed navigation track sample set to the terminal over that connection.
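Purely as an illustration of what one such sample might contain, the sketch below shows a possible shape; the field names are assumptions, since the text only says a track bundles navigation path information with the corresponding road information for intersections of different regions and road structures:

```python
# Illustrative shape of one navigation-track sample; field names are assumptions.
track_sample = {
    "track_id": "beijing_0001",
    "city_area": "Beijing",
    "road_structure": "five-way intersection",
    "path": [  # ordered trace points the simulated drive follows
        {"lng": 116.4074, "lat": 39.9042},
        {"lng": 116.4102, "lat": 39.9060},
    ],
    "road_info": {"lanes": 4, "expected_turn": "left"},
}
navigation_track_sample_set = [track_sample]  # many different tracks in practice
```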
And S440, acquiring the screenshot, sent by the terminal, of the navigation interface corresponding to the moment when the enlarged intersection image was generated.
S450, identifying, from the screenshot, a first target region on the guidance panel, wherein the first target region contains the steering guidance indicator.
The first target region is the region of the screenshot where the steering guidance indicator on the guidance panel is located; it should contain the indicator so that the complete steering guidance indicator is extracted.
Specifically, a region of fixed size at a fixed position in the screenshot of the navigation interface is selected so that it contains the arrow of the steering guidance indicator on the guidance panel, and this region is used as the first target region. For example, the guidance panel is located at the top of the navigation interface and the arrow of the steering guidance indicator is at the upper-left corner of the panel, so a fixed-size region at that position is selected as the first target region so that it contains the indicator's arrow.
And S460, identifying, from the screenshot, a second target region on the enlarged intersection image, wherein the second target region is centered on the center of the enlarged intersection image, expands outward until it covers a preset proportion of the enlarged intersection image, and contains the turning arrow.
The second target region is the region of the screenshot where the turning arrow in the enlarged intersection image is located; it should contain the turning arrow so that the complete turning arrow is extracted.
Specifically, a region of fixed size centered on the center of the enlarged intersection image is selected so that it contains the turning arrow, and this region is used as the second target region. For example, the enlarged intersection image is located in the lower part of the navigation interface, and the turning arrow occupies a relatively large area within it and stays away from its edges, so a region centered on the center of the enlarged intersection image and expanded outward until it covers 80% of the image's area is selected as the second target region, so that it contains the turning arrow.
And S470, using an image recognition algorithm to identify the arrow direction of the steering guidance indicator in the first target region and the guidance direction of the turning arrow in the second target region, respectively.
For example, the image recognition algorithm may extract the steering guidance indicator and the turning arrow based on color.
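One illustrative way to turn an extracted arrow mask into a direction label is to compare it with reference masks for each supported direction and keep the best overlap; this is a sketch under stated assumptions (the reference masks and grid size are not specified by the patent):

```python
# Direction classification sketch: resize the binary arrow mask and pick the reference
# direction with the highest intersection-over-union. References are assumed inputs.
import numpy as np
from PIL import Image

def _normalise(mask: np.ndarray, size: int = 32) -> np.ndarray:
    img = Image.fromarray(mask.astype(np.uint8) * 255).resize((size, size))
    return np.asarray(img) > 127

def classify_arrow_direction(mask: np.ndarray, references: dict) -> str:
    """Return the label ('left', 'right', 'straight', ...) of the reference arrow mask
    that overlaps the input mask best, measured by intersection-over-union."""
    m = _normalise(mask)
    def iou(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.logical_and(a, b).sum()) / max(float(np.logical_or(a, b).sum()), 1.0)
    return max(references, key=lambda label: iou(m, _normalise(references[label])))
```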
And S480, performing similarity recognition between the guidance direction and the arrow direction, and evaluating the correctness of the enlarged intersection image according to the similarity recognition result.
For example, before the similarity recognition, the type and direction of the turning arrow in the enlarged intersection image and of the steering guidance indicator's arrow are first classified, and the picture information is converted into a set of feature expressions, such as feature vectors. Once the features of the guidance direction and the arrow direction are obtained, the features of the turning arrow and of the indicator's arrow are fed into a similarity calculation model, which outputs a similarity value. The similarity value is compared with a preset similarity threshold; if it is greater than the threshold, the guidance direction of the turning arrow in the enlarged intersection image is considered consistent with the arrow direction of the steering guidance indicator, and the turning arrow's direction in the enlarged intersection image is correct.
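A minimal sketch of such a similarity check is shown below; reducing each recognised heading to a 2-D direction vector and using cosine similarity with a 0.9 threshold is an assumption for illustration, not the patent's similarity model:

```python
# Similarity-evaluation sketch: compare direction feature vectors by cosine similarity.
import numpy as np

DIRECTIONS = {
    "left": (-1.0, 0.0),
    "right": (1.0, 0.0),
    "straight": (0.0, 1.0),
    "u_turn": (0.0, -1.0),
}
SIMILARITY_THRESHOLD = 0.9  # preset similarity threshold; illustrative value

def is_enlarged_image_correct(turning_arrow_dir: str, panel_indicator_dir: str) -> bool:
    """True when the turning arrow's guidance direction and the panel indicator's arrow
    direction agree closely enough (cosine similarity above the preset threshold)."""
    a = np.asarray(DIRECTIONS[turning_arrow_dir])
    b = np.asarray(DIRECTIONS[panel_indicator_dir])
    similarity = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return similarity > SIMILARITY_THRESHOLD
```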
And S490, receiving and storing the road data and the time stamp on the map base map corresponding to the screenshot sent by the terminal.
In the technical solution of this embodiment, the server extracts road data to construct a navigation track sample set and issues it to the terminal, providing a data basis for the terminal's navigation test; it then acquires the screenshot, sent by the terminal, of the navigation interface corresponding to the moment when the enlarged intersection image was generated, uses an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction, which improves the efficiency of evaluating enlarged intersection images and enables an overall evaluation of how well they are produced.
EXAMPLE five
Fig. 5 is a schematic structural diagram of a navigation evaluation device according to a fifth embodiment of the present invention, which is applicable to a situation of evaluating a navigation application, especially a situation of evaluating the correctness of an enlarged intersection image generated in the navigation application, where the navigation evaluation device may be implemented by software and/or hardware, and the navigation evaluation device may be integrated in any terminal having a network communication function and a display function, such as a typical user terminal device, for example, a mobile phone, a tablet computer, or a car navigator. Referring to fig. 5, the apparatus specifically includes:
the test module 501 is configured to perform a navigation test on a terminal based on a plurality of different pre-constructed navigation tracks;
The screenshot module 502 is configured to take a screenshot of the navigation interface when an enlarged intersection image is generated during the navigation test;
The sending module 503 is configured to send the screenshot to a server, so that the server uses an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
Optionally, the screenshot module 502 includes:
the monitoring unit is used for monitoring for an enlarged intersection image generation instruction during the navigation test;
the timing unit is used for counting down for a preset duration in response to the enlarged intersection image generation instruction;
and the execution unit is used for taking a screenshot of the current navigation interface when the countdown ends.
Optionally, the navigation evaluation apparatus further includes:
and the second sending module is used for correspondingly sending the road data on the map base map corresponding to the screenshot and the timestamp to the server for storage.
Optionally, the enlarged intersection image comprises a vector enlarged view, a pattern view, a live-action view or a street view.
The navigation evaluation device of the fifth embodiment of the invention solves the problems of high time cost and low accuracy caused by evaluating enlarged intersection images by human inspection, and allows enlarged intersection images to be evaluated more effectively, quickly and comprehensively.
The navigation evaluation device configured in the terminal provided by the embodiment of the invention can execute the navigation evaluation method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
EXAMPLE six
Fig. 6 is a schematic structural diagram of a navigation evaluation device according to a sixth embodiment of the present invention, where this embodiment is applicable to a case of evaluating a navigation application, especially a case of evaluating correctness of an enlarged intersection view generated in the navigation application, and the navigation evaluation device may be implemented by software and/or hardware, and may be integrated in a server. Referring to fig. 6, the apparatus specifically includes:
An obtaining module 601, configured to obtain the screenshot, sent by the terminal, of the navigation interface corresponding to the moment when the enlarged intersection image was generated;
An evaluation module 602, configured to identify, with an image recognition algorithm, the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and to evaluate the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
Optionally, the evaluation module 602 includes:
the first identification unit is used for identifying, from the screenshot, a first target region on the guidance panel, wherein the first target region contains the steering guidance indicator;
the second identification unit is used for identifying, from the screenshot, a second target region on the enlarged intersection image, wherein the second target region is centered on the center of the enlarged intersection image, expands outward until it covers a preset proportion of the enlarged intersection image, and contains the turning arrow;
the image recognition unit is used for identifying, with an image recognition algorithm, the arrow direction of the steering guidance indicator in the first target region and the guidance direction of the turning arrow in the second target region, respectively;
and the similarity evaluation unit is used for performing similarity recognition between the guidance direction and the arrow direction, and for evaluating the correctness of the enlarged intersection image according to the similarity recognition result.
Optionally, the navigation evaluation device further includes:
the data extraction module is used for extracting road data of different urban areas and different road structures;
the track construction module is used for constructing a navigation track sample set from the road data, wherein the navigation track sample set comprises a plurality of different navigation tracks;
and the communication module is used for issuing the navigation track sample set to the terminal so that the terminal can carry out navigation test based on a plurality of different navigation tracks in the navigation track sample set.
Optionally, the navigation evaluation device further includes:
and the receiving module is used for receiving and storing the road data and the time stamp on the map base map, which are sent by the terminal and correspond to the screenshot time.
The navigation evaluation device of the sixth embodiment of the invention solves the problems of high time cost and low accuracy caused by evaluating enlarged intersection images by human inspection, and allows enlarged intersection images to be evaluated more effectively, quickly and comprehensively.
The navigation evaluation device configured in the server provided by the embodiment of the invention can execute the navigation evaluation method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
EXAMPLE seven
Referring to fig. 7, the present embodiment provides a terminal 700, which includes: one or more processors 720; the storage device 710 is configured to store one or more programs, and when the one or more programs are executed by the one or more processors 720, the one or more processors 720 implement the navigation evaluation method applied to the terminal according to the embodiment of the present invention, including:
performing a navigation test on the terminal based on a plurality of different pre-constructed navigation tracks;
during the navigation test, taking a screenshot of the navigation interface when an enlarged intersection image is generated;
and sending the screenshot to a server, so that the server uses an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
Of course, it can be understood by those skilled in the art that the processor 720 can also implement the technical solution of the navigation evaluation method applied to the terminal provided by any embodiment of the present invention.
The electronic device 700 shown in fig. 7 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 7, electronic device 700 is embodied in the form of a general purpose computing device. The components of the electronic device 700 may include, but are not limited to: one or more processors 720, a memory device 710, and a bus 750 that couples the various system components (including the memory device 710 and the processors 720).
Bus 750 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Electronic device 700 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 700 and includes both volatile and nonvolatile media, removable and non-removable media.
The storage 710 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 711 and/or cache memory 712. The electronic device 700 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 713 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 7, commonly referred to as a "hard drive"). Although not shown in FIG. 7, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In such cases, each drive may be connected to bus 750 by one or more data media interfaces. Storage 710 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 714 having a set (at least one) of program modules 715 may be stored, for instance, in storage 710, such program modules 715 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination may comprise an implementation of a network environment. The program modules 715 generally perform the functions and/or methodologies of any of the embodiments described herein.
The electronic device 700 may also communicate with one or more external devices 760 (e.g., keyboard, pointing device, display 770, etc.), with one or more devices that enable a user to interact with the electronic device 700, and/or with any devices (e.g., network card, modem, etc.) that enable the electronic device 700 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interface 730. Also, the electronic device 700 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 740. As shown in FIG. 7, the network adapter 740 communicates with the other modules of the electronic device 700 via the bus 750. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 700, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processor 720 executes various functional applications and data processing by running the programs stored in the storage device 710, for example, implementing the navigation evaluation method applied to the terminal provided by the embodiments of the present invention.
An embodiment of the present invention further provides a server, including: one or more processors; the storage device is used for storing one or more programs, and when the one or more programs are executed by the one or more processors, the one or more processors implement the navigation evaluation method applied to the server, which is provided by the embodiment of the invention, and the navigation evaluation method applied to the server comprises the following steps:
acquiring the screenshot, sent by a terminal, of the navigation interface corresponding to the moment when the enlarged intersection image was generated;
and using an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluating the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
Of course, those skilled in the art will understand that the processor may also implement the technical solution of the navigation evaluation method applied to the server provided by any embodiment of the present invention. The hardware structure and the functions of the electronic device can be explained with reference to the seventh embodiment.
Example eight
The present embodiments provide a storage medium containing computer-executable instructions which, when executed by a computer processor, perform a navigation evaluation method applied to a terminal, the method including:
performing a navigation test on the terminal based on a plurality of different pre-constructed navigation tracks;
during the navigation test, taking a screenshot of the navigation interface when an enlarged intersection image is generated;
and sending the screenshot to a server, so that the server uses an image recognition algorithm to identify the guidance direction of the turning arrow in the enlarged intersection image on the screenshot and the arrow direction of the steering guidance indicator on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
Of course, the storage medium containing the computer-executable instructions provided by the embodiments of the present invention is not limited to the method operations described above, and may also perform related operations in the navigation evaluation method applied to the terminal provided by any embodiments of the present invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
Yet another computer-readable storage medium is provided in an embodiment of the present invention, wherein the computer-executable instructions, when executed by a computer processor, perform a navigation evaluation method applied to a server, the method including:
acquiring, from a terminal, a screenshot of the navigation interface captured when an enlarged intersection image is generated;
and identifying, by using an image recognition algorithm, the guidance direction of the turn arrow in the enlarged intersection image on the screenshot and the arrow direction of the turn guidance sign on the guidance panel, and evaluating the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
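For illustration, a simplified server-side consistency check might look as follows, assuming the two regions of interest have already been located on the screenshot and that per-direction arrow templates are available; the template files, crop coordinates, and the use of OpenCV template matching are assumptions of this sketch, not the claimed recognition algorithm:

```python
# Simplified sketch: classify the arrow in each region by template matching
# and treat agreement of the two labels as consistency.
import cv2

DIRECTIONS = ["left", "right", "straight", "u_turn"]  # example label set

def classify_arrow(gray_roi, templates):
    """Return the direction label whose template best matches the region."""
    scores = {}
    for label in DIRECTIONS:
        tmpl = templates[label]
        if gray_roi.shape[0] < tmpl.shape[0] or gray_roi.shape[1] < tmpl.shape[1]:
            continue  # skip templates larger than the region
        result = cv2.matchTemplate(gray_roi, tmpl, cv2.TM_CCOEFF_NORMED)
        scores[label] = float(result.max())
    return max(scores, key=scores.get)

def evaluate(screenshot_path, panel_box, enlarged_box, templates):
    """Compare the arrow direction on the guidance panel with the guidance
    direction of the turn arrow in the enlarged intersection image."""
    img = cv2.imread(screenshot_path, cv2.IMREAD_GRAYSCALE)
    x, y, w, h = panel_box
    panel_dir = classify_arrow(img[y:y + h, x:x + w], templates)    # guidance-panel sign
    x, y, w, h = enlarged_box
    arrow_dir = classify_arrow(img[y:y + h, x:x + w], templates)    # enlarged-view arrow
    return {"panel": panel_dir, "enlarged": arrow_dir,
            "consistent": panel_dir == arrow_dir}
```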
Of course, the storage medium containing the computer-executable instructions provided by the embodiments of the present invention is not limited to the method operations described above, and may also perform related operations in the navigation evaluation method applied to the server provided by any embodiment of the present invention. For a description of the storage medium, reference may be made to the explanation in embodiment eight.
It should be noted that the foregoing is merely illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements, and substitutions may be made without departing from the scope of the invention. Therefore, although the present invention has been described in some detail through the above embodiments, the present invention is not limited to the above embodiments and may include other equivalent embodiments without departing from the spirit of the present invention, the scope of the present invention being determined by the scope of the appended claims.

Claims (14)

1. A navigation evaluation method applied to a terminal, characterized in that the method comprises:
performing a navigation test on the terminal based on a plurality of different pre-constructed navigation tracks;
during the navigation test, capturing a screenshot of the navigation interface when an enlarged intersection image is generated;
and sending the screenshot to a server, so that the server identifies, by using an image recognition algorithm, the guidance direction of the turn arrow in the enlarged intersection image on the screenshot and the arrow direction of the turn guidance sign on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
2. The method of claim 1, wherein capturing the screenshot of the navigation interface when the enlarged intersection image is generated during the navigation test comprises:
monitoring for an instruction to generate an enlarged intersection image during the navigation test;
in response to the instruction to generate the enlarged intersection image, starting a countdown of a preset duration;
and when the countdown ends, capturing a screenshot of the current navigation interface.
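A minimal sketch of this delayed capture, assuming a hypothetical harness object exposing a screenshot() method; the two-second delay stands in for the preset duration and is an example value only:

```python
# Sketch of the countdown-then-capture step; harness API and delay are assumptions.
import threading

PRESET_DELAY_S = 2.0  # example "preset duration", not taken from the disclosure

def on_enlarged_view_instruction(driver, upload):
    """When the enlarged-intersection-image generation instruction is observed,
    wait for the preset countdown so the view has fully rendered, then capture
    the current navigation interface and hand it to the upload callback."""
    def capture():
        upload(driver.screenshot())
    threading.Timer(PRESET_DELAY_S, capture).start()
```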
3. The method of claim 1, wherein, after capturing the screenshot of the navigation interface, the method further comprises:
sending the road data on the map base map corresponding to the screenshot, together with a timestamp, to the server for storage.
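A hypothetical example of the accompanying payload; the field names and road-data attributes are illustrative assumptions, not taken from the disclosure:

```python
# Example payload sent alongside the screenshot; all field names are assumptions.
import json
import time

payload = {
    "timestamp": int(time.time()),
    "road_data": {
        "link_id": "illustrative-link-0001",
        "intersection_type": "cross",
        "out_links": ["north", "east"],
    },
}
body = json.dumps(payload)  # transmitted to the server together with the screenshot
```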
4. The method of any one of claims 1-3, wherein the enlarged intersection image comprises a vector enlarged view, a pattern view, a live view, or a street view.
5. A navigation evaluation method applied to a server, characterized in that the method comprises:
acquiring, from a terminal, a screenshot of the navigation interface captured when an enlarged intersection image is generated;
and identifying, by using an image recognition algorithm, the guidance direction of the turn arrow in the enlarged intersection image on the screenshot and the arrow direction of the turn guidance sign on the guidance panel, and evaluating the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
6. The method of claim 5, wherein the enlarged intersection image is a vector enlarged view;
correspondingly, the step of identifying, by using an image recognition algorithm, the guidance direction of the turn arrow in the enlarged intersection image on the screenshot and the arrow direction of the turn guidance sign on the guidance panel, and evaluating the enlarged intersection image according to the consistency between the guidance direction and the arrow direction comprises:
identifying, from the screenshot, a first target area on the guidance panel, wherein the first target area comprises the turn guidance sign;
identifying, from the screenshot, a second target area on the enlarged intersection image, wherein the second target area is centered on the center of the enlarged intersection image, expands outward to cover a preset proportion of the enlarged intersection image, and comprises the turn arrow;
identifying, by using the image recognition algorithm, the arrow direction of the turn guidance sign in the first target area and the guidance direction of the turn arrow in the second target area, respectively;
and determining the similarity between the guidance direction and the arrow direction, and evaluating the correctness of the enlarged intersection image according to the similarity result.
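For illustration, the second target area can be computed as a crop centered on the enlarged intersection image; the 0.6 proportion below is an example value standing in for the preset proportion:

```python
# Sketch of the second-target-area computation; the proportion is an example value.
def centered_region(enlarged_box, proportion=0.6):
    """Return (x, y, w, h) of a region centered in the enlarged intersection
    image whose width and height are `proportion` of the image's size."""
    x, y, w, h = enlarged_box
    rw, rh = int(w * proportion), int(h * proportion)
    rx = x + (w - rw) // 2
    ry = y + (h - rh) // 2
    return rx, ry, rw, rh
```

The directions found in the two areas could then be classified (for example with the template-matching sketch given earlier) and compared, with agreement treated as a correctly drawn enlarged intersection image.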
7. The method of claim 5, wherein, before acquiring from the terminal the screenshot of the navigation interface captured when the enlarged intersection image is generated, the method further comprises:
extracting road data of different urban areas and different road structures;
constructing a navigation track sample set according to the road data, wherein the navigation track sample set comprises a plurality of different navigation tracks;
and issuing the navigation track sample set to the terminal, so that the terminal performs the navigation test based on the plurality of different navigation tracks in the navigation track sample set.
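A sketch of assembling such a sample set, assuming a helper pick_track(city, structure) that returns one navigation track extracted from that city's road data; the category lists and track format are illustrative assumptions:

```python
# Illustrative construction of a navigation-track sample set covering several
# cities and road structures; categories and track format are assumptions.
import itertools

CITIES = ["beijing", "shanghai", "guangzhou"]
ROAD_STRUCTURES = ["cross", "t_junction", "roundabout", "highway_ramp"]

def build_sample_set(pick_track):
    """pick_track(city, structure) is assumed to return one navigation track
    (an ordered list of road links) extracted from that city's road data."""
    samples = []
    for city, structure in itertools.product(CITIES, ROAD_STRUCTURES):
        samples.append({
            "city": city,
            "road_structure": structure,
            "track": pick_track(city, structure),
        })
    return samples
    # The resulting sample set would then be serialized (e.g. as JSON) and
    # issued to the terminal under test.
```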
8. The method according to any one of claims 5-7, further comprising:
receiving and storing the road data on the map base map corresponding to the screenshot and the corresponding timestamp sent by the terminal.
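A minimal receiving-and-storing sketch, assuming the terminal uploads the screenshot together with the base-map road data and timestamp as form fields; the endpoint path, field names, and storage layout are assumptions:

```python
# Illustrative server endpoint that stores the screenshot plus its road data
# and timestamp; all names and the storage layout are assumptions.
import json
import pathlib
from flask import Flask, request

app = Flask(__name__)
STORE = pathlib.Path("evaluation_store")
STORE.mkdir(exist_ok=True)

@app.route("/screenshots", methods=["POST"])
def receive_screenshot():
    ts = request.form["timestamp"]
    request.files["screenshot"].save(STORE / f"{ts}.png")              # keep the screenshot
    meta = {"timestamp": ts, "road_data": json.loads(request.form["road_data"])}
    (STORE / f"{ts}.json").write_text(json.dumps(meta))                # keep road data + timestamp
    return "ok"
```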
9. A navigation evaluation device configured in a terminal, the device comprising:
a test module, configured to perform a navigation test on the terminal based on a plurality of different pre-constructed navigation tracks;
a screenshot module, configured to capture a screenshot of the navigation interface when an enlarged intersection image is generated during the navigation test;
and a sending module, configured to send the screenshot to a server, so that the server identifies, by using an image recognition algorithm, the guidance direction of the turn arrow in the enlarged intersection image on the screenshot and the arrow direction of the turn guidance sign on the guidance panel, and evaluates the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
10. A navigation evaluation device configured in a server, the device comprising:
an acquisition module, configured to acquire, from a terminal, a screenshot of the navigation interface captured when an enlarged intersection image is generated;
and an evaluation module, configured to identify, by using an image recognition algorithm, the guidance direction of the turn arrow in the enlarged intersection image on the screenshot and the arrow direction of the turn guidance sign on the guidance panel, and to evaluate the enlarged intersection image according to the consistency between the guidance direction and the arrow direction.
11. A terminal, characterized in that the terminal comprises:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the navigation evaluation method of any one of claims 1-4.
12. A server, characterized in that the server comprises:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the navigation evaluation method of any one of claims 5-8.
13. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the navigation evaluation method according to any one of claims 1 to 4.
14. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the navigation evaluation method according to any one of claims 5 to 8.
CN201810985464.0A 2018-08-28 2018-08-28 Navigation evaluation method, device, terminal, server and storage medium Active CN109115242B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810985464.0A CN109115242B (en) 2018-08-28 2018-08-28 Navigation evaluation method, device, terminal, server and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810985464.0A CN109115242B (en) 2018-08-28 2018-08-28 Navigation evaluation method, device, terminal, server and storage medium

Publications (2)

Publication Number Publication Date
CN109115242A CN109115242A (en) 2019-01-01
CN109115242B (en) 2020-11-06

Family

ID=64860261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810985464.0A Active CN109115242B (en) 2018-08-28 2018-08-28 Navigation evaluation method, device, terminal, server and storage medium

Country Status (1)

Country Link
CN (1) CN109115242B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109933642B (en) * 2019-04-04 2023-03-07 北京三快在线科技有限公司 Method and device for generating guide picture, electronic equipment and storage medium
CN110779541B (en) * 2019-04-10 2021-11-23 北京嘀嘀无限科技发展有限公司 Display method and system of steering arrow
CN111427486A (en) * 2020-03-24 2020-07-17 斑马网络技术有限公司 Navigation prompt testing method and device
CN112613694B (en) * 2020-11-27 2023-09-08 北京百度网讯科技有限公司 Evaluation method, device and equipment of navigation map and readable storage medium

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI386626B (en) * 2008-07-07 2013-02-21 Wistron Corp Geographic information updating device for a navigation system and related navigation system
CN101726308B (en) * 2008-10-15 2012-07-25 北京龙图通信息技术有限公司 Method for generating crossing actual scene induced map of navigation electronic map
CN101738191B (en) * 2009-12-23 2012-04-18 沈阳美行科技有限公司 Navigation device and one-side multi-junction enlarged image display method
CN103206960B (en) * 2012-01-13 2016-04-27 北京四维图新科技股份有限公司 A kind of method for path navigation and device
CN104075729B (en) * 2013-03-29 2017-02-08 高德软件有限公司 Method, device and terminal device for displaying electronic map
JP6160191B2 (en) * 2013-04-15 2017-07-12 アイシン・エィ・ダブリュ株式会社 Driving support system, method and program
CN103353308B (en) * 2013-04-18 2015-08-19 沈阳美行科技有限公司 A kind of real-time aspect billboard guide design method of guider
CN104457790B (en) * 2014-11-27 2017-07-25 百度在线网络技术(北京)有限公司 Evaluate and test method, test device and its construction method of the inducing effect of navigation product
CN105957379A (en) * 2016-05-30 2016-09-21 乐视控股(北京)有限公司 Traffic information recognition method and device, and vehicle
CN107560622A (en) * 2016-07-01 2018-01-09 板牙信息科技(上海)有限公司 A kind of method and apparatus based on driving image-guidance
CN106648631A (en) * 2016-11-30 2017-05-10 北京联创新图科技有限公司 Navigation interface display method targeting smart car equipment
CN106840209B (en) * 2017-02-22 2020-04-21 百度在线网络技术(北京)有限公司 Method and apparatus for testing navigation applications

Also Published As

Publication number Publication date
CN109115242A (en) 2019-01-01

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant