WO2020113528A1 - Navigation processing method and apparatus, and navigation processing device

Info

Publication number
WO2020113528A1
Authority
WO
WIPO (PCT)
Prior art keywords
navigation
information
target
mobile platform
image
Prior art date
Application number
PCT/CN2018/119616
Other languages
English (en)
Chinese (zh)
Inventor
周游
蔡剑钊
杜劼熹
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to PCT/CN2018/119616 (WO2020113528A1)
Priority to CN201880065696.2A (CN111213031A)
Publication of WO2020113528A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C 21/26: Navigation specially adapted for navigation in a road network
    • G01C 21/28: Navigation in a road network with correlation of data from several navigational instruments
    • G01C 21/34: Route searching; Route guidance
    • G01C 21/3407: Route searching or guidance specially adapted for specific applications
    • G01C 21/343: Calculating itineraries, i.e. routes leading from a starting point to a series of categorical destinations using a global route restraint, round trips, touristic trips
    • G01C 21/36: Input/output arrangements for on-board computers
    • G01C 21/3602: Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
    • G01C 21/3626: Details of the output of route guidance instructions
    • G01C 21/3644: Landmark guidance, e.g. using POIs or conspicuous other objects
    • G01C 21/3658: Lane guidance
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39: Systems in which the satellite radio beacon positioning system transmits time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42: Determining position
    • G01S 19/48: Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system

Definitions

  • Embodiments of the present invention relate to the field of image processing technology, and in particular to a navigation processing method, a navigation processing apparatus, and a navigation processing device.
  • the navigation system can plan a navigation path for the driver to use according to the starting position and the destination position input by the driver, and the driver can reach the destination according to the navigation path.
  • the existing navigation system is usually affected by the environment or weather, which results in inaccurate navigation results.
  • Navigation typically relies on GPS (the Global Positioning System).
  • GPS signals are affected by weather and the environment, and reception is poor between some buildings or under overpasses.
  • As a result, the accuracy of GPS navigation results can be poor, especially on roads with multiple entrances and exits.
  • For example, the navigation system may announce the navigation information for an intersection only after the vehicle has already driven through it, causing the driver to miss the correct route. Therefore, how to improve the accuracy of navigation results has become a hot research topic.
  • Embodiments of the present invention provide a navigation processing method, a navigation processing apparatus, a navigation processing device, and a storage medium, which use captured images to assist the navigation module in navigating a mobile platform.
  • an embodiment of the present invention provides a navigation processing method.
  • the method is used for mobile platform navigation.
  • the mobile platform includes a navigation module and a vision module.
  • the method includes: calling the vision module to collect a target image of the environment where the mobile platform is currently located, the target image including an image area of a traffic sign; determining navigation assistance information of the mobile platform according to the target image; and optimizing the navigation information of the navigation module based on the navigation assistance information to obtain target navigation information of the mobile platform.
  • an embodiment of the present invention provides a navigation processing device.
  • the navigation processing device is configured on a mobile platform and is used to navigate the mobile platform.
  • the mobile platform includes a navigation module and a vision module.
  • the navigation processing device includes:
  • a collection module used to call the vision module to collect a target image of the environment where the mobile platform is currently located, and the target image includes an image area of a traffic sign;
  • a processing module configured to determine navigation assistance information of the mobile platform according to the target image
  • the processing module is further used to optimize the navigation information of the navigation module based on the navigation assistance information to obtain target navigation information of the mobile platform.
  • an embodiment of the present invention provides a navigation processing device, which is configured on a mobile platform and used to navigate a mobile platform.
  • the mobile platform includes a navigation module and a vision module.
  • the navigation processing device includes a processor and a communication interface, the processor and the communication interface being connected to each other, wherein the communication interface is controlled by the processor to send and receive instructions, and the processor is used to:
  • call the vision module to collect a target image of the environment where the mobile platform is currently located, the target image including an image area of a traffic sign; determine navigation assistance information of the mobile platform according to the target image; and optimize the navigation information of the navigation module based on the navigation assistance information to obtain target navigation information of the mobile platform.
  • an embodiment of the present invention provides a computer storage medium that stores computer program instructions, which are used to implement the above-described navigation processing method when the computer program instructions are executed.
  • the mobile platform can call the vision module to collect a target image of the current environment of the mobile platform, determine the navigation assistance information of the mobile platform according to the target image, and then optimize the navigation information of the navigation module based on the navigation assistance information to obtain the target navigation information of the mobile platform.
  • In this way, the current position in the environment can be quickly analyzed in combination with the captured environmental images to obtain navigation assistance information, which assists the navigation module in navigating and, to a certain extent, avoids navigation errors or navigation prompts that are not timely enough, helping to improve the accuracy of mobile platform navigation.
  • FIG. 1a is a schematic structural diagram of a mobile platform provided by an embodiment of the present invention.
  • FIG. 1b is a schematic diagram of a scenario provided by an embodiment of the present invention.
  • FIG. 2 is a schematic flowchart of a navigation processing method provided by an embodiment of the present invention.
  • FIG. 3a is a schematic diagram of a standard image provided by an embodiment of the present invention.
  • FIG. 3b is a schematic diagram of a target image provided by an embodiment of the present invention.
  • FIG. 3c is a schematic diagram of an image including an image area of a traffic sign provided by an embodiment of the present invention.
  • FIG. 3d is a schematic diagram of another standard image provided by an embodiment of the present invention.
  • FIG. 3e is a schematic diagram of another target image provided by an embodiment of the present invention.
  • FIG. 3f is a schematic diagram of another scenario provided by an embodiment of the present invention.
  • FIG. 4 is a schematic flowchart of another navigation processing method provided by an embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of a navigation processing device according to an embodiment of the present invention.
  • FIG. 6 is a schematic structural diagram of a navigation processing device according to an embodiment of the present invention.
  • An embodiment of the present invention provides a navigation processing method for navigating a mobile platform. The mobile platform may be a mobile device capable of driving on public roads, such as an autonomous vehicle, a smart electric vehicle, a scooter, or a self-balancing vehicle, or it may be an auxiliary driving device installed on such a mobile device, such as a driving recorder mounted on the mobile device.
  • the mobile platform may include a navigation module and a vision module.
  • the navigation module is used to navigate the mobile platform and collect navigation information of the mobile platform.
  • the navigation information may include the navigation route of the mobile platform from the start point to the end point, the names of the navigation roads on the navigation route, the distance traveled, the current location, etc.
  • the navigation module may be a device installed with a navigation system (such as GPS, Beidou navigation system, inertial navigation system, etc.).
  • the vision module can be used to collect images of the environment in which the mobile platform is located and provide related functions for image collection.
  • the vision module can include a camera device that can be installed at the front of the mobile platform to collect images of the environment around the mobile platform.
  • FIG. 1a shows a mobile platform 10 including a vision module 100 and a navigation module 101. Both the vision module 100 and the navigation module 101 can be fixed on the main structure of the mobile platform, where the vision module 100 is mounted externally at the front of the mobile platform and is used to collect images of the area in front of the mobile platform in its current environment.
  • the navigation module 101 can be built into the mobile platform and used to navigate the mobile platform.
  • For example, the mobile platform 10 may be a car driving on a public road, and the vision module 100 is mounted externally on the front body of the car (such as the front of the car).
  • The vision module 100 may be used to collect the target image 101 in front of the mobile platform. It can be seen that the target image 101 includes image areas of traffic signs, namely an image area 102 and an image area 103, where the traffic sign corresponding to the image area 102 is a multi-lane sign and the traffic sign corresponding to the image area 103 is a junction sign.
  • the mobile platform 10 may perform recognition processing on these image areas to determine navigation assistance information for the car. The navigation assistance information may include steering information (such as going straight into Jianshe Avenue or turning right into Tengye 1st Road), distance information (such as the first distance between the mobile platform and the traffic sign, and the second distance of 1 km indicated by the traffic sign), the names of the roads indicated by the traffic sign (such as Jianshe Avenue and Tengye 1st Road in FIG. 1b), and lane information, where the lane information includes the number of lanes on the road on which the mobile platform is currently driving and the driving direction corresponding to each lane.
  • the mobile platform 10 may optimize the navigation information of the navigation module based on the navigation assistance information to obtain target navigation information of the mobile platform. The target navigation information includes the navigation distance from the mobile platform to the target object (such as Tengye 1st Road), the navigation direction in which the mobile platform moves to the target object (such as turning right), and lane indication information for instructing the mobile platform to turn into the target lane matching the navigation direction.
  • For example, the road on which the mobile platform is currently driving has 4 lanes, of which the two lanes on the left are straight-only lanes, the first lane from the right is a right-turn lane, and the second lane from the right is a straight-or-right-turn lane. If the navigation direction of the mobile platform to the target object is a right turn, the lane indication information can be used to instruct the mobile platform to turn into the two right-turn-capable lanes on the right (that is, the target lanes that match the navigation direction).
  • In this way, the navigation assistance information can be determined using image recognition technology, and the navigation information of the mobile platform can be optimized based on the navigation assistance information to obtain the target navigation information of the mobile platform. This assists the navigation module in navigating, helps to correct the navigation module's navigation deviation, and improves the accuracy of navigation on the mobile platform.
  • FIGS. 1a and 1b are only for illustration.
  • the mobile platform shown in FIGS. 1a and 1b may also be an auxiliary driving device installed on a mobile device, such as a driving recorder mounted on the mobile device.
  • FIG. 1b is also only an example of the scene involved in the embodiment of the present invention, and is mainly used to explain the principle of image recognition and navigation processing of the mobile platform based on the vision module and the navigation module of the embodiment of the present invention.
  • FIG. 2 is a schematic flowchart of a navigation processing method provided by an embodiment of the present invention.
  • the method according to the embodiment of the present invention may be executed by a mobile platform.
  • the method is used for mobile platform navigation.
  • the mobile platform includes a vision module and a navigation module.
  • the mobile platform may call the vision module in S201 to collect a target image of the current environment of the mobile platform, and the target image includes an image area of a traffic sign.
  • the mobile platform can call the vision module at a preset time interval to collect an environmental image of the current environment of the mobile platform, and detect, through a neural network, whether an image area of a traffic sign exists in the environmental image.
  • If an image area of a traffic sign is detected, the environmental image is determined as the target image.
  • the neural network may be a convolutional neural network.
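  • As an illustration of this capture-and-detect loop in S201, the following Python sketch polls the vision module at a preset interval and keeps the first frame in which a detector finds a traffic-sign image area. The camera.capture() call, the detect_traffic_signs stub, the 2-second interval and the attempt limit are assumptions made for illustration; the disclosure does not prescribe a specific detector or camera API.

```python
import time

def detect_traffic_signs(image):
    """Stub for a neural-network detector (e.g. a convolutional network).
    Should return a list of bounding boxes (x, y, w, h) for traffic-sign
    image areas; an empty list means no sign was found in the frame."""
    return []  # replace with real model inference

def collect_target_image(camera, interval_s=2.0, max_attempts=100):
    """Poll the vision module at a preset interval until a frame that
    contains a traffic-sign image area is found (cf. step S201)."""
    for _ in range(max_attempts):
        frame = camera.capture()          # assumed camera API
        boxes = detect_traffic_signs(frame)
        if boxes:                         # this frame is the target image
            return frame, boxes
        time.sleep(interval_s)
    return None, []
```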
  • the navigation assistance information of the mobile platform may be determined according to the target image in S202.
  • a traffic sign library may be established in advance, and the traffic sign library includes at least one standard image and semantic information corresponding to each standard image.
  • the standard images are all images of traffic signs that comply with national standards.
  • the semantic information may be, for example, straight, turn left, turn right, etc.
  • For example, the standard image may be an image corresponding to a road sign indicating a fork, as shown in FIG. 3a.
  • The fork sign includes two indicating elements, namely an indicating element 30 and an indicating element 31, where the semantic information corresponding to the indicating element 30 is going straight and the semantic information corresponding to the indicating element 31 is a right turn.
  • the mobile platform may perform feature point matching between the image content of the above image area and the standard images in the traffic sign library, and determine a target standard image from the traffic sign library based on the feature point matching result, the target standard image being associated with the target traffic sign included in the image area. Further, the navigation assistance information of the mobile platform can be determined according to the semantic information of the target standard image.
  • Specifically, feature points may be extracted from both the image content of the image area and any standard image in the traffic sign library, and the feature points corresponding to the image area are compared with the feature points of that standard image. If the similarity score between the two is greater than or equal to a preset similarity score threshold, it can be determined that the standard image is associated with the target traffic sign included in the image area, that is, that standard image is determined as the target standard image.
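  • The following sketch shows one possible form of this feature point matching, using ORB features and brute-force matching from OpenCV. The 0-100 scoring formula, the distance gate of 40, and the sign_library layout are illustrative assumptions; they merely mirror the idea of comparing the image area against each standard image and applying a similarity score threshold.

```python
import cv2

def _to_gray(image):
    """Accept either BGR or already-grayscale input."""
    return cv2.cvtColor(image, cv2.COLOR_BGR2GRAY) if image.ndim == 3 else image

def similarity_score(region, standard):
    """Return a 0-100 similarity score between the cropped sign region and a
    standard image, using ORB feature points and brute-force matching."""
    orb = cv2.ORB_create(500)
    k1, d1 = orb.detectAndCompute(_to_gray(region), None)
    k2, d2 = orb.detectAndCompute(_to_gray(standard), None)
    if d1 is None or d2 is None:
        return 0.0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    good = [m for m in matcher.match(d1, d2) if m.distance < 40]  # distance gate
    return 100.0 * len(good) / max(len(k1), len(k2), 1)

def find_target_standard_image(region, sign_library, threshold=90.0):
    """sign_library maps a sign name to (standard image, semantic information).
    Returns the best-matching entry whose score reaches the threshold."""
    best_name, best_score = None, 0.0
    for name, (standard, _semantics) in sign_library.items():
        score = similarity_score(region, standard)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)
```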
  • the target image collected by the mobile platform is shown in FIG. 3b.
  • the target image includes the image area of the traffic sign, and the image area includes the image area 32.
  • the traffic sign library includes the standard image corresponding to the road sign shown in FIG. 3a, and the semantic information corresponding to that standard image (the semantic information corresponding to the indicating element 30 is going straight, and the semantic information corresponding to the indicating element 31 is a right turn).
  • the mobile platform extracts the image area 32 from the target image to obtain the traffic sign image (that is, the image content of the image area 32) as shown in FIG. 3c.
  • Suppose that feature point matching between this image content and the standard image yields a similarity score of 95, which is greater than the preset similarity score threshold of 90; the standard image corresponding to the road sign can then be determined as the target standard image, that is, it can be determined that the target traffic sign corresponding to the image area 32 is a fork sign.
  • After the mobile platform determines the target standard image from the traffic sign library, it can determine the initial navigation direction according to the semantic information of the target standard image, and perform text recognition on the image content of the image area to determine the road indication information indicated by the image content of the image area.
  • the mobile platform may determine navigation assistance information of the mobile platform according to the initial navigation direction and road indication information.
  • the navigation assistance information may include steering information and distance information. The distance information is calculated according to the first distance and the second distance obtained from the road indication information, where the first distance refers to the distance between the mobile platform and the target traffic sign.
  • The steering information indicates the direction in which to enter the indicated location and is determined according to the initial navigation direction and the road indication information.
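  • A minimal sketch of how the road indication information could be assembled from text recognition output and combined with the initial navigation directions is shown below. The line format of the OCR output, the regular expression, and the pairing of directions with road names in order are assumptions made only for illustration.

```python
import re

def parse_road_indication(ocr_lines):
    """Split OCR output from the sign's image area into road names and the
    second distance (the distance printed on the sign), in metres."""
    names, second_distance_m = [], None
    for line in ocr_lines:
        m = re.search(r"(\d+(?:\.\d+)?)\s*(km|m)\b", line, re.IGNORECASE)
        if m:
            value = float(m.group(1))
            second_distance_m = value * 1000 if m.group(2).lower() == "km" else value
        text = re.sub(r"\d+(?:\.\d+)?\s*(km|m)\b", "", line, flags=re.IGNORECASE).strip()
        if text:
            names.append(text)
    return names, second_distance_m

def build_navigation_assistance(initial_directions, ocr_lines, first_distance_m):
    """Combine the initial navigation directions from the matched standard
    image with the road indication information from text recognition."""
    names, second_distance_m = parse_road_indication(ocr_lines)
    steering = dict(zip(initial_directions, names))  # e.g. {'straight': 'Jianshe Avenue', 'right': 'Tengye 1st Road'}
    return {
        "steering": steering,
        "first_distance_m": first_distance_m,
        "second_distance_m": second_distance_m,
    }

# Example mirroring the description:
# build_navigation_assistance(["straight", "right"],
#                             ["Jianshe Avenue", "Tengye 1st Road 1km"], 500.0)
```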
  • For example, the mobile platform 10 calls the vision module 100 to collect a target image including an image area of a traffic sign, as shown in FIG. 3b, and determines from the traffic sign library the target standard image associated with the image area 32, which is shown in FIG. 3a.
  • The target standard image is the standard image corresponding to a fork sign, so it can be determined that the target traffic sign corresponding to the image area 32 is a fork sign.
  • The initial navigation directions indicated by the target traffic sign can be determined from the semantic information corresponding to the standard image of the road sign (the semantic information corresponding to the indicating element 30 is going straight, and the semantic information corresponding to the indicating element 31 is a right turn), namely going straight and turning right.
  • the mobile platform can perform text recognition on the image content shown in FIG. 3c (that is, the image content of the image area 32), and the text recognition can include digital information analysis and text information analysis.
  • the second distance of 1 km can be determined through digital information analysis, and the fields "Jianshe Avenue" and "Tengye 1st Road" can be determined through text information analysis; the road indication information indicated by the image content therefore includes the fields "Jianshe Avenue" and "Tengye 1st Road" and the second distance of 1 km.
  • the mobile platform may then determine the turn information (go straight into Jianshe Avenue, turn right into Tengye 1st Road) based on the determined initial navigation directions (going straight and turning right) and the road indication information, and thereby determine the navigation assistance information of the mobile platform.
  • the first distance refers to the distance between the mobile platform and the target traffic sign.
  • the mobile platform may determine the contact point between the target traffic sign and the ground in the target image, determine the reference angle based on the position information of the contact point in the target image, and then calculate the distance between the mobile platform and the target traffic sign (that is, the first distance) based on the reference height of the vision module and the reference angle.
  • the collection angle when the vision module collects the target image is horizontal, that is, the target image is an image directly in front of the mobile platform.
  • the mobile platform can determine the pixel information between the contact point and the horizontal center line of the target image according to the position information of the contact point in the target image, and the pixel information represents the number of pixel points.
  • the mobile platform is preset with a correspondence between the number of pixels and the preset angle. Specifically, one pixel point may correspond to a preset angle.
  • the mobile platform can then determine the reference angle corresponding to the number of target pixels represented by the pixel information according to the preset correspondence between the number of pixels and the preset angle. For example, if one pixel corresponds to 0.01° and the number of target pixels is 1000, the reference angle corresponding to the number of target pixels is 10°.
  • FIG. 3e shows the target image collected by the vision module, including the target traffic sign 34, the horizontal center line 35, the contact point 36 between the target traffic sign 34 and the ground, and the vision module 37 installed at the front of the vehicle. It can be seen that the collection angle at which the vision module 37 collects the target image is horizontal. Here, α is the reference angle and h is the reference height of the vision module relative to the ground.
  • The horizontal distance d between the vision module 37 and the contact point 36 can be regarded as the distance from the mobile platform to the target traffic sign (the first distance), and d = h · cot(α). Therefore, with the reference angle α and the reference height h known, the first distance can be calculated from the cotangent function. For example, if the reference angle α is 2° and the reference height h is 2 m, the first distance is approximately 57 m.
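  • This distance calculation can be summarized in a short sketch: the pixel offset of the contact point below the horizontal center line is converted into the reference angle through a calibrated degrees-per-pixel factor, the pitch angle is subtracted if the camera is not horizontal, and the first distance is h · cot(angle). The parameter names and the simple linear pixel-to-angle mapping are assumptions consistent with the description, not a definitive implementation.

```python
import math

def first_distance(contact_row_px, image_center_row_px, deg_per_pixel,
                   camera_height_m, pitch_deg=0.0):
    """Estimate the distance from the vision module to the sign's ground
    contact point: pixels below the horizontal centre line -> reference
    angle -> d = h * cot(angle). deg_per_pixel and pitch_deg are assumed
    to come from camera calibration."""
    pixel_offset = contact_row_px - image_center_row_px   # pixels below centre line
    reference_angle_deg = pixel_offset * deg_per_pixel
    relative_angle_deg = reference_angle_deg - pitch_deg   # correct for camera pitch
    if relative_angle_deg <= 0:
        raise ValueError("contact point must project below the horizon")
    return camera_height_m / math.tan(math.radians(relative_angle_deg))

# Example from the description: 0.01 deg/pixel, a 2 deg angle and a 2 m height -> ~57 m
print(round(first_distance(contact_row_px=200, image_center_row_px=0,
                           deg_per_pixel=0.01, camera_height_m=2.0)))
```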
  • the collection angle when the vision module collects the target image is non-horizontal, that is, there is a certain pitch angle ⁇ ( ⁇ is greater than 0) between the collection angle when the vision module collects the target image and the horizontal line.
  • the mobile platform can determine the reference angle in the same manner as described above. Further, the mobile platform may adjust the reference angle according to the pitch angle ⁇ to obtain a relative reference angle, and then calculate the distance between the mobile platform and the target traffic sign based on the reference height of the vision module and the relative reference angle.
  • the relative reference angle may be obtained by subtracting the pitch angle ⁇ from the reference angle.
  • the pitch angle ⁇ is related to the installation angle of the vision module relative to the horizontal line.
  • the installation angle of the vision module relative to the horizontal line can be recorded in the calibration information of the vision module, and when the mobile platform needs to calculate the first distance, the collection angle at which the vision module collects the target image can be determined according to the installation angle in the calibration information.
  • the pitch angle ⁇ may be the same as the installation angle described above.
  • the vision module may be deflected upward with respect to the horizontal line, that is, the acquisition angle corresponding to the target image may change as the vision module deflects.
  • the deflection angle of the vision module relative to the horizontal line can be obtained when the vision module collects the target image, and the collection angle at which the vision module collects the target image is determined according to the deflection angle, that is, the above-mentioned pitch angle θ is determined.
  • the pitch angle ⁇ may be the same as the above-mentioned deflection angle.
  • the above vision module may include a binocular camera.
  • the mobile platform can call the binocular camera to collect target images of the environment where the mobile platform is currently located, that is, two different target images can be obtained at the same time each time an image is captured.
  • the first distance can then be obtained by calculating the disparity between the two target images using the principle of binocular vision ranging.
  • the above vision module may instead include a monocular camera, in which case the mobile platform uses the monocular camera to collect images of the environment of the mobile platform at different times, for example consecutive frames captured one after another, and obtains the first distance from them.
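  • For the binocular case, the standard pinhole stereo relation gives the distance from the disparity between the two target images; the focal length and baseline values below are illustrative calibration figures, not values from the disclosure.

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Standard binocular ranging relation: depth Z = f * B / disparity.
    focal_length_px and baseline_m come from stereo calibration."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# e.g. a 1000 px focal length, 12 cm baseline and 2.1 px disparity -> ~57 m
print(round(depth_from_disparity(2.1, 1000.0, 0.12), 1))
```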
  • After the mobile platform determines the navigation assistance information, it can optimize the navigation information of the navigation module based on the navigation assistance information in S203 to obtain the target navigation information of the mobile platform.
  • the navigation assistance information may also include identification information of at least one indicating object indicated by the target traffic sign.
  • the indicating objects include an indicated area (such as Nanshan District or Futian District), an indicated road direction (such as Binhe Avenue eastward), an indicated road (such as Yanhe Avenue or Jianshe Avenue), or an indicated location (such as C Paradise);
  • the identification information of the indicating object may be the name of the indicating object.
  • For example, the road sign shown in FIG. 3c indicates two roads (that is, indicating objects) whose road names (that is, identification information) are "Jianshe Avenue" and "Tengye 1st Road", respectively.
  • the road names and the navigation information may be searched in association, and if any one of the at least one road indicated by the target traffic sign is determined to exist in the navigation information, that road is determined as the target road (that is, the target object).
  • Then, target navigation information corresponding to the movement of the mobile platform to the target road is generated, and the target navigation information includes the navigation distance and navigation direction from the mobile platform to the target road.
  • Optionally, the mobile platform may also obtain navigation information from the navigation module, where the navigation information includes identification information of at least one navigation object corresponding to the navigation route of the mobile platform. Further, based on the navigation information and the navigation assistance information, it can be determined whether a target object whose identification information matches the identification information of a navigation object exists among the at least one indicating object. If such a target object exists, step S203 is triggered.
  • the navigation object includes the navigation area of the navigation route (such as Nanshan District, Futian District, etc.), the direction of the navigation road (such as the east of Binhe Avenue), the navigation road (such as Yanhe Avenue, Jianshe Avenue, etc.) or the navigation location (such as C Paradise); the identification information of the navigation object can be the name of the navigation object.
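  • A minimal sketch of the associative search between the road names read from the sign (the indicating objects) and the identification information of the navigation objects on the planned route is shown below; the list-based data layout is an assumption.

```python
def find_target_object(indicated_names, navigation_names):
    """Search the road names read from the sign against the identification
    information of the navigation objects on the planned route; the first
    common name is taken as the target object (cf. step S203)."""
    route = set(navigation_names)
    for name in indicated_names:
        if name in route:
            return name
    return None

# Example mirroring the description:
# find_target_object(["Jianshe Avenue", "Tengye 1st Road"],
#                    ["Shennan Avenue", "Tengye 1st Road"])  # -> "Tengye 1st Road"
```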
  • the mobile platform may sum the first distance and the second distance in the navigation assistance information to obtain the navigation distance that the mobile platform moves to the target object.
  • the steering information in the navigation assistance information can be parsed to obtain the navigation direction of the mobile platform moving to the target object, thereby generating target navigation information corresponding to the mobile platform moving to the target object.
  • the target navigation information includes the navigation distance that the mobile platform moves to the target object, and the navigation direction that the mobile platform moves to the target object.
  • For example, the target image including the image area of the traffic sign is shown in FIG. 3b, and the target traffic sign included in the image area is the fork sign shown in FIG. 3a.
  • The navigation assistance information that the mobile platform determines according to the target image
  • includes turning information, distance information, and the road name of at least one road indicated by the target traffic sign. The turn information instructs going straight into Jianshe Avenue and turning right into Tengye 1st Road; the distance information includes a first distance of 500 m and a second distance of 1 km, where the first distance is the distance between the mobile platform and the target traffic sign and the second distance is the distance indicated by the target traffic sign.
  • The mobile platform obtains navigation information from the navigation module, where the navigation information includes identification information of at least one navigation object corresponding to the navigation route of the mobile platform. Further, the mobile platform compares the road names "Jianshe Avenue" and "Tengye 1st Road" of the roads indicated by the target traffic sign with the road names of the navigation roads on the navigation route; the comparison shows that the road name "Tengye 1st Road" exists in the navigation route, so the road "Tengye 1st Road" can be determined as the target object. Further, the mobile platform may sum the first distance and the second distance in the navigation assistance information to obtain a navigation distance of 1.5 km from the mobile platform to the target object.
  • The steering information in the navigation assistance information is parsed to obtain that the navigation direction in which the mobile platform moves to the target object "Tengye 1st Road" is a right turn, thereby generating target navigation information corresponding to the movement of the mobile platform to the target object "Tengye 1st Road".
  • The target navigation information includes the navigation distance of 1.5 km from the mobile platform to the target object and the navigation direction "turn right".
  • The mobile platform can also output a prompt message, such as "Turn right into Tengye 1st Road in 1.5 km", to prompt the user with the navigation distance and direction to the target object "Tengye 1st Road".
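  • Building on the earlier sketches, the following illustrates how the target navigation information and the prompt message could be assembled once a target object has been matched: the navigation distance is the sum of the first and second distances, and the navigation direction is parsed from the steering information. The dictionary layout and the prompt wording are assumptions.

```python
def build_target_navigation(assistance, target_object):
    """Optimise the navigation output for the matched target object:
    navigation distance = first distance + second distance, and the
    navigation direction is the steering entry that names the target."""
    direction = next(d for d, road in assistance["steering"].items()
                     if road == target_object)
    distance_m = assistance["first_distance_m"] + assistance["second_distance_m"]
    prompt = "Turn {} into {} in {:.1f} km".format(direction, target_object,
                                                   distance_m / 1000.0)
    return {"target": target_object, "direction": direction,
            "distance_m": distance_m, "prompt": prompt}

# With a 500 m first distance and a 1 km second distance this yields
# "Turn right into Tengye 1st Road in 1.5 km".
```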
  • In this way, inaccurate positioning caused by poor navigation signals can be avoided; likewise, on roads with multiple entrances and exits, the problem, caused by limited positioning accuracy, of the turn being prompted only after the intersection has already been missed can be avoided.
  • the navigation assistance information may include steering information indicating at least one direction, and identification information of the indicating object corresponding to each direction.
  • the mobile platform may obtain navigation information from the navigation module, where the navigation information includes identification information of at least one navigation object corresponding to the navigation route of the mobile platform. If it is determined, based on the navigation information and the navigation assistance information, that a target object whose identification information matches the identification information of a navigation object exists among the at least one indicating object, the steering information in the navigation assistance information may be parsed to obtain the navigation direction in which the mobile platform moves to the target object, and target navigation information corresponding to the movement of the mobile platform to the target object is then generated.
  • the target navigation information includes the navigation direction of the mobile platform to the target object.
  • For example, the road sign indicates two roads (that is, indicating objects) whose road names (that is, identification information) are "Jianshe Avenue" and "Tengye 1st Road".
  • The turn information can indicate going straight into "Jianshe Avenue" and turning right into "Tengye 1st Road".
  • the navigation information includes identification information of at least one navigation object corresponding to the navigation route of the mobile platform.
  • the mobile platform compares the road names "Jianshe Avenue" and "Tengye 1st Road" of the roads indicated by the target traffic sign with the road names of the navigation roads on the navigation route; the comparison shows that the road name "Tengye 1st Road" exists in the navigation route.
  • the mobile platform then parses the steering information in the navigation assistance information and obtains that the navigation direction in which the mobile platform moves to the target object "Tengye 1st Road" is a right turn, so target navigation information corresponding to the movement of the mobile platform to the target object "Tengye 1st Road" can be generated.
  • the target navigation information includes the navigation direction "turn right" in which the mobile platform moves to the target object.
  • the mobile platform can also output a prompt message, such as "Turn right into Tengye 1st Road", to prompt the user with the navigation direction to the target object "Tengye 1st Road".
  • the navigation assistance information may include steering information indicating at least one direction, and identification information corresponding to each direction indicating the object.
  • the mobile platform can optimize the navigation information of the navigation module according to the steering information in the navigation assistance information to obtain the target navigation information of the mobile platform.
  • For example, the target traffic sign corresponding to the image area is the fork sign shown in FIG. 3c, and the turn information corresponding to the fork sign indicates going straight into "Jianshe Avenue" and turning right into "Tengye 1st Road".
  • The mobile platform can parse the steering information in the navigation assistance information to obtain the instructions of turning right into "Tengye 1st Road" and going straight into "Jianshe Avenue", and generate target navigation information accordingly.
  • The target navigation information includes that the navigation direction of the mobile platform moving to "Tengye 1st Road" is a right turn and the navigation direction to "Jianshe Avenue" is going straight. Further, direction prompt information can also be output, such as "Turn right into Tengye 1st Road, go straight into Jianshe Avenue".
  • the target image includes an image area of a traffic sign, and the image area includes a sub-area of a lane sign.
  • the navigation assistance information may further include lane information obtained by recognizing and analyzing the collected lane sign.
  • the lane information includes the number of lanes on the road currently driven by the mobile platform and the driving directions corresponding to the lanes.
  • Specifically, the mobile platform may perform feature point matching between the image content of the sub-area and the standard images in the traffic sign library, and determine the target standard lane image from the traffic sign library based on the feature point matching result, the target standard lane image being associated with the target lane sign included in the sub-area.
  • the number of lanes indicated by the target lane sign and the driving direction corresponding to each lane can be determined by combining the semantic information of the target standard lane image, and then the lane information including the number of lanes and the corresponding driving direction of each lane can be generated.
  • Then, the lane indication information can be determined according to the above lane information and the navigation direction, and target navigation information including the lane indication information is generated, the lane indication information being used to instruct the mobile platform to turn into the target lane that matches the navigation direction.
  • the target image collected by the mobile platform includes an image area 32 and an image area 33.
  • the image area 33 is a sub-area including a lane sign.
  • the traffic sign library includes the standard image shown in FIG. 3d and the semantic information corresponding to that standard image. The semantic information includes the name of the lane sign corresponding to the standard image, "4-lane sign",
  • the number of lanes indicated by the lane sign, 4, and the driving directions corresponding to the lanes indicated by the lane sign: the two left lanes are straight-only lanes, the first lane from the right is a right-turn lane, and the second lane from the right is a straight-or-right-turn lane.
  • After the mobile platform matches the image content corresponding to the image area 33 with the standard image shown in FIG. 3d and determines, based on the feature point matching result, that the standard image shown in FIG. 3d matches the target lane sign corresponding to the image area 33, it can determine the standard image shown in FIG. 3d as the target standard lane image, and then, in combination with the semantic information of the target standard lane image, determine that the number of lanes indicated by the target lane sign is 4 and that the driving directions corresponding to the lanes are: the two left lanes are straight-only lanes, the first lane from the right is a right-turn lane, and the second lane from the right is a straight-or-right-turn lane (that is, the lane information).
  • Based on the above lane information and the navigation direction, the two lanes in which a right turn is allowed (that is, the two lanes on the right) can be determined as the target lanes that match the navigation direction, which in turn generates lane indication information for instructing the mobile platform to turn into the two lanes on the right.
  • the mobile platform may also output prompt information that matches the lane indication information.
  • the navigation direction is right turn, and the lane indication information is used to instruct the mobile platform to turn into two lanes on the right.
  • the prompt information that matches the lane indication information may be "Turning right ahead, please move into the two right lanes in advance."
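  • The lane matching step can be illustrated with a short sketch that selects, from the lane information, the lanes whose permitted movements contain the navigation direction; the set-based representation of each lane's permitted movements is an assumption.

```python
def lanes_matching_direction(lane_directions, navigation_direction):
    """Pick the target lanes whose permitted movements include the
    navigation direction. lane_directions lists, from left to right,
    the allowed movements of each lane read from the lane sign."""
    return [i + 1 for i, allowed in enumerate(lane_directions)
            if navigation_direction in allowed]

# Four-lane example from the description (lanes numbered from the left):
lanes = [{"straight"}, {"straight"}, {"straight", "right"}, {"right"}]
print(lanes_matching_direction(lanes, "right"))   # -> [3, 4], the two right lanes
```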
  • In the embodiment of the present invention, the mobile platform may call the vision module to collect the target image of the current environment of the mobile platform, determine navigation assistance information of the mobile platform according to the target image, and then optimize the navigation information of the navigation module based on the navigation assistance information to obtain the target navigation information of the mobile platform.
  • FIG. 4 is a schematic flowchart of another navigation processing method provided by an embodiment of the present invention.
  • the method according to the embodiment of the present invention may be performed by a mobile platform.
  • the method is used for mobile platform navigation.
  • the mobile platform includes a vision module and a navigation module.
  • the mobile platform may call the vision module in S401 to collect a target image of the current environment of the mobile platform, and the target image includes an image area of a traffic sign.
  • Road traffic signs, which use graphics, symbols and text to convey guidance, restrictions, warnings or instructions, can be divided into six types: warning signs, prohibition signs, indication signs, guide signs, tourist area signs, and road construction safety signs.
  • Each type of sign has its own colors and shapes.
  • Warning signs warn vehicles and pedestrians to pay attention to dangerous places; their colors are a yellow background with a black border and black pattern, and their shape is an equilateral triangle with the apex pointing upward.
  • Prohibition signs prohibit or restrict the movement of vehicles and pedestrians; their colors are generally a white background with a red circle, a red bar and a black pattern crossed by the bar, and their shapes include circles, octagons, and equilateral triangles with the apex pointing downward.
  • Indication signs, of which there are 29 kinds, indicate the movement of vehicles and pedestrians; their color is a blue background with a white pattern, and their shapes include circles, rectangles and squares.
  • Tourist area signs have a brown background with white characters and patterns, and their shapes are rectangles and squares.
  • Since the navigation processing method according to the present invention is used to navigate a mobile platform, the types of traffic signs that need to be identified (that is, the navigation types) may be guide signs and/or indication signs.
  • A guide sign can indicate the road ahead (for example, "Nanshan Avenue 100 meters ahead").
  • The above image area may be identified in step S402 to determine the feature information corresponding to the image area, where the feature information includes shape information and/or color information of the image area. Further, the mobile platform may determine the type of the target traffic sign included in the image area according to the feature information in step S403, and, if the type of the target traffic sign is a navigation type, determine the navigation assistance information of the mobile platform according to the target image in step S404.
  • For the specific implementation of step S404, reference may be made to the relevant description of step S202 in the foregoing embodiment, and details are not repeated here.
  • the navigation type may be a guide sign and/or an indication sign.
  • Indication signs generally have a blue background with a white pattern, and their shapes include circles, rectangles and squares; guide signs generally have a blue background with a white pattern (on expressways, generally a green background with a white pattern), and their shapes are generally rectangles and squares.
  • the above characteristic information includes color information.
  • the mobile platform can determine whether the color indicated by the color information is a preset sign color, and if so, determine that the type of the target traffic sign included in the image area is the target type, the target type being associated with the preset sign color.
  • For example, the preset sign color may be set to blue, and the sign type associated with blue is the indication sign or the guide sign. In this case, if the mobile platform judges that the color indicated by the color information is the preset sign color blue, it can be determined that the type of the target traffic sign is the indication sign or the guide sign.
  • the above characteristic information may include color information and shape information.
  • the mobile platform can detect whether the color indicated by the color information is a preset sign color and whether the edge shape of the image area is a preset sign shape, and if so, determine that the type of the target traffic sign included in the image area is the target type, the target type being associated with both the preset sign color and the preset sign shape.
  • For example, for an indication sign, the associated preset sign color is blue and the associated shapes may be circles, rectangles and squares; for a guide sign, the associated preset sign color is blue or green and the associated shapes may be rectangles and squares.
  • If the mobile platform detects that the color indicated by the color information is blue or green and the edge shape of the image area is a rectangle or square, it can be determined that the type of the target traffic sign included in the image area is a guide sign.
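  • The color and shape checks described above can be sketched with OpenCV (4.x API) as follows. The HSV thresholds, the 30% area fraction, and the quadrilateral test are illustrative assumptions; the disclosure only requires comparing the color information and edge shape of the image area against the preset sign color and shape.

```python
import cv2
import numpy as np

BLUE_HSV = ((100, 80, 60), (130, 255, 255))    # illustrative thresholds
GREEN_HSV = ((40, 80, 60), (80, 255, 255))

def dominant_color_is(region_bgr, hsv_range, min_fraction=0.3):
    """True if at least min_fraction of the cropped sign area falls inside
    the given HSV range (a simple stand-in for the colour check)."""
    hsv = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_range[0]), np.array(hsv_range[1]))
    return mask.mean() / 255.0 >= min_fraction

def edge_is_rectangular(region_gray):
    """True if the largest contour of the region approximates a quadrilateral."""
    edges = cv2.Canny(region_gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    largest = max(contours, key=cv2.contourArea)
    approx = cv2.approxPolyDP(largest, 0.02 * cv2.arcLength(largest, True), True)
    return len(approx) == 4

def sign_type(region_bgr):
    """Classify the image area as a guide sign (blue or green rectangle),
    an indication sign (blue, other preset shapes), or other."""
    gray = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2GRAY)
    blue = dominant_color_is(region_bgr, BLUE_HSV)
    green = dominant_color_is(region_bgr, GREEN_HSV)
    if (blue or green) and edge_is_rectangular(gray):
        return "guide sign"
    if blue:
        return "indication sign"
    return "other"
```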
  • After the mobile platform determines that the type of the target traffic sign is a navigation type, it can perform feature point matching between the image content of the image area and the standard images belonging to the navigation type in the traffic sign library, and determine, based on the feature point matching result, the target standard image from the standard images belonging to the navigation type, the target standard image being associated with the target traffic sign included in the image area. Further, the mobile platform may determine the navigation assistance information of the mobile platform according to the semantic information of the target standard image.
  • step S405 the navigation information of the navigation module may be optimized based on the navigation assistance information to obtain the target navigation information of the mobile platform.
  • step S405 reference may be made to the related description of step S203 in the foregoing embodiment, and details are not described here.
  • the mobile platform can also calculate the travel distance of the mobile platform based on a visual odometer, and optimize the target navigation information according to the travel distance.
  • the visual odometer may be, for example, a visual inertial odometer (Visual Inertial Odometry, VIO) or a visual odometer (Visual Odometry, VO).
  • the mobile platform may call the vision module to collect environmental images of the environment of the mobile platform at a preset collection interval, and use the visual odometer to determine the movement distance of the vision module from the similarity between adjacent environmental images,
  • that is, to determine the travel distance of the mobile platform. Further, the target navigation information is optimized according to the travel distance.
  • the target navigation information includes the navigation distance that the mobile platform moves to the target object, and the navigation direction that the mobile platform moves to the target object.
  • the optimization process may be optimization of the navigation distance in the target navigation information.
  • For example, the mobile platform is a car driving on a public road, and the target navigation information determined by the car includes a navigation distance of 1.5 km from the mobile platform to the target object and a navigation direction of turning right.
  • The mobile platform calculates, based on the visual odometer, that the travel distance of the mobile platform is 0.5 km.
  • The mobile platform can then use the travel distance of 0.5 km to optimize the navigation distance of 1.5 km in the target navigation information and generate new target navigation information,
  • in which the navigation distance from the mobile platform to the target object is updated from 1.5 km to 1 km. In this way, even when the navigation signal is poor, the accuracy of navigation can still be guaranteed.
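  • The optimization of the navigation distance with the travel distance estimated by the visual odometer reduces to a subtraction; the function below is a minimal sketch of that update under the data layout assumed in the earlier examples.

```python
def update_navigation_distance(target_navigation_m, travelled_m):
    """Optimise the target navigation information with the distance the
    platform has travelled since it was generated, as estimated by a
    visual (or visual-inertial) odometer."""
    return max(target_navigation_m - travelled_m, 0.0)

# Example from the description: 1.5 km remaining, 0.5 km travelled -> 1 km
print(update_navigation_distance(1500.0, 500.0))   # -> 1000.0
```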
  • an embodiment of the present invention further provides a navigation processing device as shown in FIG. 5, the navigation processing device is configured on a mobile platform and is used to navigate the mobile platform,
  • the mobile platform includes a navigation module and a vision module, and the navigation processing device includes:
  • the collection module 50 is configured to call the vision module to collect a target image of the environment where the mobile platform is currently located, and the target image includes an image area of a traffic sign;
  • the processing module 51 is configured to determine navigation assistance information of the mobile platform according to the target image
  • the processing module 51 is also used to optimize the navigation information of the navigation module based on the navigation assistance information to obtain target navigation information of the mobile platform.
  • Optionally, the processing module 51 is further configured to perform recognition processing on the image area to determine the feature information corresponding to the image area, the feature information including shape information and/or color information of the image area; determine the type of the target traffic sign included in the image area according to the feature information; and, if the type of the target traffic sign is a navigation type, trigger execution of determining the navigation assistance information of the mobile platform according to the target image.
  • Optionally, the characteristic information includes the color information, and the processing module 51 is specifically configured to determine whether the color indicated by the color information is a preset sign color, and if so, to determine that the type of the target traffic sign included in the image area is a target type, the target type being associated with the preset sign color.
  • Optionally, the characteristic information includes the color information and the shape information, and the processing module 51 is specifically configured to detect whether the color indicated by the color information is a preset sign color and whether the edge shape of the image area is a preset sign shape, and if so, to determine that the type of the target traffic sign included in the image area is a target type, the target type being associated with both the preset sign color and the preset sign shape.
  • the processing module 51 is specifically configured to match the image content of the image area with the standard image in the traffic sign library; based on the matching result of the feature points, the traffic sign library A target standard image is determined in, the target standard image is associated with a target traffic sign included in the image area; and navigation assistance information of the mobile platform is determined according to the semantic information of the target standard image.
  • the processing module 51 is specifically configured to determine the initial navigation direction according to the semantic information of the target standard image; perform text recognition on the image content of the image area to determine the road instruction information indicated by the image content; and determine the navigation assistance information of the mobile platform based on the initial navigation direction and the road instruction information (a text-recognition sketch follows this item).
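The road instruction information could, for example, be recovered by running OCR on the sign text and parsing out a distance figure, as in the sketch below. It assumes the `pytesseract` wrapper and an installed Tesseract engine, and the parsing pattern is an illustrative assumption rather than part of the embodiment.

```python
import re
import pytesseract  # assumes the Tesseract OCR engine is installed

def road_instruction_from_sign(image_area) -> dict:
    """Recognize the sign text and pull out a distance figure such as
    '800 m' or '1.5 km'."""
    text = pytesseract.image_to_string(image_area)
    match = re.search(r"(\d+(?:\.\d+)?)\s*(km|m)\b", text, re.IGNORECASE)
    distance_m = None
    if match:
        value, unit = float(match.group(1)), match.group(2).lower()
        distance_m = value * 1000 if unit == "km" else value
    return {"raw_text": text.strip(), "distance_to_indicated_object_m": distance_m}
```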
  • the navigation assistance information includes steering information and distance information; the distance information is calculated from a first distance and from a second distance obtained from the road instruction information, where the first distance refers to the distance between the mobile platform and the target traffic sign.
  • the processing module 51 is specifically configured to determine a contact point between the target traffic sign and the ground in the target image, determine a reference angle according to the position information of the contact point in the target image, and calculate the distance between the mobile platform and the target traffic sign based on the reference height of the vision module and the reference angle (a ground-plane geometry sketch follows this item).
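Under a flat-ground assumption, the first distance can be recovered from the ground contact point with a pinhole model: the reference angle follows from the contact point's vertical offset from the principal point and the focal length, and the distance is the camera (reference) height divided by the tangent of that angle. The sketch below illustrates this; the function name and the example numbers are illustrative assumptions.

```python
import math

def distance_to_sign_base(v_contact_px: float,
                          v_principal_px: float,
                          focal_length_px: float,
                          camera_height_m: float,
                          camera_pitch_rad: float = 0.0) -> float:
    """Estimate the horizontal distance to the point where the sign meets the
    ground, using a flat-ground pinhole model. The reference angle is the
    angle below the optical axis at which the contact point is seen."""
    # Angle between the optical axis and the ray through the contact point.
    angle_below_axis = math.atan2(v_contact_px - v_principal_px, focal_length_px)
    # Total angle below the horizon, including any downward camera pitch.
    reference_angle = angle_below_axis + camera_pitch_rad
    if reference_angle <= 0:
        raise ValueError("contact point is not below the horizon in this model")
    return camera_height_m / math.tan(reference_angle)

# e.g. contact point 120 px below the principal point, f = 1200 px, camera 1.3 m high
print(round(distance_to_sign_base(720, 600, 1200.0, 1.3), 1))  # ~13.0 m
```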
  • the navigation assistance information further includes identification information of at least one indicated object indicated by the target traffic sign, and the processing module 51 is further configured to obtain navigation information from the navigation module, the navigation information including identification information of at least one navigation object corresponding to the navigation route of the mobile platform; determine, based on the navigation information and the navigation assistance information, whether a target object exists among the at least one indicated object, the identification information of the target object matching the identification information of the navigation object; and, if the target object exists, trigger execution of optimizing the navigation information collected by the navigation module based on the navigation assistance information (a matching sketch follows this item).
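The check for a target object reduces to comparing identifiers read from the sign with identifiers on the planned route; a minimal sketch is shown below, with the normalization rule being an illustrative assumption.

```python
def find_target_object(indicated_objects, navigation_objects):
    """Return the first indicated object (read from the traffic sign) whose
    identification matches a navigation object on the planned route, or None."""
    def norm(identifier: str) -> str:
        return identifier.strip().lower()

    route_ids = {norm(obj) for obj in navigation_objects}
    for indicated in indicated_objects:
        if norm(indicated) in route_ids:
            return indicated
    return None

# e.g. the sign lists two exits, and the planned route uses "Exit 12B"
print(find_target_object(["Exit 12A", "Exit 12B"], ["exit 12b", "G4 Expressway"]))
```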
  • the processing module 51 is specifically configured to sum the first distance and the second distance in the navigation assistance information to obtain the navigation distance for the mobile platform to move to the target object; parse the steering information in the navigation assistance information to obtain the navigation direction in which the mobile platform moves to the target object; and generate target navigation information corresponding to the mobile platform moving to the target object, the target navigation information including the navigation distance that the mobile platform moves to the target object and the navigation direction in which the mobile platform moves to the target object (a generation sketch follows this item).
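A minimal sketch of generating the target navigation information from the two distances and the steering information; the steering-code mapping and the field names are illustrative assumptions.

```python
STEERING_TO_DIRECTION = {  # illustrative mapping of steering codes to directions
    "keep_right": "right",
    "keep_left": "left",
    "straight": "straight",
}

def generate_target_navigation_info(first_distance_m: float,
                                    second_distance_m: float,
                                    steering_code: str) -> dict:
    """Build the target navigation information: the navigation distance is the
    sum of the platform-to-sign distance and the sign-to-object distance, and
    the navigation direction is parsed from the steering information."""
    return {
        "navigation_distance_m": first_distance_m + second_distance_m,
        "navigation_direction": STEERING_TO_DIRECTION.get(steering_code, "unknown"),
    }

# 35 m to the sign plus 1500 m announced on the sign -> 1535 m, direction "right"
print(generate_target_navigation_info(35.0, 1500.0, "keep_right"))
```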
  • the navigation assistance information further includes lane information analyzed and determined from the collected lane signs, and the generated target navigation information further includes lane indication information determined based on the lane information and the navigation direction; the lane indication information is used to instruct the mobile platform to move into a target lane that matches the navigation direction (a lane-selection sketch follows this item).
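Lane indication can be illustrated as picking the lane whose allowed maneuvers (as read from a lane sign) include the navigation direction; the `lane_info` structure below is an illustrative assumption.

```python
def select_target_lane(lane_info: dict, navigation_direction: str):
    """Pick the lane whose allowed maneuvers include the navigation direction.
    `lane_info` maps lane indices to lists of allowed maneuvers, e.g.
    {1: ['left'], 2: ['straight'], 3: ['straight', 'right']}."""
    for lane_index, maneuvers in sorted(lane_info.items()):
        if navigation_direction in maneuvers:
            return lane_index
    return None

print(select_target_lane({1: ["left"], 2: ["straight"], 3: ["straight", "right"]},
                         "right"))  # -> 3
```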
  • the processing module 51 is further configured to calculate the traveled distance of the mobile platform based on a visual odometer, and optimize the target navigation information according to the traveled distance (a visual-odometry sketch follows this item).
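A very small monocular visual-odometry sketch for the traveled distance, using optical-flow tracking, essential-matrix estimation, and pose recovery from OpenCV. Monocular odometry is scale-ambiguous, so a constant external scale factor is assumed here for simplicity; a real system would estimate scale per frame (e.g. from the known camera height) or fuse wheel odometry.

```python
import cv2
import numpy as np

def accumulate_traveled_distance(gray_frames, camera_matrix, step_scale_m):
    """Tiny monocular visual-odometry sketch: track features between
    consecutive frames and recover the relative camera motion. Because the
    translation from recoverPose is unit length, each successfully recovered
    frame-to-frame motion contributes `step_scale_m` metres of travel."""
    traveled_m = 0.0
    prev = gray_frames[0]
    for curr in gray_frames[1:]:
        prev_pts = cv2.goodFeaturesToTrack(prev, maxCorners=500,
                                           qualityLevel=0.01, minDistance=10)
        if prev_pts is None:
            prev = curr
            continue
        curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, prev_pts, None)
        good = status.ravel() == 1
        p_prev, p_curr = prev_pts[good], curr_pts[good]
        if len(p_prev) >= 8:
            E, mask = cv2.findEssentialMat(p_curr, p_prev, camera_matrix,
                                           method=cv2.RANSAC, threshold=1.0)
            if E is not None and E.shape == (3, 3):
                _, _, t, _ = cv2.recoverPose(E, p_curr, p_prev, camera_matrix,
                                             mask=mask)
                traveled_m += step_scale_m * float(np.linalg.norm(t))  # norm(t) == 1
        prev = curr
    return traveled_m
```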
  • FIG. 6 is a schematic block diagram of a navigation processing device according to an embodiment of the present invention.
  • the navigation processing device is configured on a mobile platform and is used to navigate the mobile platform.
  • the mobile platform includes a navigation module and a vision module
  • the navigation processing device may include a processor 60, a communication interface 61, and a memory 62.
  • the processor 60, the communication interface 61, and the memory 62 are connected through a bus.
  • the memory 62 is used to store program instructions and image data (such as target images).
  • the memory 62 may include volatile memory, such as random-access memory (RAM); the memory 62 may also include non-volatile memory, such as flash memory or a solid-state drive (SSD); the memory 62 may also be a double data rate synchronous dynamic random-access memory (DDR SDRAM); the memory 62 may also include a combination of the above types of memory.
  • the memory 62 is used to store a computer program, and the computer program includes program instructions.
  • the processor 60 is configured, when the program instructions are invoked, to: call the vision module to collect a target image of the environment where the mobile platform is currently located, the target image including an image area of a traffic sign; determine navigation assistance information of the mobile platform according to the target image; and optimize the navigation information of the navigation module based on the navigation assistance information to obtain target navigation information of the mobile platform.
  • the processor 60 is further configured to perform identification processing on the image area to determine the feature information corresponding to the image area, the feature information including shape information and/or color information of the image area; determine the type of the target traffic sign included in the image area according to the feature information; and, if the type of the target traffic sign is a navigation type, trigger execution of the step of determining the navigation assistance information of the mobile platform according to the target image.
  • the characteristic information includes the color information, and the processor 60 is specifically configured to determine whether the color indicated by the color information is a preset sign color; if so, determine that the type of the target traffic sign included in the image area is a target type, the target type being associated with the preset sign color.
  • the feature information includes the color information and the shape information, and the processor 60 is specifically configured to detect whether the color indicated by the color information is a preset sign color and whether the shape of the edge of the image area is a preset sign shape; if so, determine that the type of the target traffic sign included in the image area is the target type, the target type being associated with both the preset sign color and the preset sign shape.
  • the processor 60 is specifically configured to perform feature point matching between the image content of the image area and the standard images in the traffic sign library; determine, based on the feature point matching result, a target standard image in the traffic sign library, the target standard image being associated with the target traffic sign included in the image area; and determine the navigation assistance information of the mobile platform according to the semantic information of the target standard image.
  • the processor 60 is specifically configured to determine the initial navigation direction based on the semantic information of the target standard image; perform text recognition on the image content of the image area to determine the road instruction information indicated by the image content; and determine the navigation assistance information of the mobile platform based on the initial navigation direction and the road instruction information.
  • the navigation assistance information includes steering information and distance information; the distance information is calculated from a first distance and from a second distance obtained from the road instruction information, where the first distance refers to the distance between the mobile platform and the target traffic sign.
  • the processor 60 is specifically configured to determine the contact point between the target traffic sign and the ground in the target image, determine a reference angle according to the position information of the contact point in the target image, and further calculate the distance between the mobile platform and the target traffic sign based on the reference height of the vision module and the reference angle.
  • the navigation assistance information further includes identification information of at least one indicated object indicated by the target traffic sign, and the processor 60 is further configured to obtain navigation information from the navigation module, the navigation information including identification information of at least one navigation object corresponding to the navigation route of the mobile platform; determine, based on the navigation information and the navigation assistance information, whether a target object exists among the at least one indicated object, the identification information of the target object matching the identification information of the navigation object; and, if the target object exists, trigger execution of optimizing the navigation information collected by the navigation module based on the navigation assistance information.
  • the processor 60 is specifically configured to sum the first distance and the second distance in the navigation assistance information to obtain the navigation distance for the mobile platform to move to the target object; parse the steering information in the navigation assistance information to obtain the navigation direction in which the mobile platform moves to the target object; and generate target navigation information corresponding to the mobile platform moving to the target object, the target navigation information including the navigation distance that the mobile platform moves to the target object and the navigation direction in which the mobile platform moves to the target object.
  • the navigation assistance information further includes lane information analyzed and determined from the collected lane signs, and the generated target navigation information further includes lane indication information determined based on the lane information and the navigation direction; the lane indication information is used to instruct the mobile platform to move into a target lane that matches the navigation direction.
  • the processor 60 is further configured to calculate the traveled distance of the mobile platform based on a visual odometer, and optimize the target navigation information according to the traveled distance (a sketch of how these pieces fit together on the device follows this item).
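To close the FIG. 6 description, the sketch below shows one way the device-level flow could be wired together: capture a target image via the vision module, derive navigation assistance information from it, and use that to refine the navigation module's output. The class name, field names, and injected callables are illustrative assumptions, not elements recited by the embodiment.

```python
from typing import Callable, Optional

class NavigationProcessingDevice:
    """Minimal sketch of the device flow: pull a target image from the vision
    module, derive navigation assistance information from it, and use it to
    refine the navigation module's output. The three callables are injected,
    since the concrete modules are hardware-specific."""

    def __init__(self,
                 capture_image: Callable[[], object],
                 derive_assistance: Callable[[object], Optional[dict]],
                 read_navigation: Callable[[], dict]):
        self.capture_image = capture_image          # vision module
        self.derive_assistance = derive_assistance  # image -> assistance info
        self.read_navigation = read_navigation      # navigation module

    def target_navigation_info(self) -> dict:
        navigation = dict(self.read_navigation())
        assistance = self.derive_assistance(self.capture_image())
        if assistance and "navigation_distance_m" in assistance:
            # Prefer the vision-derived distance/direction when available.
            navigation["navigation_distance_m"] = assistance["navigation_distance_m"]
            navigation["navigation_direction"] = assistance.get(
                "navigation_direction", navigation.get("navigation_direction"))
        return navigation

# Usage with stub modules:
device = NavigationProcessingDevice(
    capture_image=lambda: "frame",
    derive_assistance=lambda img: {"navigation_distance_m": 1535.0,
                                   "navigation_direction": "right"},
    read_navigation=lambda: {"navigation_distance_m": 1600.0,
                             "navigation_direction": "right"})
print(device.target_navigation_info())
```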
  • the storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), or a random-access memory (RAM), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

The invention relates to a navigation processing method and apparatus and a navigation processing device, the method being used for navigating a mobile platform (10), the mobile platform (10) comprising a navigation module (101) and a vision module (100), and the method comprising: calling the vision module (100) to collect target images of the environment in which the mobile platform (10) is currently located; determining navigation assistance information of the mobile platform (10) according to the target images; and optimizing the navigation information of the navigation module (101) according to the navigation assistance information to obtain target navigation information of the mobile platform (10). Images can thus be incorporated to assist the navigation of the navigation module (101), which helps improve the navigation accuracy of the mobile platform (10).
PCT/CN2018/119616 2018-12-06 2018-12-06 Procédé et appareil de traitement de navigation et dispositif de traitement de navigation WO2020113528A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/119616 WO2020113528A1 (fr) 2018-12-06 2018-12-06 Procédé et appareil de traitement de navigation et dispositif de traitement de navigation
CN201880065696.2A CN111213031A (zh) 2018-12-06 2018-12-06 一种导航处理方法、装置及导航处理设备

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/119616 WO2020113528A1 (fr) 2018-12-06 2018-12-06 Procédé et appareil de traitement de navigation et dispositif de traitement de navigation

Publications (1)

Publication Number Publication Date
WO2020113528A1 true WO2020113528A1 (fr) 2020-06-11

Family

ID=70790034

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/119616 WO2020113528A1 (fr) 2018-12-06 2018-12-06 Procédé et appareil de traitement de navigation et dispositif de traitement de navigation

Country Status (2)

Country Link
CN (1) CN111213031A (fr)
WO (1) WO2020113528A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113327447B (zh) * 2021-07-20 2022-08-19 北京百度网讯科技有限公司 导航提醒方法、装置、设备、车辆及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102789693A (zh) * 2012-08-10 2012-11-21 深圳市路畅科技股份有限公司 一种道路标志牌自动识别方法及车载识别装置
CN103292804A (zh) * 2013-05-27 2013-09-11 浙江大学 一种单目自然视觉路标辅助的移动机器人定位方法
CN105393087A (zh) * 2013-07-15 2016-03-09 奥迪股份公司 用于运行导航设备的方法、导航设备和机动车
CN105809095A (zh) * 2014-12-31 2016-07-27 博世汽车部件(苏州)有限公司 交通路口通行状态的确定
US20180039281A1 (en) * 2016-08-04 2018-02-08 Hon Hai Precision Industry Co., Ltd. Autonomous mobile device and method of forming guiding path
CN108254776A (zh) * 2017-12-25 2018-07-06 东风汽车集团有限公司 基于路沿荧光反射和双目相机的隧道定位系统及方法

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103729837A (zh) * 2013-06-25 2014-04-16 长沙理工大学 一种单个路况摄像机的快速标定方法
CN104792312A (zh) * 2014-01-20 2015-07-22 广东工业大学 以定距三球为视觉标志物的室内自动运输车定位系统
CN105955259B (zh) * 2016-04-29 2019-04-23 南京航空航天大学 基于多窗口实时测距的单目视觉agv的精确定位方法
CN106092114A (zh) * 2016-06-22 2016-11-09 江苏大学 一种图像识别的汽车实景导航装置及方法
CN106651953B (zh) * 2016-12-30 2019-10-18 山东大学 一种基于交通指示牌的车辆位姿估计方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102789693A (zh) * 2012-08-10 2012-11-21 深圳市路畅科技股份有限公司 一种道路标志牌自动识别方法及车载识别装置
CN103292804A (zh) * 2013-05-27 2013-09-11 浙江大学 一种单目自然视觉路标辅助的移动机器人定位方法
CN105393087A (zh) * 2013-07-15 2016-03-09 奥迪股份公司 用于运行导航设备的方法、导航设备和机动车
CN105809095A (zh) * 2014-12-31 2016-07-27 博世汽车部件(苏州)有限公司 交通路口通行状态的确定
US20180039281A1 (en) * 2016-08-04 2018-02-08 Hon Hai Precision Industry Co., Ltd. Autonomous mobile device and method of forming guiding path
CN108254776A (zh) * 2017-12-25 2018-07-06 东风汽车集团有限公司 基于路沿荧光反射和双目相机的隧道定位系统及方法

Also Published As

Publication number Publication date
CN111213031A (zh) 2020-05-29

Similar Documents

Publication Publication Date Title
US11959771B2 (en) Creation and use of enhanced maps
EP3673407B1 (fr) Détection de dissimulation automatique dans des données de réseau routier
CN110174093B (zh) 定位方法、装置、设备和计算机可读存储介质
CN106352867B (zh) 用于确定车辆自身位置的方法和设备
US8379923B2 (en) Image recognition processing device, method, and program for processing of image information obtained by imaging the surrounding area of a vehicle
KR101241651B1 (ko) 영상 인식 장치 및 그 방법, 영상 기록 장치 또는 그방법을 이용한 위치 판단 장치, 차량 제어 장치 및네비게이션 장치
JP4513740B2 (ja) 経路案内システム及び経路案内方法
US20160347327A1 (en) Autonomous vehicle driving assist system, method, and program
WO2015096717A1 (fr) Procédé et dispositif de positionnement
US10942519B2 (en) System and method for navigating an autonomous driving vehicle
CN111710159B (zh) 一种基于虚拟车道线的交叉路口车辆路径规划方法和装置
WO2021056841A1 (fr) Procédé de positionnement, procédé et appareil de détermination de trajet, robot et support de stockage
CN102208012A (zh) 风景匹配参考数据生成系统及位置测量系统
JP2006177862A (ja) ナビゲーション装置
WO2022088722A1 (fr) Procédé de navigation, appareil, dispositif de pilotage intelligent et support d'enregistrement
WO2022001618A1 (fr) Procédé et appareil de commande de suivi de voie, et système associé pour véhicule
JP2018200501A (ja) 車線情報出力方法および車線情報出力装置
KR20200071792A (ko) 로드뷰 또는 항공뷰 맵 정보를 이용한 자율주행 방법 및 그 시스템
JP2020060369A (ja) 地図情報システム
JP2014067165A (ja) 運転支援装置
WO2023179028A1 (fr) Procédé et appareil de traitement d'image, dispositif et support de stockage
CN114518122A (zh) 行车导航方法、装置、计算机设备、存储介质和计算机程序产品
US20230018996A1 (en) Method, device, and computer program for providing driving guide by using vehicle position information and signal light information
CN112781600A (zh) 一种车辆导航方法、装置及存储介质
JP2012215442A (ja) 自位置特定システム、自位置特定プログラム及び自位置特定方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18942628

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18942628

Country of ref document: EP

Kind code of ref document: A1