CN108981726A - Unmanned-vehicle semantic map modeling, construction and application method based on perception-positioning monitoring - Google Patents

Unmanned-vehicle semantic map modeling, construction and application method based on perception-positioning monitoring

Info

Publication number
CN108981726A
CN108981726A (application CN201810590558.8A)
Authority
CN
China
Prior art keywords
map
unmanned vehicle
entity
barrier
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810590558.8A
Other languages
Chinese (zh)
Inventor
项卫锋
季彩玲
王池如
王晓彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Yufeng Intelligent Technology Co Ltd
Original Assignee
Anhui Yufeng Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Yufeng Intelligent Technology Co Ltd filed Critical Anhui Yufeng Intelligent Technology Co Ltd
Priority to CN201810590558.8A
Publication of CN108981726A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30: Map- or contour-matching
    • G01C21/32: Structuring or formatting of map data

Abstract

The invention discloses an unmanned-vehicle semantic map modeling, construction and application method based on perception-positioning monitoring, and relates to the field of unmanned driving technology. The unmanned-vehicle semantic map model contains two conceptual modules: entities and attributes. Entities include the ego-vehicle entity, road-network entities and obstacle entities; road-network entities include area entities and point entities; attributes include point coordinates, area extents and constraints. Semantic relations are divided into two parts: object properties and data properties. By constructing a layered set of map-element data suited to unmanned vehicles and designing sufficient semantic relations between the map elements, the invention makes it convenient to generate a semantic map; semantic reasoning over the semantic map, the globally planned path, the current pose of the unmanned vehicle and real-time information about surrounding obstacles yields local scene information for the unmanned vehicle, assists the unmanned vehicle in behavioral decision-making, efficiently achieves the unmanned vehicle's understanding of driving-scene elements, and improves the efficiency of associative retrieval of map elements.

Description

Unmanned-vehicle semantic map modeling, construction and application method based on perception-positioning monitoring
Technical field
The present invention relates to the field of unmanned driving technology, and more particularly to an unmanned-vehicle semantic map modeling, construction and application method based on perception-positioning monitoring.
Background art
In recent years, unmanned driving has attracted extensive attention from academia and industry at home and abroad, and the related supporting technologies have developed rapidly. In terms of application products, unmanned products can generally be divided into two broad classes: products applied in industrial production and products applied in personal consumption. In terms of system composition, an unmanned driving system can generally be divided into sub-modules such as environment perception, decision and planning, and motion control. Environment perception acquires real-time scene information of the traffic environment through various sensors and builds an environment model (i.e., a perception map); decision and planning, on the basis of the environment model, makes behavioral decisions that comply with traffic rules and safety requirements, together with a corresponding obstacle-avoidance trajectory; motion control converts the planned trajectory into the discrete control commands that the unmanned vehicle actually needs to execute, such as throttle, brake and steering-wheel angle, and sends them to the vehicle's actuation system for execution, realizing autonomous driving behavior. Environment perception serves as the "eyes" of the unmanned vehicle; its content includes autonomous localization, road recognition and detection of static and dynamic obstacles, among which autonomous localization is the key.
Current implementation methods for autonomous localization and navigation in unmanned driving include: early track-following navigation based on magnetic strips or colored tapes, outdoor satellite navigation, simultaneous localization and mapping (SLAM) navigation, and digital-map navigation. In the unmanned-driving field at home and abroad, navigation based on high-precision digital maps is currently the mainstream technology for specified scenes.
Unmanned transfer robots are an important component of industrial autonomous vehicles. By application scene, current unmanned transfer robots can be divided into indoor unmanned transfer robots and outdoor transfer robots. Localization is the core technology of mobile robots; the main localization and navigation principles of indoor unmanned transfer robots include laser-beacon navigation, laser SLAM navigation, visual navigation and magnetic-strip or colored-tape navigation. These technologies basically cover most indoor scenes and have achieved good results. For outdoor unmanned transfer robots, current products at major ports are still mainly based on magnetic-guidance technology; the maturity of outdoor laser and visual navigation is low, and most such systems cannot yet be commercialized. For outdoor transfer scenes, we propose using a multi-line laser radar as the core navigation sensor to construct a three-dimensional high-precision beacon map of a large scene. It should be pointed out that such a high-precision map not only satisfies the localization and navigation requirements of unmanned transfer robots but also satisfies indoor localization and navigation requirements, thereby addressing the unmanned transfer needs of indoor, outdoor and mixed indoor-outdoor scenes.
Existing Chinese patent publication CN104535070A (application No. 20141083873.5) provides a high-precision map data structure and an acquisition and processing system and method, in which the map data structure is divided into four layers: road network, lane network, lane-line information and specific information data. Although database-level associations are defined between these layers, the lack of semantic information makes it difficult for an unmanned vehicle to establish, within this map data structure, complete semantic relations between the various map elements and traffic participants, to discriminate its real-time scene information, and to realize scene understanding. Meanwhile, information such as intersections and U-turns is difficult to express in its data structure, and the association between lane lines and lanes is not precise enough; for example, if a certain road section changes from two lanes to three lanes, the middle lane of the three and its lane lines cannot be expressed.
Existing Chinese patent publication CN104089619A (application No. 201410202876.4) provides a precise GPS navigation-map matching system for a driverless car and its operating method, which completes the precise matching of the navigation map through the process of determining an initial point from road information, obtaining vehicle position information, matching the information and screening it. However, its matching method mainly searches over discrete points and does not make use of the associations between map elements, which leads to low matching efficiency.
The main problems of the prior art are that the data structure of high-precision maps is not convenient for unmanned-vehicle scene understanding, and that the associations between map elements are not well exploited, resulting in relatively low map-retrieval efficiency. How to efficiently achieve the unmanned vehicle's understanding of driving-scene elements and improve the efficiency of associative retrieval of map elements has become the problem to be solved.
Summary of the invention
In order to solve the above technical problems, the present invention provides an unmanned-vehicle semantic map modeling, construction and application method based on perception-positioning monitoring, so as to efficiently achieve the unmanned vehicle's understanding of driving-scene elements and improve the efficiency of associative retrieval of map elements.
In order to solve the above technical problems, the present invention is achieved by the following technical solutions:
The present invention provides an unmanned-vehicle semantic map modeling method based on perception-positioning monitoring, which includes modeling of a beacon map, construction of the beacon map, definition of map elements and autonomous localization based on the beacon map, and includes semantic map modeling for the unmanned vehicle.
The unmanned-vehicle semantic map model contains two conceptual modules: entities and attributes.
Entities include the ego-vehicle entity, road-network entities and obstacle entities. Road-network entities include area entities and point entities. Area entities include the whole road, connection points, boundaries, road separators, special areas, pedestrian crossings, lane lines, lanes and road sections; connection areas include intersections, U-turn areas and places where the number of lanes increases or decreases. Point entities include ground markings, roadside signs and stop lines. Obstacle entities include dynamic obstacles, static obstacles, traffic-facility obstacles, pedestrians, animals, vehicles, natural obstacles and road-blocking obstacles. Attributes include point coordinates, area extents and constraints: the point coordinate in the attributes is the point coordinate of a map element, the area extent in the attributes is the area extent of a map element, and the constraint in the attributes is the constraint type between map elements.
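By way of illustration only (this sketch is not part of the claimed method, and every class and field name below is chosen purely for this example), the concept structure above could be held in code roughly as follows, with one class per entity family and the attribute concepts expressed as simple types:

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List, Optional, Tuple

# Attribute concepts: point coordinates, area extents and constraint types.
Point = Tuple[float, float]          # point coordinate of a map element
AreaExtent = List[Point]             # polygonal extent of an area entity

class ConnectionConstraint(Enum):    # connection-direction constraints between road sections
    LEFT_TURN = auto()
    RIGHT_TURN = auto()
    U_TURN = auto()
    STRAIGHT = auto()

# Entity concepts: ego vehicle, road-network entities (area / point) and obstacles.
@dataclass
class MapEntity:
    entity_id: str
    entity_name: str

@dataclass
class AreaEntity(MapEntity):         # whole road, connection point, lane, road section, ...
    extent: AreaExtent = field(default_factory=list)

@dataclass
class PointEntity(MapEntity):        # ground marking, roadside sign, stop line
    coordinate: Point = (0.0, 0.0)

@dataclass
class ObstacleEntity(MapEntity):     # pedestrian, animal, vehicle, natural or road-blocking obstacle
    kind: str = "static"
    speed: float = 0.0
    heading: Optional[float] = None
```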
The unmanned-vehicle semantic map model also contains the semantic relations between the corresponding concepts of the two conceptual modules. Semantic relations are divided into two parts: object properties and data properties. The object-property part includes the inheritance relations and association relations between the corresponding concepts, that is, establishing the hierarchical relations between the corresponding concepts and establishing the association relations between the corresponding concepts. The data-property part includes the global path-planning information of the ego vehicle.
The ego-vehicle entity is the unmanned vehicle itself or an unmanned-vehicle entity of the corresponding type; a road section contains several lanes in the same direction; a ground marking is a traffic marking on the road surface; a roadside sign is a traffic sign at the roadside.
Natural obstacles include concave road-surface obstacles and convex road-surface obstacles; road-blocking obstacles include fault signs, traffic cones, water-filled barriers, dividing lines and construction signs.
Constraints include connection constraints, which are the connection-direction constraints between road sections; connection constraints include left-turn connection constraints, right-turn connection constraints, U-turn connection constraints and straight-ahead connection constraints.
The association relations between the corresponding concepts include: the composition relations between the whole road and its road separators, road sections and connection points; the connection relation between a road section and a connection point; the positional relation between a road section and a road separator; the positional relation between a road section and a pedestrian crossing; the positional relation between a road section and a stop line; the positional relation between a road section and a boundary; the relation between a road section and its lanes; the relation between a road section and roadside signs; the existence relation between a connection point and its connection constraints; the relation between a connection point and a pedestrian crossing; the positional relation between a lane and its lane lines; the orientation relation between a lane and other lanes; the positional relation between a lane and a special area; the relation between a lane and ground markings; the relation between a connection constraint and the road sections that state its connection direction; the orientation relation between the ego vehicle and obstacle entities; the positional relation between the ego vehicle and a lane; the relation between an area entity and its area extent; and the relation between a point entity and its point coordinate.
The data-property part includes: the current speed of the ego vehicle; the distances from the ego vehicle to the next connection point, pedestrian crossing and stop line to be reached; the distance from the ego vehicle to an obstacle entity; the current speed and pose of an obstacle entity; the data of a point coordinate; the data of an area extent; the speed-limit information of a lane, the permitted directions of a lane, flags indicating whether a lane is the leftmost or rightmost lane, and the lane width; the number of lanes contained in a road section; the type information of the whole road; and the basic attributes of the corresponding entity concepts.
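Again purely as an illustration under assumed identifiers, object properties can be held as typed links between two entities and data properties as literal values attached to one entity. The entity numbers in the example rows follow the embodiments described later (road section 003, lanes 003/004, lane line 002); the numeric values are placeholders:

```python
# Object properties link two entities; data properties attach literal values to one entity.
from dataclasses import dataclass
from typing import Union

@dataclass
class ObjectProperty:            # e.g. "has lane", "associated connection point", "left lane line"
    name: str
    subject_id: str
    object_id: str

@dataclass
class DataProperty:              # e.g. "lane maximum speed", "distance to stop line", "ego real-time speed"
    name: str
    subject_id: str
    value: Union[float, int, str, bool]

# A semantic map can then be held as a set of entities plus these two relation tables.
object_properties = [
    ObjectProperty("has lane", "section_003", "lane_004"),
    ObjectProperty("has left lane line", "lane_003", "lane_line_002"),
]
data_properties = [
    DataProperty("lane maximum speed", "lane_004", 40.0),    # placeholder value
    DataProperty("ego real-time speed", "ego_vehicle", 3.5), # placeholder value
]
```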
Construction method of the unmanned-vehicle semantic map based on perception-positioning monitoring: the semantic map is generated by instantiating static map data and real-time obstacles. The specific steps are as follows:
In the first step, the detailed data of the real driving environment is obtained through the perception system, and the detailed map data is instantiated into static road-network entities according to the map concept structure;
In the second step, real-time obstacle pose information is obtained through sensors, and the obstacle information is instantiated into obstacle map entities;
In the third step, the mutual semantic relations between the entities in the static map and the obstacle map obtained in the first and second steps are established, yielding the semantic map for the unmanned vehicle.
The perception system uses a laser radar, a camera, GPS, a photographic monitoring satellite or a corresponding sensing-device system; the sensors use a laser radar, a camera, GPS or a corresponding sensing-device system.
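A minimal sketch of the three construction steps follows, assuming a plain dictionary representation of entities and (name, subject, object) relation triples; the function and key names are illustrative and not prescribed by the method:

```python
from typing import Dict, List, Tuple

Relation = Tuple[str, str, str]          # (relation name, subject id, object id)

def assemble_semantic_map(
    static_entities: Dict[str, dict],    # road-network entities keyed by id (step 1)
    obstacle_entities: Dict[str, dict],  # real-time obstacle entities keyed by id (step 2)
    relations: List[Relation],           # mutual semantic relations between entities (step 3)
) -> dict:
    """Merge the static map, the obstacle map and their semantic relations."""
    return {
        "entities": {**static_entities, **obstacle_entities},
        "relations": list(relations),
    }

# Usage: a single lane and a detected vehicle, linked by an orientation relation.
semantic_map = assemble_semantic_map(
    {"lane_004": {"type": "lane", "max_speed": 40.0}},
    {"obstacle_vehicle_002": {"type": "vehicle", "speed": 0.0, "heading": "same direction"}},
    [("front obstacle", "ego_vehicle", "obstacle_vehicle_002")],
)
```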
Application method of the unmanned-vehicle semantic map based on perception-positioning monitoring: semantic reasoning over the semantic map, the globally planned path, the current pose of the unmanned vehicle and real-time information about surrounding obstacles yields local scene information for the unmanned vehicle, realizes scene understanding by the unmanned vehicle and assists the unmanned vehicle in decision-making. The specific steps are as follows:
In the first step, the target travel path of the unmanned vehicle is obtained through the global planning system of the unmanned vehicle, and the current pose of the unmanned vehicle is obtained in real time through a GPS/INS positioning and orientation system;
In the second step, information about surrounding obstacles is perceived in real time through the environment-perception system of the unmanned vehicle, and their poses relative to the unmanned vehicle are obtained through semantic reasoning;
In the third step, semantic reasoning over the semantic map, the globally planned path, the current pose of the unmanned vehicle and the relative poses of the surrounding obstacles yields local scene information for the unmanned vehicle;
In the fourth step, the unmanned vehicle is assisted in making different decisions according to the corresponding scene information.
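As an illustrative sketch of the third and fourth steps, assuming the relation-triple representation above and an assumed safety-gap threshold, local scene information and a stop/proceed decision could be derived as follows; the distance of 7 m mirrors the example given in the embodiments described later:

```python
from typing import Dict, List, Tuple

Relation = Tuple[str, str, str]   # (relation name, subject id, object id)

def local_scene(relations: List[Relation],
                distances: Dict[str, float],
                safe_gap: float = 10.0) -> dict:
    """Derive a local scene description for the ego vehicle from the semantic map
    and decide whether it should stop (safe_gap is an assumed threshold)."""
    front = [obj for (name, subj, obj) in relations
             if subj == "ego_vehicle" and name == "front obstacle"]
    blocked = any(distances.get(o, float("inf")) < safe_gap for o in front)
    return {"front_obstacles": front, "decision": "stop" if blocked else "proceed"}

# Usage: a stationary vehicle 7 m ahead leads to a stop decision.
scene = local_scene(
    relations=[("front obstacle", "ego_vehicle", "obstacle_vehicle_002")],
    distances={"obstacle_vehicle_002": 7.0},
)
print(scene["decision"])   # -> "stop"
```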
Compared with the prior art, the present invention has the following beneficial effects:
By constructing a layered set of map-element data suited to unmanned vehicles and designing sufficient semantic relations between the map elements, the present invention makes it convenient to generate a semantic map; semantic reasoning over the semantic map, the globally planned path, the current pose of the unmanned vehicle and real-time information about surrounding obstacles yields local scene information for the unmanned vehicle, assists the unmanned vehicle in behavioral decision-making, efficiently achieves the unmanned vehicle's understanding of driving-scene elements, and improves the efficiency of associative retrieval of map elements.
Brief description of the drawings
Fig. 1 is a flow chart of the semantic map modeling and application for unmanned vehicles according to the present invention;
Fig. 2 is a concept-hierarchy diagram of the semantic map elements;
Fig. 3 is an inclusion-relation diagram of the semantic map elements;
Fig. 4 is a concept-association diagram of the semantic map elements;
Fig. 5 is a diagram of the orientation relations between the unmanned vehicle and obstacles;
Fig. 6 is a schematic diagram of the semantic map generation process;
Fig. 7 is a schematic diagram of part of the implementation of a whole road in the semantic map;
Fig. 8 is a schematic diagram of part of the implementation of the ego vehicle in the semantic map;
Fig. 9 is a schematic diagram of part of the semantic reasoning.
Specific embodiments
In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further elaborated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to illustrate the present invention and are not intended to limit it.
Specific embodiment one:
As shown in Fig. 1 and Fig. 2, this embodiment provides a modeling method for the semantic map, including the concept structure of the semantic map, the semantic relations, and the method of instantiating a real map to generate the semantic map.
As shown in Fig. 3, the ontology is divided into two main modules, entities and attributes:
(1) Entities include the ego-vehicle (unmanned-vehicle) entity, road-network entities and obstacle entities, which respectively represent the ego vehicle, road-network element entities and obstacle entities.
(11) The ego vehicle refers to the unmanned vehicle itself and can be extended to different types of unmanned vehicles as required.
(12) Road-network entities include area entities and point entities, which respectively represent area-type entities and point-type entities.
(121) Area entities include the whole road, connection points, boundaries, road separators, special areas, pedestrian crossings, lane lines, lanes and road sections. The whole road represents the entirety of a road, including its connection points, road sections, boundaries and road separators; connection areas include intersections, U-turn areas and places where the number of lanes increases or decreases; a road section contains multiple lanes in the same direction.
(122) Point entities include ground markings, roadside signs and stop lines, which respectively represent ground traffic markings, roadside traffic signs and stop lines (a stop line has a one-to-one relationship with a road section and can therefore be simplified to a point).
(13) Obstacle entities include dynamic obstacles, static obstacles, traffic-facility obstacles, pedestrians, animals, vehicles, natural obstacles and road-blocking obstacles. Natural obstacles include concave road-surface obstacles (e.g., puddles) and convex road-surface obstacles (e.g., large stones). Road-blocking obstacles include fault signs, traffic cones, water-filled barriers, dividing lines and construction signs.
(2) Attributes include point coordinates, area extents and constraints, which respectively represent the point coordinate of a map element, the area extent of a map element and the constraint types between map elements. Constraints include connection constraints, which represent the connection-direction constraints between road sections. Connection constraints include left-turn connection constraints, right-turn connection constraints, U-turn connection constraints and straight-ahead connection constraints.
As shown in Fig. 4, the semantic relations of the semantic map are the semantic relations between the concepts defined above. Semantic relations are divided into two parts, object properties and data properties:
(1) The object-property part includes the inheritance relations (generalization/specialization) and the association relations between different concepts.
(11) The hierarchical relations between the different concepts have been described above.
(12) The association relations between different concepts include: the composition relations between the whole road and its road separators, road sections and connection points (relation names: has road separator, has road section, has connection point); the connection relation between a road section and a connection point (relation name: associated connection point); the positional relation between a road section and a road separator (relation name: associated road separator); the positional relation between a road section and a pedestrian crossing (relation name: associated pedestrian crossing); the positional relation between a road section and a stop line (relation name: associated stop line); the positional relation between a road section and a boundary (relation name: associated boundary); the relation between a road section and its lanes (relation name: has lane); the relation between a road section and roadside signs (relation name: has roadside sign); the existence relation between a connection point and its connection constraints (relation name: has connection constraint); the relation between a connection point and a pedestrian crossing (relation name: has pedestrian crossing); the positional relations between a lane and its lane lines (relation names: has left lane line, has right lane line); the orientation relations between a lane and other lanes (relation names: left lane in the same direction, right lane in the same direction); the positional relation between a lane and a special area (relation name: has special area); the relation between a lane and ground markings (relation name: has ground marking); the relation between a connection constraint and the road sections stating its connection direction (relation names: start road section, target road section); the orientation relations between the ego vehicle and obstacle entities (orientations as shown in Fig. 5; relation names: left-rear obstacle, rear obstacle, right-rear obstacle, left-front obstacle, front obstacle, right-front obstacle, left obstacle, right obstacle); the positional relation between the ego vehicle and a lane (relation name: belonging lane); the relation between an area entity and its area extent (relation name: associated area extent); and the relation between a point entity and its point coordinate (relation name: associated point coordinate).
(2) The data-property part includes: the global path-planning information of the ego vehicle (property name: next-intersection turn) and its current speed (property name: ego real-time speed); the distances from the ego vehicle to the next connection point, pedestrian crossing and stop line to be reached (property names: distance to connection point, distance to pedestrian crossing, distance to stop line); the distance from the ego vehicle to an obstacle (property name: distance to obstacle); the current speed and pose of an obstacle entity (property names: obstacle speed, obstacle direction of motion); the data of a point coordinate (property name: point coordinate value); the data of an area extent (property name: area extent value); the speed-limit information of a lane (property names: lane maximum speed, lane minimum speed), the permitted directions of a lane (property name: lane intersection turn), flags indicating whether a lane is the leftmost or rightmost lane in its direction (property names: leftmost lane in the same direction, rightmost lane in the same direction) and the lane width (property name: lane width); the number of lanes contained in a road section (property name: number of lanes in section); the type information of the whole road (property name: whole-road type); and the basic attributes of each concept class (property names: entity ID, entity name).
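The eight ego-to-obstacle orientation relations listed above (Fig. 5) can be assigned, for example, by bucketing the bearing of the obstacle in the ego-vehicle frame into eight 45° sectors; the sector boundaries below are an assumption for illustration, the invention itself only names the relations:

```python
import math

SECTORS = ["front obstacle", "right-front obstacle", "right obstacle", "right-rear obstacle",
           "rear obstacle", "left-rear obstacle", "left obstacle", "left-front obstacle"]

def orientation_relation(ego_xy, ego_heading_rad, obstacle_xy) -> str:
    """Assign one of the eight orientation relations by sector of the relative bearing."""
    dx = obstacle_xy[0] - ego_xy[0]
    dy = obstacle_xy[1] - ego_xy[1]
    bearing = math.atan2(dy, dx) - ego_heading_rad          # obstacle bearing in the ego frame
    bearing = (bearing + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
    index = int(((-bearing + math.pi / 8) % (2 * math.pi)) // (math.pi / 4))
    return SECTORS[index]

# A stationary vehicle straight ahead of the ego vehicle maps to "front obstacle".
print(orientation_relation((0.0, 0.0), math.pi / 2, (0.0, 7.0)))   # -> "front obstacle"
```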
Specific embodiment two:
As shown in Fig. 6, the method of generating the semantic map by instantiating static map data and real-time obstacles comprises the following steps:
Step 1: obtain the detailed data of the real driving environment through perception systems such as a laser radar, a camera, GPS and satellite photographs, and instantiate the detailed map data into static road-network entities according to the map concept structure;
Step 2: obtain real-time obstacle pose information through sensors such as a laser radar, a camera and GPS, and instantiate the obstacle information into obstacle map entities;
Step 3: establish the mutual semantic relations between the entities in the static map and the obstacle map obtained in steps 1 and 2, and finally obtain the semantic map for the unmanned vehicle.
Specific embodiment three:
As shown in Fig. 7, which is a modeling example of a section of real map, the map contains a crossroads, a U-turn area, multiple road sections and other map elements. The key elements are all marked with arrows; only one ground marking and one roadside sign are labeled, as examples.
First, the detailed map data is obtained; then the detailed map data is divided into map elements of different categories according to the semantic map concept structure and instantiated into static road-network entities according to the aforementioned concept structure.
As shown in the figure, the transverse and longitudinal roads represent two whole-road entities; the crossroads entity is connection point 002 and the U-turn entity is connection point 001, and each road section is connected to the other road sections through connection points. The dashed arrows in the middle of the road represent connection-constraint entities associated with connection point 002; connection point 002 should have twelve connection-constraint entities here, each representing an existing connection relation of connection point 002 to a road section in a different direction, and only some of the connection-constraint entities are labeled. Other map elements such as lane lines, lanes, road separators and boundaries are all marked in Fig. 7.
The semantic relations between the map-element entities created above are then established: for example, road section 003 has lanes lane 003 and lane 004; the left lane line of lane 003 is lane line 002, and its left lane in the same direction is lane 004. Since the full set of associations is rather complex and not easy to describe in detail, the object properties and data properties of each entity are established one by one. Obstacle pose information is obtained in real time through the perception system and instantiated into obstacle map entities according to the aforementioned concept structure, and semantic relations are established between the obstacle entities and the static road-network entities. Finally, the static road-network entities obtained in the preceding steps, the real-time obstacle map entities and their associations are brought together to obtain the semantic map.
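Written as relation triples under the illustrative representation used earlier, the relations named in this embodiment would look as follows, and the associative retrieval of map elements reduces to simple lookups over these triples:

```python
# (relation name, subject, object) triples; the identifiers follow the entity numbering of Fig. 7.
relations = [
    ("has lane",                        "section_003", "lane_003"),
    ("has lane",                        "section_003", "lane_004"),
    ("has left lane line",              "lane_003",    "lane_line_002"),
    ("left lane in the same direction", "lane_003",    "lane_004"),
    ("associated connection point",     "section_003", "connection_point_002"),
]

# Associative retrieval, e.g. all lanes of road section 003:
lanes_of_003 = [obj for (name, subj, obj) in relations
                if name == "has lane" and subj == "section_003"]
print(lanes_of_003)   # -> ['lane_003', 'lane_004']
```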
Specific embodiment four:
As shown in Fig. 8, which contains all of the map semantic information of Fig. 7, the square pointed to by the ego-vehicle arrow represents the current position of the unmanned vehicle. The unmanned vehicle is currently travelling towards a connection point (a connection point may include areas such as an intersection, a U-turn and a place where the number of lanes changes). The current pose of the unmanned vehicle and the information about surrounding obstacles are obtained through real-time perception by the perception system, and the poses relative to the unmanned vehicle are obtained through semantic reasoning; on this basis, semantic reasoning over the semantic map, the globally planned path, the current pose of the unmanned vehicle and the relative poses of the surrounding obstacles yields local scene information for the unmanned vehicle, thereby assisting the unmanned vehicle in making behavioral decisions.
In Fig. 8, a front obstacle, vehicle 002, is found (distance to obstacle 7 m, obstacle speed 0, obstacle direction of motion the same as the ego vehicle), together with a right-front obstacle, vehicle 001 (distance to obstacle 15 m, obstacle speed 0, direction of motion the same), and a right obstacle, vehicle 003 (distance to obstacle 2 m, obstacle speed 0, direction of motion the same); it is therefore judged that the unmanned vehicle should stop. Meanwhile, Fig. 9 illustrates one piece of the reasoning process: according to the global path plan, the ego vehicle will turn left at the next intersection; the lane to which the ego vehicle belongs is lane 004, road section 003 has lane lane 004, its associated connection point is connection point 002, and connection point 002 has connection constraint 004 (concept class: left-turn connection constraint, parent class: connection constraint, start road section: section 003, target road section: section 008). Semantic reasoning can therefore predict that the next road section to be reached is section 008, and from section 008 the local map information around it can be obtained, helping the unmanned vehicle know in advance the local map information of the area it is about to reach (a sketch of this inference is given after the steps below). The specific steps are as follows:
Step 1: obtain the target travel path of the unmanned vehicle through the global planning system of the unmanned vehicle, and obtain the current pose of the unmanned vehicle in real time through the GPS/INS positioning and orientation system;
Step 2: perceive the information about surrounding obstacles in real time through the environment-perception system of the unmanned vehicle, and obtain their poses relative to the unmanned vehicle through semantic reasoning;
Step 3: perform semantic reasoning over the semantic map, the globally planned path, the current pose of the unmanned vehicle and the relative poses of the surrounding obstacles to obtain local scene information for the unmanned vehicle;
Step 4: assist the unmanned vehicle in making different decisions according to the different scene information.
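A minimal sketch of the inference described above; the triple layout and helper names are illustrative, and only the entity numbers and the left-turn constraint come from this embodiment:

```python
relations = [
    ("belonging lane",              "ego_vehicle",               "lane_004"),
    ("has lane",                    "section_003",               "lane_004"),
    ("associated connection point", "section_003",               "connection_point_002"),
    ("has connection constraint",   "connection_point_002",      "connection_constraint_004"),
    ("start road section",          "connection_constraint_004", "section_003"),
    ("target road section",         "connection_constraint_004", "section_008"),
]
constraint_types = {"connection_constraint_004": "left-turn connection constraint"}
next_intersection_turn = "left"   # from the global path plan (data property of the ego vehicle)

def find(name, subject):
    return [o for (n, s, o) in relations if n == name and s == subject]

def predict_next_section():
    lane = find("belonging lane", "ego_vehicle")[0]
    section = next(s for (n, s, o) in relations if n == "has lane" and o == lane)
    point = find("associated connection point", section)[0]
    for constraint in find("has connection constraint", point):
        if (next_intersection_turn in constraint_types[constraint]
                and find("start road section", constraint) == [section]):
            return find("target road section", constraint)[0]
    return None

print(predict_next_section())   # -> "section_008"
```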
In short, the present invention relates to an ontology-based method for constructing a semantic map model for unmanned vehicles, which can be applied in the software system of an unmanned vehicle to help it understand scene information. The semantic map model constructed by the present invention is built specifically around the map-information elements that an unmanned vehicle is interested in, can accurately express the scenes an unmanned vehicle may face, and provides semantic relations between all map elements and traffic participants; the semantic map application method provided by the present invention can help the unmanned vehicle quickly understand the scene it is in.
In the unmanned-vehicle semantic map modeling, construction and application method based on perception-positioning monitoring of the present invention, the precise ranging characteristic of laser perception is used to segment the point-cloud information of laser beacons from a single frame of point cloud; a preset beacon physical model is then used to calculate the geographic information of each beacon, including its three-dimensional distance relative to the laser sensor. Dead-reckoning information and an extended Kalman filter algorithm are used to associate the beacon position information of consecutive frames, yielding a three-dimensional beacon map of the region. Using multi-line laser radar data, the geographic information of fixed obstacles in the environment is calculated; through graph optimization, the sub-maps of the individual beacons are combined in layers to obtain a global, extensible beacon-obstacle map. An indexing mechanism applied to the map enables secondary localization and navigation to start from any sub-map, which guarantees the real-time navigation requirements of the unmanned mobile robot.
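The indexing mechanism is not specified further; one possible sketch, under the assumption that each layered sub-map is registered by the coordinates of its origin so that secondary localization can start from the nearest sub-map, is:

```python
import math

# Assumed structure, not the patent's own data layout: sub-map id -> origin of the sub-map.
class SubMapIndex:
    def __init__(self):
        self._origins = {}

    def add(self, submap_id: str, origin_xy):
        self._origins[submap_id] = origin_xy

    def nearest(self, position_xy):
        """Return the sub-map whose origin is closest to the current position,
        used here as the starting point for secondary localization."""
        return min(self._origins,
                   key=lambda sid: math.dist(self._origins[sid], position_xy),
                   default=None)

index = SubMapIndex()
index.add("beacon_submap_A", (0.0, 0.0))
index.add("beacon_submap_B", (120.0, 40.0))
print(index.nearest((110.0, 35.0)))   # -> "beacon_submap_B"
```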
In application, the present invention has the following characteristics:
First, the beacon map constructed by the present invention is a three-dimensional mapping technique applicable to both outdoor and indoor scenes, and has good robustness to harsh outdoor conditions such as rain and snow, day/night illumination differences, and spray and dust.
Second, the map constructed by the present invention has high resolution and can reach centimetre-level precision in scenes of kilometre scale in length and width. During mapping, point-cloud feature matching is fused with the dead-reckoning algorithm to obtain layered regional sub-maps and a global map associating the sub-maps, which avoids sensor detection errors and algorithm errors, so that the overall precision of the map meets the requirements of precise navigation.
Third, the beacon map constructed by the present invention is three-dimensional, with beacons distributed in the map with (x, y, z) information in three dimensions; the mapping method is effective not only for scenes with flat road surfaces but equally for sloping outdoor scenes.
Fourth, the beacon map constructed by the present invention can be widely applied to the localization and navigation of industrial transfer robots and regional driverless cars; map retrieval is efficient and satisfies real-time requirements.
The parts of the present invention not described in detail belong to the well-known technology of those skilled in the art.
The above is a detailed description of the present invention in combination with specific embodiments, but the specific implementation of the present invention cannot be considered to be limited to these descriptions. Without departing from the principle and spirit of the present invention, those skilled in the art may make several adjustments and modifications to these embodiments; the protection scope of the present invention is defined by the appended claims and their equivalents.
The above are only preferred embodiments of the present invention and are not intended to limit the present invention; any modifications, equivalent replacements and improvements made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (9)

1. An unmanned-vehicle semantic map modeling method based on perception-positioning monitoring, comprising modeling of a beacon map, construction of the beacon map, definition of map elements and autonomous localization based on the beacon map, characterized in that:
it comprises semantic map modeling for the unmanned vehicle;
the unmanned-vehicle semantic map model contains two conceptual modules: entities and attributes;
entities include the ego-vehicle entity, road-network entities and obstacle entities;
road-network entities include area entities and point entities;
area entities include the whole road, connection points, boundaries, road separators, special areas, pedestrian crossings, lane lines, lanes and road sections;
connection areas include intersections, U-turn areas and places where the number of lanes increases or decreases;
point entities include ground markings, roadside signs and stop lines;
obstacle entities include dynamic obstacles, static obstacles, traffic-facility obstacles, pedestrians, animals, vehicles, natural obstacles and road-blocking obstacles;
attributes include point coordinates, area extents and constraints;
the point coordinate in the attributes is the point coordinate of a map element;
the area extent in the attributes is the area extent of a map element;
the constraint in the attributes is the constraint type between map elements;
the unmanned-vehicle semantic map model contains the semantic relations between the corresponding concepts of the two conceptual modules;
the semantic relations are divided into two parts: object properties and data properties;
the object-property part includes the inheritance relations and association relations between the corresponding concepts;
including establishing the hierarchical relations between the corresponding concepts;
including establishing the association relations between the corresponding concepts;
the data-property part includes the global path-planning information of the ego vehicle.
2. The unmanned-vehicle semantic map modeling method based on perception-positioning monitoring according to claim 1, characterized in that:
the ego-vehicle entity is the unmanned vehicle itself or an unmanned-vehicle entity of the corresponding type;
a road section contains several lanes in the same direction;
a ground marking is a traffic marking on the road surface;
a roadside sign is a traffic sign at the roadside.
3. The unmanned-vehicle semantic map modeling method based on perception-positioning monitoring according to claim 1, characterized in that:
natural obstacles include concave road-surface obstacles and convex road-surface obstacles;
road-blocking obstacles include fault signs, traffic cones, water-filled barriers, dividing lines and construction signs.
4. The unmanned-vehicle semantic map modeling method based on perception-positioning monitoring according to claim 1, characterized in that:
constraints include connection constraints, which are the connection-direction constraints between road sections;
connection constraints include left-turn connection constraints, right-turn connection constraints, U-turn connection constraints and straight-ahead connection constraints.
5. The unmanned-vehicle semantic map modeling method based on perception-positioning monitoring according to claim 1, characterized in that:
the association relations between the corresponding concepts include the composition relations between the whole road and its road separators, road sections and connection points;
the association relations between the corresponding concepts include the connection relation between a road section and a connection point;
the association relations between the corresponding concepts include the positional relation between a road section and a road separator;
the association relations between the corresponding concepts include the positional relation between a road section and a pedestrian crossing;
the association relations between the corresponding concepts include the positional relation between a road section and a stop line;
the association relations between the corresponding concepts include the positional relation between a road section and a boundary;
the association relations between the corresponding concepts include the relation between a road section and its lanes;
the association relations between the corresponding concepts include the relation between a road section and roadside signs;
the association relations between the corresponding concepts include the existence relation between a connection point and its connection constraints;
the association relations between the corresponding concepts include the relation between a connection point and a pedestrian crossing;
the association relations between the corresponding concepts include the positional relation between a lane and its lane lines;
the association relations between the corresponding concepts include the orientation relation between a lane and other lanes;
the association relations between the corresponding concepts include the positional relation between a lane and a special area;
the association relations between the corresponding concepts include the relation between a lane and ground markings;
the association relations between the corresponding concepts include the relation between a connection constraint and the road sections that state its connection direction;
the association relations between the corresponding concepts include the orientation relation between the ego vehicle and obstacle entities;
the association relations between the corresponding concepts include the positional relation between the ego vehicle and a lane;
the association relations between the corresponding concepts include the relation between an area entity and its area extent;
the association relations between the corresponding concepts include the relation between a point entity and its point coordinate.
6. The unmanned-vehicle semantic map modeling method based on perception-positioning monitoring according to claim 1, characterized in that:
the data-property part includes the current speed of the ego vehicle;
the data-property part includes the distances from the ego vehicle to the next connection point, pedestrian crossing and stop line to be reached;
the data-property part includes the distance from the ego vehicle to an obstacle entity;
the data-property part includes the current speed and pose of an obstacle entity;
the data-property part includes the data of a point coordinate;
the data-property part includes the data of an area extent;
the data-property part includes the speed-limit information of a lane, the permitted directions of a lane, flags indicating whether a lane is the leftmost or rightmost lane, and the lane width;
the data-property part includes the number of lanes contained in a road section;
the data-property part includes the type information of the whole road;
the data-property part includes the basic attributes of the corresponding entity concepts.
7. A construction method for an unmanned-vehicle semantic map based on perception-positioning monitoring, characterized in that:
the semantic map is generated by instantiating static map data and real-time obstacles, with the following specific steps:
in the first step, the detailed data of the real driving environment is obtained through a perception system, and the detailed map data is instantiated into static road-network entities according to the map concept structure;
in the second step, real-time obstacle pose information is obtained through sensors, and the obstacle information is instantiated into obstacle map entities;
in the third step, the mutual semantic relations between the entities in the static map and the obstacle map obtained in the first and second steps are established, yielding the semantic map for the unmanned vehicle.
8. The construction method for an unmanned-vehicle semantic map based on perception-positioning monitoring according to claim 7, characterized in that:
the perception system uses a laser radar, a camera, GPS, a photographic monitoring satellite or a corresponding sensing-device system;
the sensors use a laser radar, a camera, GPS or a corresponding sensing-device system.
9. An application method for an unmanned-vehicle semantic map based on perception-positioning monitoring, characterized in that:
semantic reasoning over the semantic map, the globally planned path, the current pose of the unmanned vehicle and real-time information about surrounding obstacles yields local scene information for the unmanned vehicle, realizes scene understanding by the unmanned vehicle, and assists the unmanned vehicle in decision-making;
the specific steps are as follows:
in the first step, the target travel path of the unmanned vehicle is obtained through the global planning system of the unmanned vehicle, and the current pose of the unmanned vehicle is obtained in real time through a GPS/INS positioning and orientation system;
in the second step, information about surrounding obstacles is perceived in real time through the environment-perception system of the unmanned vehicle, and their poses relative to the unmanned vehicle are obtained through semantic reasoning;
in the third step, semantic reasoning over the semantic map, the globally planned path, the current pose of the unmanned vehicle and the relative poses of the surrounding obstacles yields local scene information for the unmanned vehicle;
in the fourth step, the unmanned vehicle is assisted in making different decisions according to the corresponding scene information.
CN201810590558.8A 2018-06-09 2018-06-09 Unmanned-vehicle semantic map modeling, construction and application method based on perception-positioning monitoring Pending CN108981726A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810590558.8A CN108981726A (en) 2018-06-09 2018-06-09 Unmanned-vehicle semantic map modeling, construction and application method based on perception-positioning monitoring

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810590558.8A CN108981726A (en) 2018-06-09 2018-06-09 Unmanned-vehicle semantic map modeling, construction and application method based on perception-positioning monitoring

Publications (1)

Publication Number Publication Date
CN108981726A true CN108981726A (en) 2018-12-11

Family

ID=64540124

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810590558.8A Pending CN108981726A (en) Unmanned-vehicle semantic map modeling, construction and application method based on perception-positioning monitoring

Country Status (1)

Country Link
CN (1) CN108981726A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015042536A1 (en) * 2013-09-20 2015-03-26 Namesforlife Llc Systems and methods for establishing semantic equivalence between concepts
CN104535070A (en) * 2014-12-26 2015-04-22 上海交通大学 High-precision map data structure, high-precision map data acquiringand processing system and high-precision map data acquiringand processingmethod
CN106802954A (en) * 2017-01-18 2017-06-06 中国科学院合肥物质科学研究院 Unmanned vehicle semanteme cartographic model construction method and its application process on unmanned vehicle
CN106980657A (en) * 2017-03-15 2017-07-25 北京理工大学 A kind of track level electronic map construction method based on information fusion
CN107145578A (en) * 2017-05-08 2017-09-08 深圳地平线机器人科技有限公司 Map constructing method, device, equipment and system

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109976332A (en) * 2018-12-29 2019-07-05 惠州市德赛西威汽车电子股份有限公司 One kind being used for unpiloted accurately graph model and autonomous navigation system
CN111383450B (en) * 2018-12-29 2022-06-03 阿里巴巴集团控股有限公司 Traffic network description method and device
CN109460042A (en) * 2018-12-29 2019-03-12 北京经纬恒润科技有限公司 A kind of automatic Pilot control method and system
CN111383450A (en) * 2018-12-29 2020-07-07 阿里巴巴集团控股有限公司 Traffic network description method and device
CN111680113A (en) * 2019-03-11 2020-09-18 武汉小狮科技有限公司 Intersection vector-grid map scheme suitable for small-sized automatic driving vehicle
CN110118564A (en) * 2019-03-22 2019-08-13 纵目科技(上海)股份有限公司 A kind of data management system, management method, terminal and the storage medium of high-precision map
CN110118564B (en) * 2019-03-22 2024-02-23 纵目科技(上海)股份有限公司 Data management system, management method, terminal and storage medium for high-precision map
CN111735462A (en) * 2019-03-25 2020-10-02 本田技研工业株式会社 Travel road link generation device and travel road link generation method
WO2020191642A1 (en) * 2019-03-27 2020-10-01 深圳市大疆创新科技有限公司 Trajectory prediction method and apparatus, storage medium, driving system and vehicle
CN110008921A (en) * 2019-04-12 2019-07-12 北京百度网讯科技有限公司 A kind of generation method of road boundary, device, electronic equipment and storage medium
CN110008921B (en) * 2019-04-12 2021-12-28 北京百度网讯科技有限公司 Road boundary generation method and device, electronic equipment and storage medium
CN110174115A (en) * 2019-06-05 2019-08-27 武汉中海庭数据技术有限公司 A kind of method and device automatically generating high accuracy positioning map based on perception data
CN112116654A (en) * 2019-06-20 2020-12-22 杭州海康威视数字技术股份有限公司 Vehicle pose determining method and device and electronic equipment
CN112530270A (en) * 2019-09-17 2021-03-19 北京初速度科技有限公司 Mapping method and device based on region allocation
WO2021057909A1 (en) * 2019-09-27 2021-04-01 苏州宝时得电动工具有限公司 Autonomous robot, travel path planning method and apparatus thereof, and storage medium
CN110866079B (en) * 2019-11-11 2023-05-05 桂林理工大学 Generation and auxiliary positioning method of intelligent scenic spot live-action semantic map
CN110866079A (en) * 2019-11-11 2020-03-06 桂林理工大学 Intelligent scenic spot real scene semantic map generating and auxiliary positioning method
CN110992730A (en) * 2019-12-05 2020-04-10 北京京东乾石科技有限公司 Traffic data processing method and related equipment
CN112988922A (en) * 2019-12-16 2021-06-18 长沙智能驾驶研究院有限公司 Perception map construction method and device, computer equipment and storage medium
CN111060117A (en) * 2019-12-17 2020-04-24 苏州智加科技有限公司 Local map construction method and device, computer equipment and storage medium
CN111105480A (en) * 2019-12-20 2020-05-05 上海有个机器人有限公司 Building semantic map establishing method, medium, terminal and device
CN111105480B (en) * 2019-12-20 2023-09-08 上海有个机器人有限公司 Building semantic map building method, medium, terminal and device
CN111142521A (en) * 2019-12-25 2020-05-12 五邑大学 VSLAM-based planning method and device for different terrains and storage medium
CN113383283A (en) * 2019-12-30 2021-09-10 深圳元戎启行科技有限公司 Perception information processing method and device, computer equipment and storage medium
CN111928862B (en) * 2020-08-10 2023-11-21 廊坊和易生活网络科技股份有限公司 Method for on-line construction of semantic map by fusion of laser radar and visual sensor
CN111928862A (en) * 2020-08-10 2020-11-13 廊坊和易生活网络科技股份有限公司 Method for constructing semantic map on line by fusing laser radar and visual sensor
CN112651557A (en) * 2020-12-25 2021-04-13 际络科技(上海)有限公司 Trajectory prediction system and method, electronic device and readable storage medium
CN112990572A (en) * 2021-03-12 2021-06-18 上海交通大学 Dynamic scheduling system and method for park unmanned logistics vehicles
CN113532450A (en) * 2021-06-29 2021-10-22 广州小鹏汽车科技有限公司 Virtual parking map data processing method and system
CN114509065A (en) * 2022-02-16 2022-05-17 北京易航远智科技有限公司 Map construction method, map construction system, vehicle terminal, server side and storage medium
CN114509065B (en) * 2022-02-16 2023-11-07 北京易航远智科技有限公司 Map construction method, system, vehicle terminal, server and storage medium
CN114332635B (en) * 2022-03-11 2022-05-31 科大天工智能装备技术(天津)有限公司 Automatic obstacle identification method and system for intelligent transfer robot
CN114332635A (en) * 2022-03-11 2022-04-12 科大天工智能装备技术(天津)有限公司 Automatic obstacle identification method and system for intelligent transfer robot
CN116242339A (en) * 2023-05-11 2023-06-09 天津市安定医院 5G-based hospital outpatient navigation system
CN116242339B (en) * 2023-05-11 2023-10-03 天津市安定医院 5G-based hospital outpatient navigation system

Similar Documents

Publication Publication Date Title
CN108981726A (en) Unmanned-vehicle semantic map modeling, construction and application method based on perception-positioning monitoring
CN106802954B (en) Unmanned vehicle semantic map model construction method and application method thereof on unmanned vehicle
US11216004B2 (en) Map automation—lane classification
US11427225B2 (en) All mover priors
US11774261B2 (en) Automatic annotation of environmental features in a map during navigation of a vehicle
US11954797B2 (en) Systems and methods for enhanced base map generation
US11953340B2 (en) Updating road navigation model using non-semantic road feature points
US10628671B2 (en) Road modeling from overhead imagery
CN104819724B (en) A kind of autonomous travel assist system of Unmanned Ground Vehicle based on GIS
CN111076731B (en) Automatic driving high-precision positioning and path planning method
US11288521B2 (en) Automated road edge boundary detection
JP2022535351A (en) System and method for vehicle navigation
CN110111566A (en) Trajectory predictions method, apparatus and storage medium
CN108010360A (en) A kind of automatic Pilot context aware systems based on bus or train route collaboration
CN106980657A (en) A kind of track level electronic map construction method based on information fusion
CN109544443A (en) A kind of route drawing generating method and device
CN114509065B (en) Map construction method, system, vehicle terminal, server and storage medium
Ballardini et al. A framework for outdoor urban environment estimation
EP3967978B1 (en) Detecting a construction zone by a lead autonomous vehicle (av) and updating routing plans for following autonomous vehicles (avs)
Hongbo et al. Relay navigation strategy study on intelligent drive on urban roads
Chipka et al. Estimation and navigation methods with limited information for autonomous urban driving
Rondinone Managing Automated Vehicles Enhances Network
Qingkai et al. Lightweight HD map construction for autonomous vehicles in non-paved roads
Qin et al. Traffic Flow-Based Crowdsourced Mapping in Complex Urban Scenario
US11566912B1 (en) Capturing features for determining routes

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 20181211)