CN103913174B - The generation method and system of a kind of navigation information and mobile client and server end - Google Patents

The generation method and system of a kind of navigation information and mobile client and server end Download PDF

Info

Publication number
CN103913174B
CN103913174B CN201210592118.9A CN201210592118A CN103913174B
Authority
CN
China
Prior art keywords
information
building
user
mobile client
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210592118.9A
Other languages
Chinese (zh)
Other versions
CN103913174A (en)
Inventor
乔宇
邹静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201210592118.9A priority Critical patent/CN103913174B/en
Publication of CN103913174A publication Critical patent/CN103913174A/en
Application granted granted Critical
Publication of CN103913174B publication Critical patent/CN103913174B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30Map- or contour-matching
    • G01C21/32Structuring or formatting of map data

Abstract

The present invention belongs to the field of information technology and provides a method and system for generating navigation information. In the method, a mobile client receives or captures a digital image of the user's current surroundings. After the image passes a quality check, image features are extracted from key regions that are either divided automatically by the system or marked by the user, and are sent together with geographic location information to a server, which retrieves environmental information for the key regions of the image and at the same time computes the shooting position and the user's current viewing angle. The environmental information and viewing-angle information are then returned to the mobile client, which fuses them with the digital image by augmented reality means to form an augmented-reality navigation image that integrates environmental information and matches the user's viewing angle. The method improves the accuracy of the navigation system, provides an augmented-reality navigation image that integrates environmental information from the user's viewing angle, and enhances the user experience.

Description

The generation method and system of a kind of navigation information and mobile client and server end
Technical field
The invention belongs to the field of information technology, and in particular relates to a method and system for generating navigation information, as well as to a mobile client and a server.
Background technology
Existing augmented-reality vehicle navigation systems extract various road-condition features from images collected in real time by an image acquisition device and judge the road surface conditions (including traffic-light features, lane features, pedestrian features, features of motor vehicles ahead, bicycle features, and so on). Combined with a preset navigation route, they generate navigation prompt information (including vehicle turning information, lane-merging information, danger warnings, distance to destination, road names, and device status information) and superimpose this prompt information on the current road-condition image. Such systems mainly provide road-condition recognition and navigation functions, but they fail to identify the buildings in the image, provide only limited environmental positioning information, and can judge the user's current heading only after the user has moved an appreciable distance.
Traditional navigation systems for mobile devices mostly obtain the current position through satellite GPS or wireless Wi-Fi positioning, then load the electronic map of that position and generate a two-dimensional or three-dimensional navigation map. The positioning information such systems provide is also rather limited: usually only a top-down two-dimensional electronic map is shown, no information about the user's current surroundings (such as the names of nearby landmark buildings) is provided, and the user's current heading cannot be determined while the user is stationary. For most mobile phone users, however, geographic coordinates alone are not enough; they also need salient information about the surrounding environment (such as landmark buildings) to accurately understand where they are. A conventional navigation system only reports the street or the latitude and longitude of the user's current position, which does not let the user truly understand where they are. For example, a navigation system will normally only indicate that the user is on a certain street or at a certain intersection, but a user in an unfamiliar environment usually cannot determine their location clearly from this information alone; if the system additionally provides information about the buildings in the surrounding environment, the user can quickly understand their current position.
From the above, faced with increasingly complex and increasingly three-dimensional urban traffic layouts, traditional navigation systems that rely solely on GPS to position the user on a two-dimensional electronic map easily provide wrong or ambiguous positioning information and navigation services, which strongly affects the accuracy of the navigation system.
On the other hand, most existing navigation systems cannot determine the user's current heading while the user is stationary; they usually require the user to move a certain distance before the current heading can be judged, which causes great inconvenience in practice, especially for users of hand-held mobile terminals.
At the same time, the navigation images provided by existing navigators are mostly bird's-eye views, which do not match the user's actual viewing angle. In densely built cities, many users cannot work out their correct heading from the compass directions of a top-down navigation image, so the navigation system needs to provide navigation guidance based on the user's own viewing angle.
Summary of the invention
The object of the present invention is to provide a method and system for generating navigation information, as well as a mobile client and a server, aiming to solve the following problems in the prior art: navigation systems that rely on a single geo-positioning means to locate the user on a two-dimensional plane easily provide wrong or ambiguous positioning information and navigation services, which strongly affects their accuracy; most existing navigation systems can judge the user's current heading only after the user has moved a certain distance, so a method is needed that determines the user's heading while the user is stationary; furthermore, most existing navigators only provide a bird's-eye navigation image, which does not match the actual viewing angle of the user, especially a hand-held mobile terminal user, and cannot satisfy the user's need to determine the correct heading; in addition, most existing navigators provide two-dimensional planar or virtual maps that lack explanation of the user's current environment and have a low sense of reality, making it harder for users to absorb the navigation information.
The present invention is achieved as follows. A method for generating navigation information comprises the following steps:
the mobile client receives a digital image of the surroundings of the user's current position, captured by the user;
the mobile client receives type information produced by the system's automatic classification of the captured image of the surroundings, or receives calibration information with which the user marks the image;
the mobile client extracts building features and/or text features from the captured image of the surroundings according to the type information or the calibration information;
the mobile client obtains the user's current geographic location information;
the mobile client sends the extracted building features and/or text features, or the text recognition result, together with the user's current geographic location information, to the server;
the mobile client receives the information fed back by the server and fuses it with the digital image by augmented reality means, forming an augmented-reality navigation image that integrates environmental information and is based on the user's viewing angle.
Another object of the present invention is to provide a system for generating navigation information, the system comprising:
a digital image receiving/capturing module, configured to receive or capture a digital image of the surroundings of the user's current position;
a type division receiving module, configured to receive type information produced by the system's automatic classification of the captured image of the surroundings, or to receive calibration information with which the user marks the image;
a building/text feature extraction module, configured to extract building features and/or text features from the captured image of the surroundings according to the type information or the calibration information;
a geographic location acquisition module, configured to obtain the current geographic location information of the mobile terminal user;
a communication module, configured to send the extracted building features and/or text features and the current geographic location information of the mobile terminal user to the server, and to receive the environment identification information, geographic information, user viewing angle, and so on fed back by the server;
a synthesis module, configured to fuse the received environment identification information, geographic information, and user viewing angle with the digital image by augmented reality means, forming an augmented-reality navigation image that integrates environmental information and is based on the user's viewing angle.
Another object of the present invention is to provide a mobile client comprising the above system for generating navigation information.
Another object of the present invention is to provide a method for generating navigation information, the method comprising the following steps:
the server receives the building features and/or text sent by the mobile client and the current geographic location information of the mobile terminal user;
the server extracts building images relevant to the geographic position from a building image database according to the geographic location information;
the server extracts the respective building features of the relevant building images;
the server compares the extracted respective building features with the building features in the received image, thereby recognizing the building;
the server extracts the relevant environmental information of the building with the highest similarity;
the server compares the received building image features with the features of the building image with the highest similarity and, according to the actual geographic position of the building, calculates the user's current viewing angle;
the server retrieves, in a geographic information database, the corrected text sent by the mobile client in combination with the received geographic location information, and obtains the relevant environmental information of the geographic position where the mobile terminal user is currently located;
the server extracts the geographic image corresponding to the geographic position from a geographic image database according to the geographic location information;
the server sends the above relevant information to the mobile client;
the server sends the above geographic image and/or the above relevant environmental information to the mobile client.
Another object of the present invention is to provide a system for generating navigation information, the system comprising:
a receiving module, configured to receive the building features and/or text sent by the mobile client and the current geographic location information of the mobile terminal user;
a building image extraction module, configured to extract building images relevant to the geographic position from a building image database according to the geographic location information;
a building feature extraction module, configured to extract the respective building features of the relevant building images;
a comparison module, configured to compare the extracted respective building features with the building features in the received image, thereby recognizing the building;
an environmental information extraction module, configured to extract the relevant environmental information of the building with the highest similarity;
a viewing angle calculation module, configured to compare the received image features with the features of the image with the highest similarity, thereby calculating the user's viewing angle;
a retrieval module, configured to retrieve, in a geographic information database, the corrected text sent by the mobile client in combination with the received geographic location information, and obtain the relevant environmental information of the geographic position where the mobile terminal user is currently located;
a geographic image extraction module, configured to extract the geographic image corresponding to the geographic position from a geographic image database according to the geographic location information;
a sending module, configured to send the above relevant environmental information to the mobile client.
Another object of the present invention is to provide a server comprising the above system for generating navigation information.
In the present invention, the embodiments propose an environment recognition and navigation scheme based on the user's viewing angle and image content analysis. The scheme analyses the digital image the user has taken with a mobile device, recognizes the building information in it, and combines it with GPS positioning and electronic map information; augmented reality then blends this information with the digital image, giving the user richer environmental information while positioning the user stereoscopically from multiple dimensions, so the augmented-reality image helps the user accurately understand their own location. When there are no distinctive landmark buildings around the user, images of road signs, bus stop boards, and other recognizable text can be captured and analysed by text analysis and recognition; combined with GPS positioning information, the user's current position can be located accurately. This effectively solves the problem that existing navigation systems mostly rely only on geo-positioning means to locate the user on a two-dimensional electronic map. On the other hand, it also avoids positioning errors and navigation failures caused by urban traffic construction or navigation map data that is not updated in time, improving the accuracy of navigation.
On the other hand, the present invention can determine the user's actual heading without the user undergoing any displacement and mark it on the image, solving the problem that most existing navigators require the user to move an appreciable distance before the user's actual heading can be judged.
On the other hand, the user's current viewing angle is calculated from the image and blended, by augmented reality, with the digital image captured by the user, providing the user with a navigation image based on their actual viewing angle.
On the other hand, the present invention fully takes into account the weak interactivity and single form of information presentation of existing augmented-reality navigators. A rich interactive interface adds the ability to circle key recognition regions, which improves the accuracy of the positioning and environment recognition information. Both the image captured by the user, with the circled key recognition region, and a two-dimensional geographic image screenshot are provided, giving the user two different views, the user's own viewpoint and a top-down panoramic two-dimensional map, for a more comprehensive, intuitive, and realistic visual experience. This enhances the expressiveness of the positioning and environment recognition information and makes it easier for users to accept and understand their surroundings.
In addition, the present invention has a personalized interactive interface. Compared with conventional navigation systems, it is not only more interactive but also provides personalized services; users can operate the interface icons according to their own needs, which increases the ease of use and friendliness of the system, improves the user experience, and gives the user a more personalized service.
Accompanying drawing explanation
Fig. 1 is a schematic flowchart of the navigation information generation method provided by the first embodiment of the present invention.
Fig. 2 is a schematic structural diagram of the navigation information generation system provided by the first embodiment of the present invention.
Fig. 3 is a schematic flowchart of the navigation information generation method provided by the second embodiment of the present invention.
Fig. 4 is a schematic structural diagram of the navigation information generation system provided by the second embodiment of the present invention.
Fig. 5 is an interaction architecture diagram of the mobile client and the server provided by the embodiments of the present invention.
Detailed description of the invention
In order to make the purpose, technical solutions, and beneficial effects of the present invention clearer, the present invention is further described below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention and are not intended to limit it.
In the embodiments of the present invention, an environment recognition and navigation technical scheme based on the user's viewing angle and image content analysis is designed. Based on the image captured by the user with a mobile terminal, the scheme analyses and recognizes, on the server, the landmark buildings and the road sign text in the image, then calculates the geographic position of the image and the shooting orientation, determines the direction of the user's viewing angle, marks the corresponding building information on the image in the mobile client, and provides the user with rich positioning and environmental information. By marking the computed direction of the user's viewing angle on the image, the user can obtain their current heading without moving.
Referring to Fig. 1, the implementation flow of the navigation information generation method provided by the first embodiment of the present invention mainly comprises the following steps.
In step S101, the mobile client receives a digital image of the surroundings of the user's current position, captured by the user.
In step S102, the mobile client receives type information produced by the system's automatic classification of the captured image of the surroundings, or receives calibration information with which the user marks the image.
In the embodiment of the present invention, a human-computer interaction interface is provided, through which the user can classify the captured image to determine whether the image is to be processed by building recognition or by text recognition.
In step S103, the mobile client extracts building features and/or text features from the captured image of the surroundings according to the type information or the calibration information.
In the embodiment of the present invention, extracting the building features from the captured image of the surroundings is specifically as follows:
Features such as colour, texture, and shape are extracted from the captured image of the surroundings to obtain a feature representation of the image. In addition, a SIFT feature point detection operator is used to detect SIFT feature points in the image and compute their SIFT local features. The image features are uploaded to the server for further analysis and recognition of the image content. The colour, texture, and shape features are used for scene classification, while the SIFT features are mainly used for building recognition.
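A minimal sketch of this kind of client-side feature extraction, using OpenCV as an assumed implementation (the patent does not name a library; the function and variable names here are illustrative only):

```python
import cv2

def extract_image_features(image_path):
    """Extract a global colour histogram and local SIFT keypoints/descriptors."""
    image = cv2.imread(image_path)
    if image is None:
        raise ValueError("could not read image: " + image_path)

    # Global appearance feature: a normalised HSV colour histogram,
    # usable for scene classification on the server.
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [32, 32], [0, 180, 0, 256])
    hist = cv2.normalize(hist, hist).flatten()

    # Local features: SIFT keypoints and 128-dimensional descriptors,
    # mainly used for building recognition.
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(gray, None)

    return hist, keypoints, descriptors
```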
The step of extracting the text features from the captured image of the surroundings is specifically as follows:
Text information such as road signs and bus stop boards in the image is recognized. Using the colour distribution and the characteristics of text, the text regions in the image are detected and segmented. OCR technology is then used to recognize the text in the image. The final recognition result is marked on the image and uploaded to the server.
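A rough sketch of this sign-text step, assuming a simple threshold-based text segmentation and the Tesseract OCR engine; the patent only says "OCR technology", so the library choice, thresholds, and names are assumptions:

```python
import cv2
import pytesseract

def recognize_sign_text(image_path, lang="chi_sim+eng"):
    """Detect likely text regions by thresholding and run OCR on each region."""
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Crude text-region segmentation: adaptive threshold plus a wide dilation
    # so neighbouring characters merge into candidate regions.
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY_INV, 25, 15)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (15, 3))
    dilated = cv2.dilate(binary, kernel)
    contours, _ = cv2.findContours(dilated, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    results = []
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        if w < 40 or h < 15:          # discard regions too small to be a sign
            continue
        roi = gray[y:y + h, x:x + w]
        text = pytesseract.image_to_string(roi, lang=lang).strip()
        if text:
            results.append(((x, y, w, h), text))
    return results
```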
In step S104, the mobile client obtains the user's current geographic location information.
In the embodiment of the present invention, a GPS or WIFI module is used to obtain the current geographic location information of the mobile terminal user.
In step S105, the mobile client sends the extracted building features and/or the proofread text, together with the user's current geographic location information, to the server.
In the embodiment of the present invention, the navigation information generation method further comprises the following steps:
the mobile client receives the relevant environmental information and the viewing angle sent by the server;
the mobile client fuses the relevant environment identification information, geographic information, viewing angle, and so on into the digital image captured by the user by augmented reality means.
As one embodiment of the present invention, in order to identify the object clearly and improve recognition efficiency, the user can circle the target recognition region on the touchscreen. This is implemented as follows:
the mobile client receives the region to be recognized that the user circles on the captured image of the surroundings;
the mobile client extracts the building features and/or text features corresponding to the circled region to be recognized.
In the embodiment of the present invention, when the region to be recognized is too small to circle on the captured image of the surroundings, the user can enlarge the captured image so that the region to be recognized can be circled.
As another preferred embodiment of the present invention, to make it easy to circle the region to be recognized quickly, different quick-circling templates are provided for the different characteristics of building recognition and text recognition. The user aligns the target recognition region with the quick-circling region when shooting and then captures the image; the system then analyses and recognizes the image automatically, without any additional operation by the user. This is implemented as follows:
the mobile client provides a quick-circling template;
the mobile client performs the shooting operation when the region to be recognized falls within the quick-circling template;
the mobile client extracts the building and/or text features corresponding to the image within the circling template.
As one embodiment of the present invention, the navigation information generation method further comprises the following steps:
the mobile client extracts the text features corresponding to the circled region to be recognized;
the mobile client recognizes the text according to the extracted text features;
the mobile client displays the recognized text for the user to proofread;
the mobile client sends the corrected text to the server.
As one embodiment of the present invention, a prompt function is provided. When the user operates the interactive interface, this function prompts the user about their actions, for example prompting the user when the circled image region does not meet the requirements, so as to regularise the user's operations.
As one embodiment of the present invention, the user can switch between the augmented-reality digital image and the geographic image through the interactive interface, meeting different user needs.
As one embodiment of the present invention, the navigation information generation method further comprises the following steps:
the mobile client detects whether the image quality meets the processing requirements; when the image is detected to meet the requirements, step S102 is performed; when the image is detected not to meet the requirements, the user is prompted to shoot again.
The specific inspection criteria include image size, clarity, and so on. The digital image is first checked as a whole; when the image meets the basic processing requirements, a detailed check is made of the recognition region circled by the user. If the circled region image meets the requirements of the next analysis step, the building features and/or text features are extracted from the captured image of the surroundings; otherwise the interactive interface is returned to and the user is prompted to shoot again.
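A minimal sketch of such a quality gate, assuming image size and a variance-of-Laplacian sharpness score as the two criteria; the thresholds and function names are illustrative, not taken from the patent:

```python
import cv2

MIN_WIDTH, MIN_HEIGHT = 640, 480   # assumed minimum resolution
MIN_SHARPNESS = 100.0              # assumed variance-of-Laplacian threshold

def image_meets_requirements(image):
    """Return True if the image is large enough and sharp enough to analyse."""
    h, w = image.shape[:2]
    if w < MIN_WIDTH or h < MIN_HEIGHT:
        return False
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    return sharpness >= MIN_SHARPNESS

image = cv2.imread("surroundings.jpg")
if image is None or not image_meets_requirements(image):
    print("Image quality is insufficient; please shoot again.")
```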
Referring to Fig. 2, the structure of the navigation information generation system provided by the first embodiment of the present invention is shown. For ease of explanation, only the parts relevant to the embodiment of the present invention are shown. The navigation information generation system comprises: a digital image receiving/capturing module 101, a type division receiving module 102, a building/text feature extraction module 103, a geographic location acquisition module 104, and a communication module 105. The navigation information generation system may be a software unit, a hardware unit, or a combined software and hardware unit built into the mobile client.
The digital image receiving/capturing module 101 is configured to receive or capture a digital image of the surroundings of the user's current position.
The type division receiving module 102 is configured to receive type information produced by the system's automatic classification of the captured image of the surroundings, or to receive calibration information with which the user marks the image.
The building/text feature extraction module 103 is configured to extract building features and/or text features from the captured image of the surroundings according to the type information or the calibration information.
The geographic location acquisition module 104 is configured to obtain the current geographic location information of the mobile terminal user.
The communication module 105 is configured to send the extracted building features and/or text and the current geographic location information of the mobile terminal user to the server.
In the embodiment of the present invention, the navigation information generation system further comprises a synthesis module.
The communication module is further configured to receive the relevant environmental information and the viewing angle sent by the server.
The synthesis module is configured to fuse the relevant environment identification information, geographic information, viewing angle, and so on into the digital image captured by the user by augmented reality means.
As one embodiment of the present invention, in order to identify the object clearly and improve recognition efficiency, the user can circle the target recognition region on the touchscreen. The navigation information generation system further comprises a region receiving module.
The region receiving module is configured to receive the region to be recognized that the user circles on the captured image of the surroundings.
The building/text feature extraction module 103 is further configured to extract the building features and/or text features corresponding to the circled region to be recognized.
In the embodiment of the present invention, the navigation information generation system further comprises a template module.
The template module is configured to provide a quick-circling template.
The digital image capturing module is further configured to perform the shooting operation when the region to be recognized falls within the quick-circling template.
The building/text feature extraction module 103 is further configured to extract the building and/or text features corresponding to the image within the circling template.
As one embodiment of the present invention, the navigation information generation system further comprises an image detection module and a prompting module.
The image detection module is configured to detect whether the image meets the processing requirements.
The type division receiving module 102 is further configured to, when the image is detected to meet the requirements, receive the type information produced by the system's automatic classification of the captured image of the surroundings, or the calibration information with which the user marks the image.
The prompting module is configured to prompt the user to shoot again when the image is detected not to meet the requirements.
Referring to Fig. 3, the implementation flow of the navigation information generation method provided by the second embodiment of the present invention mainly comprises the following steps.
In step S201, the server receives the building features and/or text sent by the mobile client and the current geographic location information of the mobile terminal user.
In step S202, the server extracts building images relevant to the geographic position from a building image database according to the geographic location information.
In step S203, the server extracts the respective building features of the relevant building images.
In step S204, the server compares the extracted respective building features with the building features in the received image, thereby recognizing the building.
In step S205, the server extracts the relevant environmental information of the building with the highest similarity.
In the embodiment of the present invention, the relevant environmental information mainly includes the building name and the like.
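A rough sketch of the matching in steps S202 to S205, assuming SIFT descriptors compared with a ratio test and a simple in-memory candidate list standing in for the building image database; the names, the scoring rule, and the acceptance threshold are assumptions for illustration:

```python
import cv2

def count_good_matches(query_desc, candidate_desc, ratio=0.75):
    """Count SIFT matches that pass Lowe's ratio test."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(query_desc, candidate_desc, k=2)
    return sum(1 for m, n in matches if m.distance < ratio * n.distance)

def recognize_building(query_desc, candidates):
    """candidates: list of (building_name, environment_info, descriptors),
    pre-filtered by the reported GPS position. Returns the best-matching entry."""
    best = None
    best_score = 0
    for name, info, desc in candidates:
        score = count_good_matches(query_desc, desc)
        if score > best_score:
            best, best_score = (name, info), score
    return best if best_score >= 20 else None   # assumed acceptance threshold
```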
In step S206, the server retrieves, in a geographic information database, the corrected text sent by the mobile client in combination with the received geographic location information, and obtains the relevant environmental information of the geographic position where the mobile terminal user is currently located.
In the embodiment of the present invention, this relevant environmental information mainly includes road sign and bus stop board information and the like.
In step S207, the server extracts the geographic image corresponding to the geographic position from a geographic image database according to the geographic location information.
In step S208, the server sends the above relevant environmental information to the mobile client.
As one embodiment of the present invention, the navigation information generation method further comprises:
the server compares the building features in the captured image with the building information in the geographic information system and calculates the viewing angle and position from which the user shot the image.
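One way such a viewing-angle calculation could look, assuming the database stores geo-referenced 3D coordinates for known building feature points and the camera intrinsics are approximately known, so the pose can be recovered with a PnP solver; the patent does not specify the algorithm, so this is an illustrative sketch only:

```python
import cv2
import numpy as np

def estimate_view(image_points, world_points, camera_matrix):
    """image_points: Nx2 pixel positions of matched building features in the photo.
    world_points: Nx3 geo-referenced coordinates of the same features (local ENU metres assumed).
    Returns the camera position and heading in degrees, or None on failure."""
    dist_coeffs = np.zeros(4)                       # assume negligible lens distortion
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(world_points, dtype=np.float64),
        np.asarray(image_points, dtype=np.float64),
        camera_matrix, dist_coeffs, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None

    rotation, _ = cv2.Rodrigues(rvec)
    camera_position = (-rotation.T @ tvec).ravel()       # camera centre in world coordinates
    view_dir = rotation.T @ np.array([0.0, 0.0, 1.0])    # optical axis in world coordinates
    heading_deg = np.degrees(np.arctan2(view_dir[0], view_dir[1]))  # 0 deg = north under the ENU assumption
    return camera_position, heading_deg
```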
Referring to Fig. 4, the structure of the navigation information generation system provided by the second embodiment of the present invention is shown. For ease of explanation, only the parts relevant to the embodiment of the present invention are shown. The navigation information generation system comprises: a receiving module 201, a building image extraction module 202, a building feature extraction module 203, a comparison module 204, an environmental information extraction module 205, a retrieval module 206, a geographic image extraction module 207, and a sending module 208. The navigation information generation system may be a software unit, a hardware unit, or a combined software and hardware unit built into the server.
The receiving module 201 is configured to receive the building features and/or text sent by the mobile client and the current geographic location information of the mobile terminal user.
The building image extraction module 202 is configured to extract building images relevant to the geographic position from a building image database according to the geographic location information.
The building feature extraction module 203 is configured to extract the respective building features of the relevant building images.
The comparison module 204 is configured to compare the extracted respective building features with the building features in the received image, thereby recognizing the building.
The environmental information extraction module 205 is configured to extract the relevant environmental information of the building with the highest similarity.
The retrieval module 206 is configured to retrieve, in a geographic information database, the corrected text sent by the mobile client in combination with the received geographic location information, and obtain the relevant environmental information of the geographic position where the mobile terminal user is currently located.
In the embodiment of the present invention, this relevant environmental information mainly includes road sign and bus stop board information and the like.
The geographic image extraction module 207 is configured to extract the geographic image corresponding to the geographic position from a geographic image database according to the geographic location information.
The sending module 208 is configured to send the above relevant environmental information to the mobile client.
As one embodiment of the present invention, the navigation information generation system further comprises a viewing angle calculation module.
The viewing angle calculation module is configured to compare the building features in the captured image with the building information in the geographic information system and calculate the viewing angle and position from which the user shot the image.
Referring to Fig. 5, the interaction between the mobile client and the server is described in detail below.
The mobile terminal user uses the mobile terminal to capture a digital image of the surroundings of their current position. The image detection module first checks the image size, clarity, and so on; when the image meets the basic processing requirements, a detailed check is made of the recognition region circled by the user. If the circled region image meets the requirements of the next analysis step, the user classifies the captured image through the interactive interface to determine whether the image is to be processed by building recognition or by text recognition (if the quick-circling template provided by the system is used, the user aligns the target recognition region with the quick-circling region when shooting, and the system analyses and recognizes the image automatically, without the user having to classify the captured image through the interactive interface again); otherwise the interactive interface is returned to and the user is prompted to shoot again. The building/text feature extraction module then extracts building or text features from the digital image; the mobile client recognizes the text; the recognized text is displayed through the interactive interface for the user to proofread and correct; and the communication module sends the extracted building features and/or the proofread text, together with the user's current geographic location information, to the server.
According to the GPS (or WIFI) positioning information received from the client (the mobile client), the server first roughly determines the approximate range in which the image was taken and then extracts the feature data of the buildings within this range from the database. The points of interest obtained from the image are then matched against the feature points of the reference buildings to locate and recognize the key buildings in the image. At the same time, the positions of these key buildings in the image are used to determine the viewing angle at the time the image was taken. Finally, the recognized key buildings and the orientation information are sent to the synthesis module. If image recognition fails, a failure message is returned to the client.
In addition, the server receives the GPS (or WIFI) positioning information and the text sent from the client and compares them with the road sign and bus stop board information in the geographic information database, obtaining the specific geographic environment position information of the mobile terminal user's current location. This information is sent to the synthesis module together with the information from building image recognition.
The function of the synthesis module is to use the received GPS (or WIFI) positioning information to retrieve the corresponding map image from the map database, and to send the received key building information, the user's visual orientation information, the text recognition information, the geographic environment position information, and so on to the mobile client; the mobile client then fuses this information, by augmented reality, onto the displayed photograph.
The mobile client receives the building recognition information, text recognition information, geographic environment information, user viewing angle, shooting position, and other information fed back by the server, synthesizes them into the digital image captured by the user by augmented reality means, and then displays the result through the interactive interface, so the user can browse an augmented-reality image containing rich environmental information.
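As a sketch of what this client-side synthesis might look like in its simplest form, drawing the recognized labels and the computed heading onto the photograph; the coordinate conventions and names are assumed, and a real AR client would render over a live camera view rather than a still image:

```python
import cv2

def overlay_navigation_info(image, annotations, heading_deg):
    """annotations: list of ((x, y), text) pairs such as building names or sign text.
    heading_deg: the user's current heading as computed by the server."""
    out = image.copy()
    for (x, y), text in annotations:
        cv2.circle(out, (x, y), 6, (0, 0, 255), -1)
        cv2.putText(out, text, (x + 10, y), cv2.FONT_HERSHEY_SIMPLEX,
                    0.8, (0, 255, 0), 2, cv2.LINE_AA)
    cv2.putText(out, "heading: %.0f deg" % heading_deg, (20, 40),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (255, 255, 0), 2, cv2.LINE_AA)
    return out

photo = cv2.imread("surroundings.jpg")
labeled = overlay_navigation_info(photo, [((320, 240), "Example Tower")], 135.0)
cv2.imwrite("navigation_view.jpg", labeled)
```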
In addition, the mobile client receives the building text and other recognition information and the geographic information fed back by the server and synthesizes them into the digital image by augmented reality means, enhancing the ability of the map image to express environmental information.
In summary, the embodiments of the present invention propose an environment recognition and navigation scheme based on the user's viewing angle and image content analysis. The scheme analyses the digital image the user has taken with a mobile device and recognizes the building information in it; combined with GPS positioning and electronic map information, the user's actual viewing angle and current heading are derived by analysis, and the above information is blended with the digital image by augmented reality. While giving the user richer environmental information, the user can be positioned stereoscopically from multiple dimensions, and the augmented-reality image helps the user accurately understand their own location. When there are no distinctive landmark buildings around the user, images of road signs, bus stop boards, and other recognizable text can be captured and processed by text analysis and recognition; combined with the GPS positioning information, the user's current position can be located accurately. This effectively solves the problem that existing navigation systems mostly rely only on geo-positioning means to locate the user on a two-dimensional electronic map. On the other hand, it also avoids positioning errors and navigation failures caused by urban traffic construction or navigation map data that is not updated in time, improving the accuracy of navigation.
On the other hand, the user's current viewing angle can be calculated from the image features on the server and marked on the image by augmented reality, so that even a stationary user can determine their current heading.
On the other hand, the present invention fully takes into account the weak interactivity and single form of information presentation of existing augmented-reality navigators. A rich interactive interface adds the ability to circle key recognition regions, which improves the accuracy of the positioning and environment recognition information. Both the image captured by the user, with the circled key recognition region, and a two-dimensional geographic image screenshot are provided, giving the user two different views, the user's own viewpoint and a top-down panoramic two-dimensional map, for a more comprehensive, intuitive, and realistic visual experience. This enhances the expressiveness of the positioning and environment recognition information and makes it easier for users to accept and understand their surroundings.
In addition, the present invention has a personalized interactive interface. Compared with conventional navigation systems, it is not only more interactive but also provides personalized services; users can operate the interface icons according to their own needs, which increases the ease of use and friendliness of the system, improves the user experience, and gives the user a more personalized service.
Those of ordinary skill in the art will understand that all or part of the steps for implementing the methods of the above embodiments can be completed by a program instructing the relevant hardware, and the program can be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disc.
The above are only preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent replacement, and improvement made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (15)

1. the generation method of a navigation information, it is characterised in that said method comprising the steps of:
Mobile client receives the surrounding digital picture being presently in position of user's shooting;
Mobile client receives relevant context information and the visual angle that server end sends;
Mobile client passes through augmented reality means by related building textual environment identification information, geography information It is synthesized in the digital picture that user shoots with visual angle;
Mobile client receive system automatically to shooting surrounding digital picture divide type information or connect Receive the calibration information that it is demarcated by user;
Mobile client extracts building in the surrounding digital picture of shooting according to type information or calibration information Build thing feature and/or character features;
Mobile client obtains the geographical location information that user is current;
The building feature extracted and/or character features or Text region result and user are worked as by mobile client Front geographical location information sends to server end.
2. the method for claim 1, it is characterised in that the generation method of described navigation information is also wrapped Include following steps:
Mobile client receives user district needing to identify of delineation in the surrounding digital picture of shooting Territory;
Mobile client extracts building feature corresponding to the region needing to identify drawn a circle to approve and/or character features.
3. the method for claim 1, it is characterised in that the generation method of described navigation information is also wrapped Include following steps:
Mobile client provides one fast to draw a circle to approve template;
Mobile client, when receiving the region needing to identify and falling in quick delineation template, performs shooting behaviour Make;
Mobile client extracts corresponding building and/or the character features of the image in delineation template.
4. the method for claim 1, it is characterised in that the generation method of described navigation information is also wrapped Include following steps:
Whether mobile client detection picture quality meets process requirement;
When detect meet the requirements time, perform step receive system automatically to shooting surrounding digital picture The calibration information that it is demarcated by the type information divided or reception user;
When detecting undesirable, prompting user re-shoots.
5. the method for claim 1, it is characterised in that the generation method of described navigation information is also wrapped Include following steps:
Mobile client extracts the region correspondence character features needing to identify of delineation;
Word is identified by mobile client or server end according to the character features extracted;
The text importing that mobile client will identify that is proofreaded for user;
Character features or Text region result are sent to server end by mobile client.
6. the generation system of a navigation information, it is characterised in that described system includes:
Digital picture reception/taking module, is presently in the surrounding number of position for receiving or shoot user Word image;
Communication module, is additionally operable to receive relevant context information and the visual angle that server end sends;
Synthesis module, for believing building Text region result, relevant context information and shooting angle place Cease and be synthesized in the digital picture of user's shooting by augmented reality means;
Type division receiver module, for receiving what the system surrounding digital picture automatically to shooting divided The calibration information that it is demarcated by type information or reception user;
Building/character features extraction module, for extracting shooting around according to type information or calibration information Building feature in environment digital picture and/or character features;
Geographical location information acquisition module, for obtaining the geographical location information that mobile phone users is current;
Communication module, for the building feature that will extract and/or character features or Text region result and movement The current geographical location information of terminal use sends to server end.
7. system as claimed in claim 6, it is characterised in that the generation system of described navigation information is also wrapped Include:
Region receiver module, for receive user shooting surrounding digital picture on delineation need know Other region;
Building/character features extraction module, is additionally operable to extract the building that the region needing to identify drawn a circle to approve is corresponding Thing feature and/or character features.
8. system as claimed in claim 6, it is characterised in that the generation system of described navigation information is also wrapped Include:
Formwork module, for providing one fast to draw a circle to approve template;
Digital image capture module, is additionally operable to when the region receiving needs identification falls in quick delineation template Time, perform shooting operation;
Building/character features extraction module, is additionally operable to extract the corresponding building of the image in delineation template And/or character features.
9. system as claimed in claim 6, it is characterised in that the generation system of described navigation information is also wrapped Include:
Image detection module, is used for detecting whether picture quality meets process requirement;
Type division receiver module, be additionally operable to when detect meet the requirements time, receive system automatically to shooting The calibration information that it is demarcated by the type information of surrounding digital picture division or reception user;
Reminding module, for when detecting undesirable, prompting user re-shoots.
10. the mobile visitor of the generation system of the navigation information included described in any one of claim 6 to 9 Family end.
11. A method for generating navigation information, characterised in that the method comprises the following steps:
the server receives building features and/or text sent by a mobile client and the current geographic location information of the mobile client user;
the server extracts building images relevant to the geographic position from a building image database according to the received geographic location information;
the server extracts the respective building features of the relevant building images;
the server compares the extracted respective building features with the building features in the received image, thereby recognizing the building;
the server extracts the relevant environmental information of the building with the highest similarity;
the server retrieves, in a geographic information database, the corrected text sent by the mobile client in combination with the received geographic location information, and obtains the relevant environmental information of the geographic position where the mobile client user is currently located;
the server extracts the geographic image corresponding to the geographic position from a geographic image database according to the geographic location information;
the server sends the above relevant environmental information to the mobile client.
12. The method as claimed in claim 11, characterised in that the method further comprises the following steps:
the server compares the building features in the captured image with the building information in the geographic information system and calculates the viewing angle and position from which the user shot the image.
13. A system for generating navigation information, characterised in that the system comprises:
a receiving module, configured to receive building features and/or text sent by a mobile client and the current geographic location information of the mobile client user;
a building image extraction module, configured to extract building images relevant to the geographic position from a building image database according to the geographic location information;
a building feature extraction module, configured to extract the respective building features of the relevant building images;
a comparison module, configured to compare the extracted respective building features with the building features in the received image, thereby recognizing the building;
an environmental information extraction module, configured to extract the relevant environmental information of the building with the highest similarity;
a retrieval module, configured to retrieve, in a geographic information database, the corrected text sent by the mobile client in combination with the received geographic location information, and obtain the relevant environmental information of the geographic position where the mobile client user is currently located;
a geographic image extraction module, configured to extract the geographic image corresponding to the geographic position from a geographic image database according to the geographic location information;
a sending module, configured to send the above relevant environmental information to the mobile client.
14. The system as claimed in claim 13, characterised in that the system further comprises:
a viewing angle calculation module, configured to compare the building features in the captured image with the building image features in the geographic information system, thereby calculating the viewing angle and position from which the user shot the image.
15. A server comprising the system for generating navigation information as claimed in claim 13 or 14.
CN201210592118.9A 2012-12-31 2012-12-31 The generation method and system of a kind of navigation information and mobile client and server end Active CN103913174B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210592118.9A CN103913174B (en) 2012-12-31 2012-12-31 The generation method and system of a kind of navigation information and mobile client and server end

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210592118.9A CN103913174B (en) 2012-12-31 2012-12-31 The generation method and system of a kind of navigation information and mobile client and server end

Publications (2)

Publication Number Publication Date
CN103913174A CN103913174A (en) 2014-07-09
CN103913174B true CN103913174B (en) 2016-10-19

Family

ID=51039038

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210592118.9A Active CN103913174B (en) 2012-12-31 2012-12-31 The generation method and system of a kind of navigation information and mobile client and server end

Country Status (1)

Country Link
CN (1) CN103913174B (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104266654A (en) * 2014-09-26 2015-01-07 广东好帮手电子科技股份有限公司 Vehicle real scene navigation system and method
CN104819723B (en) * 2015-04-29 2017-10-13 京东方科技集团股份有限公司 A kind of localization method and location-server
CN105320725B (en) * 2015-05-29 2019-02-22 杨振贤 Obtain the method and device of the geographic object in acquisition point image
CN106652556A (en) * 2015-10-28 2017-05-10 中国移动通信集团公司 Human-vehicle anti-collision method and apparatus
CN105373610A (en) * 2015-11-17 2016-03-02 广东欧珀移动通信有限公司 Indoor map updating method and server
CN107481342A (en) * 2016-06-07 2017-12-15 腾讯科技(深圳)有限公司 Attendance checking system, method, server and terminal
WO2018032180A1 (en) * 2016-08-14 2018-02-22 阮元 Method and system for stopping visual information push according to market feedback
KR102262480B1 (en) 2016-12-30 2021-06-08 구글 엘엘씨 Hash-based dynamic constraint on information resources
CN106683473A (en) * 2017-03-30 2017-05-17 深圳市科漫达智能管理科技有限公司 Reverse-direction vehicle-finding navigation method and mobile terminal
CN107220726A (en) * 2017-04-26 2017-09-29 消检通(深圳)科技有限公司 Fire-fighting equipment localization method, mobile terminal and system based on augmented reality
DE102017207544A1 (en) * 2017-05-04 2018-11-08 Volkswagen Aktiengesellschaft METHOD, DEVICES AND COMPUTER READABLE STORAGE MEDIUM WITH INSTRUCTIONS FOR LOCATING A DATE MENTIONED BY A MOTOR VEHICLE
CN107764273B (en) * 2017-10-16 2020-01-21 北京耘华科技有限公司 Vehicle navigation positioning method and system
CN107948956A (en) * 2017-11-07 2018-04-20 北京小米移动软件有限公司 Localization method and device
CN110019608B (en) * 2017-11-16 2022-08-05 腾讯科技(深圳)有限公司 Information acquisition method, device and system and storage equipment
CN108287549A (en) * 2017-11-24 2018-07-17 广东康云多维视觉智能科技有限公司 A kind of method and system improving spacescan time performance
CN109992089A (en) * 2017-12-29 2019-07-09 索尼公司 Electronic equipment, wireless communication system, method and computer readable storage medium
CN108507541A (en) * 2018-03-01 2018-09-07 广东欧珀移动通信有限公司 Building recognition method and system and mobile terminal
CN108427947A (en) * 2018-03-16 2018-08-21 联想(北京)有限公司 A kind of image-recognizing method and electronic equipment
CN108917744A (en) * 2018-05-11 2018-11-30 中国地质大学(武汉) A kind of accurate 3-D positioning method based on GPS and street view database
CN108833834B (en) * 2018-06-21 2020-10-30 浙江金果知识产权有限公司 Finding system for preventing children from getting lost
CN109635145A (en) * 2018-11-23 2019-04-16 积成电子股份有限公司 Power equipment inspection information identifying method based on Multidimensional Comprehensive information
CN109857945A (en) * 2019-01-02 2019-06-07 珠海格力电器股份有限公司 A kind of information display method, device, storage medium and terminal
EP4332506A3 (en) * 2019-05-24 2024-03-20 Google Llc Method and device for navigating two or more users to a meeting location
CN110470315A (en) * 2019-06-27 2019-11-19 安徽四创电子股份有限公司 A kind of sight spot tourist air navigation aid
CN113091763B (en) * 2021-03-30 2022-05-03 泰瑞数创科技(北京)有限公司 Navigation method based on live-action three-dimensional map
CN113484889A (en) * 2021-07-07 2021-10-08 中国人民解放军国防科技大学 Immersive navigation system based on augmented reality and satellite positioning of mobile terminal
CN113792726B (en) * 2021-11-16 2022-03-04 北京长隆讯飞科技有限公司 Method and system for rapidly generating POI (Point of interest) based on visual image

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1612707A2 (en) * 2004-06-30 2006-01-04 Navteq North America, LLC Method of collecting information for a geographic database for use with a navigation system
CN1854757A (en) * 2005-04-28 2006-11-01 中国科学院遥感应用研究所 Remote-sensing imaging set interpretation system method
CN101340661A (en) * 2008-08-14 2009-01-07 北京中星微电子有限公司 Guide control implementing mobile apparatus and server, guide control method
CN101945327A (en) * 2010-09-02 2011-01-12 郑茂 Wireless positioning method and system based on digital image identification and retrieve
CN102243078A (en) * 2010-05-13 2011-11-16 上海宾华信息科技有限公司 Release-upon-vision navigation device and realization method thereof
CN102456132A (en) * 2010-10-22 2012-05-16 英业达股份有限公司 Location method and electronic device applying same

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4935145B2 (en) * 2006-03-29 2012-05-23 株式会社デンソー Car navigation system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1612707A2 (en) * 2004-06-30 2006-01-04 Navteq North America, LLC Method of collecting information for a geographic database for use with a navigation system
CN1854757A (en) * 2005-04-28 2006-11-01 中国科学院遥感应用研究所 Remote-sensing imaging set interpretation system method
CN101340661A (en) * 2008-08-14 2009-01-07 北京中星微电子有限公司 Guide control implementing mobile apparatus and server, guide control method
CN102243078A (en) * 2010-05-13 2011-11-16 上海宾华信息科技有限公司 Release-upon-vision navigation device and realization method thereof
CN101945327A (en) * 2010-09-02 2011-01-12 郑茂 Wireless positioning method and system based on digital image identification and retrieve
CN102456132A (en) * 2010-10-22 2012-05-16 英业达股份有限公司 Location method and electronic device applying same

Also Published As

Publication number Publication date
CN103913174A (en) 2014-07-09

Similar Documents

Publication Publication Date Title
CN103913174B (en) The generation method and system of a kind of navigation information and mobile client and server end
US20210374407A1 (en) Augmented reality interface for facilitating identification of arriving vehicle
JP4591353B2 (en) Character recognition device, mobile communication system, mobile terminal device, fixed station device, character recognition method, and character recognition program
KR100533033B1 (en) Position tracing system and method using digital video process technic
KR101147748B1 (en) A mobile telecommunication device having a geographic information providing function and the method thereof
US9080882B2 (en) Visual OCR for positioning
US20200167603A1 (en) Method, apparatus, and system for providing image labeling for cross view alignment
US8175618B2 (en) Mobile device product locator
CN104376118A (en) Panorama-based outdoor movement augmented reality method for accurately marking POI
KR101868125B1 (en) Method and server for Correcting GPS Position in downtown environment using street view service
CN104298965A (en) Intelligent scenic spot and scenery recognition system and method oriented to mobile terminal
US8805078B2 (en) Methods for digital mapping and associated apparatus
Basiri et al. Seamless pedestrian positioning and navigation using landmarks
CN106652539B (en) Shared vehicle and parking position indication method, client and system thereof
US20130135446A1 (en) Street view creating system and method thereof
CN104748739A (en) Intelligent machine augmented reality implementation method
WO2023160722A1 (en) Interactive target object searching method and system and storage medium
CN105845020B (en) A kind of live-action map production method and device
CN102629270A (en) Three-dimensional presentation method and device for geographic information of smarter cities
KR20150077607A (en) Dinosaur Heritage Experience Service System Using Augmented Reality and Method therefor
CN113608614A (en) Display method, augmented reality device, equipment and computer-readable storage medium
CN105091895A (en) Concern prompt system and method
Antigny et al. Hybrid visual and inertial position and orientation estimation based on known urban 3D models
TWI421785B (en) A system and a method for real-time recognizing signboard and displaying shop information
CN201503271U (en) Navigation device with instant information display function

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant