CN109752001A - Navigation system, method and apparatus - Google Patents
- Publication number
- CN109752001A (application CN201711081993.XA)
- Authority
- CN
- China
- Prior art keywords
- user
- identity
- target position
- user information
- server
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Navigation (AREA)
Abstract
Embodiments of the present application disclose a navigation system, method and apparatus. In one specific embodiment, the system includes a cloud server configured to receive a navigation request sent by a user, where the navigation request includes a destination and the user's identity; determine whether the user is currently located in a target area; and, in response to determining that the user is currently located in the target area, send the navigation request to a server corresponding to that area. The server is configured to select, according to the identity, user information corresponding to the identity from a pre-stored set of user information, where the user information includes the user's identity and appearance features; receive images currently captured by camera devices located in the target area; determine, from the images and the selected appearance features, the user's current position within the target area; and generate navigation data from the current position and the destination and send it to the user. This embodiment helps to improve the accuracy of positioning and navigation.
Description
Technical field
Embodiments of the present application relate to the field of computer technology, in particular to the field of navigation technology, and more particularly to a navigation system, method and apparatus.
Background art
Navigation is the process of guiding a device along a planned route from one point to another. Navigation is usually divided into two classes: (1) self-contained navigation, which relies on equipment carried aboard the aircraft or ship itself, such as inertial navigation, Doppler navigation and celestial navigation; and (2) non-self-contained navigation, in which vehicles such as aircraft, ships and automobiles navigate in cooperation with ground-based or airborne equipment, such as radio navigation and satellite navigation.
A navigation map, i.e. a digital map, is a map stored and consulted in digital form using computer technology. It is mainly used to implement path planning and navigation functions.
Summary of the invention
Embodiments of the present application propose a navigation system, method and apparatus.
In a first aspect, an embodiment of the present application provides a navigation system that includes a cloud server, a server and a camera device. The cloud server is configured to receive a navigation request sent by a user, where the navigation request includes a destination and the user's identity; determine whether the user is currently located in a target area; and, in response to determining that the user is currently located in the target area, send the navigation request to a server corresponding to the target area. The server is configured to select, according to the identity, user information corresponding to the identity from a pre-stored set of user information, where the user information includes the user's identity and appearance features; receive images currently captured by camera devices located in the target area; determine, from the images and the selected appearance features, the user's current position within the target area; and generate navigation data from the current position and the destination and send it to the user.
In some embodiments, the server is further configured to receive a face image and a body image captured by a camera device located at the entrance of the target area, extract face features and appearance features from them, and send the face features to the cloud server.
In some embodiments, the cloud server is further configured to match the face features against a pre-stored set of sample face features; send the sample face features and identity of a target user to the server, where the target user is the user corresponding to the sample face features in the set that match the received face features; and generate a correspondence between the target user's identity and information on the target area corresponding to the server.
In some embodiments, determining whether the user is currently located in the target area includes: determining whether there is information on a target area corresponding to the user's identity; and, if such information exists, determining that the user is currently located in that target area.
In some embodiments, the server is further configured to obtain the appearance features of the user corresponding to the face features that match the target user's sample face features, and to generate the set of user information from the target user's identity and the obtained appearance features.
In some embodiments, the user information in the set further includes the user's sample face features, and the server is further configured to receive a face image captured by a camera device located at the exit of the target area, match that face image against the set of user information, remove the user information matching the face image from the set, and send the identity in the removed user information to the cloud server.
In some embodiments, the cloud server is further configured to remove the target-area information corresponding to the identity sent by the server.
In a second aspect, an embodiment of the present application provides a navigation method that includes: receiving a navigation request sent by a user, where the navigation request includes a destination and the user's identity; selecting, according to the identity, user information corresponding to the identity from a pre-stored set of user information, where the user information includes the user's identity and appearance features; receiving images currently captured by camera devices located in the target area; determining, from the images and the selected appearance features, the user's current position within the target area; and generating navigation data from the current position and the destination and sending it to the user.
In some embodiments, the method further includes: receiving a face image and a body image captured by a camera device located at the entrance of the target area, and extracting face features and appearance features; obtaining the identity of the user corresponding to the face features; and generating the set of user information from the obtained identity and appearance features.
In some embodiments, the user information in the set further includes the user's sample face features, and the method further includes: receiving a face image captured by a camera device located at the exit of the target area, matching that face image against the set of user information, and removing the user information matching the face image from the set.
In a third aspect, an embodiment of the present application provides a navigation apparatus that includes: a first receiving unit configured to receive a navigation request sent by a user, where the navigation request includes a destination and the user's identity; a selection unit configured to select, according to the identity, user information corresponding to the identity from a pre-stored set of user information, where the user information includes the user's identity and appearance features; a second receiving unit configured to receive images currently captured by camera devices located in the target area; a determination unit configured to determine, from the images and the selected appearance features, the user's current position within the target area; and a generation unit configured to generate navigation data from the current position and the destination and send it to the user.
In some embodiments, the apparatus further includes: a fourth receiving unit configured to receive a face image and a body image captured by a camera device located at the entrance of the target area, and to extract face features and appearance features; an obtaining unit configured to obtain the identity of the user corresponding to the face features; and an information-set generation unit configured to generate the set of user information from the obtained identity and appearance features.
In some embodiments, the user information in the set further includes the user's sample face features, and the apparatus further includes: a third receiving unit configured to receive a face image captured by a camera device located at the exit of the target area and to match that face image against the set of user information; and a clearing unit configured to remove the user information matching the face image from the set.
In a fourth aspect, an embodiment of the present application provides an electronic device that includes: one or more processors; a camera device for capturing images; and a storage device storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method described in any embodiment of the second aspect above.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored. The computer program, when executed by a processor, implements the method described in any embodiment of the second aspect above.
With the navigation system, method and apparatus provided by the embodiments of the present application, after the cloud server receives a navigation request sent by a user, it can first determine whether the user is currently located in a target area. If it determines that the user is in the target area, the cloud server can send the navigation request to the server corresponding to that area. The server can then select, according to the identity in the navigation request, the corresponding user information from a pre-stored set of user information, where the user information may include the user's identity and appearance features; receive images currently captured by camera devices located in the target area; determine, from the images and the selected appearance features, the user's current position within the target area; and finally generate navigation data from the current position and the destination and send it to the user. This helps to improve the accuracy of positioning and navigation.
Brief description of the drawings
Other features, objects and advantages of the present application will become more apparent from the following detailed description of non-restrictive embodiments, read in conjunction with the accompanying drawings:
Fig. 1 is a diagram of an exemplary system architecture to which the present application can be applied;
Fig. 2 is a timing diagram of one embodiment of the navigation system of the present application;
Fig. 3 is a timing diagram of another embodiment of the navigation system of the present application;
Fig. 4 is a timing diagram of a further embodiment of the navigation system of the present application;
Fig. 5 is a flow chart of one embodiment of the navigation method of the present application;
Fig. 6 is a structural schematic diagram of one embodiment of the navigation apparatus of the present application;
Fig. 7 is a structural schematic diagram of a computer system suitable for implementing the electronic device of the embodiments of the present application.
Detailed description
The present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the related invention and do not restrict it. It should also be noted that, for ease of description, only the parts relevant to the related invention are shown in the drawings.
It should be noted that, in the absence of conflict, the embodiments of the present application and the features in those embodiments may be combined with one another. The present application is described in detail below with reference to the drawings and in conjunction with the embodiments.
Fig. 1 shows an exemplary system architecture 100 to which the navigation system, navigation method or navigation apparatus of the present application can be applied.
As shown in Fig. 1, the system architecture 100 may include terminals 101 and 102, networks 103 and 106, a server 104, a cloud server 105 and camera devices 107 and 108. The network 103 is the medium providing communication links among the terminals 101 and 102, the server 104 and the cloud server 105. The network 106 is the medium providing communication links between the server 104 and the camera devices 107 and 108. The networks 103 and 106 may include various connection types, such as wired links, wireless communication links or fiber-optic cables.
A user can use the terminals 101 and 102 to interact with the server 104 and the cloud server 105 through the network 103, for example to receive or send messages. Various client applications can be installed on the terminals 101 and 102, such as navigation applications, web-browser applications, face-recognition applications and shopping applications.
The terminals 101 and 102 can be various electronic devices with a display screen, including but not limited to smartphones, tablet computers, e-book readers, laptop computers and desktop computers.
The camera devices 107 and 108 can be various devices for capturing images, such as still or video cameras. They can capture face images, body images and the like. A body image may include not only the user's face but also at least part of the user's body, for example an image of the user's upper body or of the user's whole body.
The server 104 can be a server providing various services, for example an analysis server that analyzes and processes the face images captured by the camera devices 107 and 108.
The cloud server 105 can likewise be a server providing various services, for example a background server supporting the navigation applications displayed on the terminals 101 and 102. The background server can analyze and otherwise process the navigation requests sent by the terminals 101 and 102, and can send the processing results (such as navigation-route data corresponding to the requests) back to the terminals 101 and 102.
It should be noted that the navigation method provided by the embodiments of the present application is generally executed by the server 104; correspondingly, the navigation apparatus is generally arranged in the server 104.
It should be pointed out that, when the cloud server 105 has the functions of the server 104, the system architecture 100 may be provided without the server 104.
It should be understood that the numbers of terminals, networks, servers, cloud servers and camera devices in Fig. 1 are merely illustrative. Any number of each may be provided according to implementation needs.
Continuing with reference to Fig. 2, a timing diagram of one embodiment of the navigation system of the present application is illustrated.
In the present embodiment, the navigation system may include a cloud server, a server and a camera device. The cloud server is configured to receive a navigation request sent by a user, where the navigation request may include a destination and the user's identity; determine whether the user is currently located in a target area; and, in response to determining that the user is currently located in the target area, send the navigation request to a server corresponding to the target area. The server is configured to select, according to the identity, user information corresponding to the identity from a pre-stored set of user information, where the user information may include the user's identity and appearance features; receive images currently captured by camera devices located in the target area; determine, from the images and the selected appearance features, the user's current position within the target area; and generate navigation data from the current position and the destination and send it to the user.
As shown in Fig. 2, in step 201 the cloud server receives a navigation request sent by a user.
In the present embodiment, the cloud server (such as the cloud server 105 shown in Fig. 1) can receive, over a wired or wireless connection, a navigation request that the user sends through a terminal (such as the terminals 101 and 102 shown in Fig. 1). The navigation request may include a destination and the user's identity. Here, the destination can be indicated in several ways, such as by the destination's name, mailing address or latitude-longitude coordinates. The user's identity generally refers to a mark that uniquely determines the user's identity; it may consist of one or more kinds of characters, such as digits, letters, text and symbols. For example, the identity may include at least one of the following: a name, an ID-card number, a mobile-phone number or an SID (Security Identifier).
In step 202, the cloud server determines whether the user is currently located in a target area.
In the present embodiment, the cloud server can determine this in a variety of ways. The target area is not limited in this application; for example, it can be the area occupied by a certain store or shopping mall, or an area around the user where camera devices are present, such as a road.
In some optional implementations of the present embodiment, the cloud server can obtain the user's current position through mobile-phone positioning, so as to determine whether that position lies within the target area. Mobile positioning technologies may include positioning based on GPS (Global Positioning System), positioning based on the base stations of a mobile-carrier network, and positioning using WiFi (Wireless Fidelity). These positioning technologies are widely used in daily life, and their operating principles are not described again here.
Optionally, the user can use the camera installed in the terminal to capture an image of the user's current location and send it to the cloud server. The cloud server can match the image against a pre-stored image database. If the database contains a matching image, the position corresponding to that image can be taken as the user's current position, thereby determining whether the user is currently located in the target area.
As an example, the cloud server can also determine whether the user is currently located in a target area by determining whether there is information on a target area corresponding to the user's identity. If such information exists, it can be determined that the user is currently located in that target area. The process of generating the correspondence between the user's identity and the target area is described in the embodiment shown in Fig. 3.
It should be noted that, when the cloud server determines that the user is not currently located in a target area, it can send a prompt to the terminal indicating that processing of the navigation request has stopped.
In step 203, in response to determining that the user is currently located in the target area, the cloud server sends the navigation request to the server corresponding to the target area.
In the present embodiment, when the cloud server determines that the user is currently located in the target area, it can send the user's navigation request, over a wired or wireless connection, to the server corresponding to the target area where the user is currently located (such as the server 104 shown in Fig. 1).
As an example, after determining that the user is currently located in the area of store A, the cloud server can look up the identifier of store A in a pre-stored store-information table, obtain the IP (Internet Protocol) address of store A's server, and send the navigation request to that server. The store-information table describes the correspondence between store identifiers and server IP addresses. A store identifier generally refers to a mark that uniquely determines the store, and likewise may include at least one kind of character such as digits, letters, text and symbols.
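The store-information table described above amounts to a store-identifier-to-server-address mapping. A minimal Python sketch, with invented identifiers and addresses, might look like this:

```python
# Hypothetical store-information table: maps a store identifier to the
# IP address of the server responsible for that store's area.
STORE_TABLE = {
    "store_a": "10.0.0.11",
    "store_b": "10.0.0.12",
}


def server_for_store(store_id: str) -> str:
    """Resolve the server that should handle navigation requests for a store."""
    try:
        return STORE_TABLE[store_id]
    except KeyError:
        raise LookupError(f"no server registered for store {store_id!r}")
```

In a deployment the table would live in a database on the cloud server rather than in memory; the dictionary here only illustrates the lookup.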
In step 204, the server selects, according to the identity, the user information corresponding to the identity from a pre-stored set of user information.
In the present embodiment, the server (such as the server 104 shown in Fig. 1) can obtain the user's identity from the navigation request sent by the cloud server and, according to that identity, select the corresponding user information from the pre-stored set of user information. The user information may include the user's identity and appearance features. The appearance features may include not only the user's physique (such as tall, short, fat or thin); to make the user easier to recognize, they may also include the user's dress information, which improves recognition efficiency. The dress information may include, but is not limited to, at least one of the following: garment type (such as a skirt or fitted trousers), garment color, accessory type (such as a hat or scarf), accessory color, shoe type (such as high boots, ankle boots, leather shoes or sneakers) and shoe color.
In some optional implementations of the present embodiment, the server can obtain in advance the user data of all registered users. The user data may generally include identities and standard face data, where standard face data generally refers to frontal face data. A user can upload at least one face image at registration, such as a frontal face image and/or a profile image, from which the user's standard face data is obtained. In this way, when the user enters the target area (such as store A), the camera device located at the store entrance can capture the user's face image and body image. The server can then match the user's face image against the standard face data in the user data. If the match succeeds, the server can generate the set of user information from the user's body image, the successfully matched standard face data and the corresponding identity.
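The enrollment step above can be sketched in a few lines. The sketch below treats faces as plain feature vectors and uses a Euclidean nearest-neighbor match with a threshold; the function names, the vector representation and the threshold value are all assumptions, not taken from the patent:

```python
def l2(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5


def register_entry(registered, captured_face, appearance, threshold=0.6):
    """Match a captured face vector against registered standard faces and,
    on success, return a user-information record pairing the matched
    identity with the appearance features extracted from the body image.
    Returns None when no registered face is close enough."""
    best_id, best_d = None, float("inf")
    for user_id, standard_face in registered.items():
        d = l2(captured_face, standard_face)
        if d < best_d:
            best_id, best_d = user_id, d
    if best_d > threshold:
        return None  # no registered user matched the captured face
    return {"user_id": best_id, "appearance": appearance}
```

A real system would use embeddings from a face-recognition model; the scalar vectors here only show the control flow.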
It is understood that the storage location of the above user data is not limited in this application; it can be stored locally on the server, or on the cloud server.
Optionally, the server can also obtain the set of user information by other means, as described in the embodiment shown in Fig. 3.
In step 205, the server receives images currently captured by the camera devices located in the target area.
In the present embodiment, the server can receive, over a wired or wireless connection, the images currently captured by the camera devices located in the target area (such as the camera devices 107 and 108 shown in Fig. 1). The camera devices can be any of various devices for capturing images.
It is understood that, to make it easier to recognize the user, at least one camera device can be set up in the target area, with overlap between the regions the camera devices cover; that is, the images they capture intersect. For example, camera devices can be installed at the four corners and the center of the ceiling of store A, so that for any user in the store at least two camera devices can capture the user at the same time.
In step 206, the server determines, from the images and the selected appearance features, the user's current position within the target area.
In the present embodiment, according to the appearance features selected in step 204, the server can search the images currently captured by each camera device for a body image matching those features. Then, from the position of the matched body image within each picture and the positions of the camera devices that captured that body image at the same moment, the server can determine the current position, within the target area, of the user to whom the appearance features correspond.
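The patent does not specify how camera positions and in-image positions are combined. One simple sketch, under the strong assumption that each camera has a pre-calibrated linear pixel-to-floor mapping (a real system would use a per-camera homography), fuses the per-camera estimates by averaging:

```python
def estimate_position(detections):
    """detections: list of (camera, (px, py)) pairs, where `camera` is a
    dict carrying a hypothetical pixel->floor calibration ('origin' and
    'scale').  Averages the per-camera floor estimates of the matched
    body image to obtain one current position."""
    points = []
    for cam, (px, py) in detections:
        fx = cam["origin"][0] + px * cam["scale"]
        fy = cam["origin"][1] + py * cam["scale"]
        points.append((fx, fy))
    n = len(points)
    return (sum(p[0] for p in points) / n,
            sum(p[1] for p in points) / n)
```

Averaging over the overlapping cameras is what makes the two-camera coverage requirement above useful: disagreement between calibrated views is smoothed out.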
It is understood that locating the user through the camera devices in the target area can improve positioning accuracy, especially indoors. It also reduces or avoids positioning by means of the terminal's own hardware, which helps lower the terminal's power consumption. Further, since the terminal's hardware is not needed to locate the user, the hardware requirements on the terminal can be relaxed, reducing its production cost.
In step 207, the server generates navigation data from the current position and the destination and sends it to the user.
In the present embodiment, the server can obtain the user's destination from the navigation request of step 204. From that destination and the current position determined in step 206, the server can plan a route, generate navigation data containing the route information, and send the navigation data to the user (the user who sent the navigation request). The route information may include at least one of the following: road names, travel distance, public-transport information and so on. The navigation data can be two-dimensional or three-dimensional.
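The patent leaves the route-planning algorithm open. As one common choice for an indoor floor plan, a breadth-first search over a walkability grid yields a shortest path in steps; the grid encoding below (0 = walkable, 1 = blocked) is an assumption for illustration:

```python
from collections import deque


def plan_route(grid, start, goal):
    """Breadth-first search over a floor grid; returns the list of cells
    from start to goal (inclusive), or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []             # walk the predecessor chain back to start
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

A production planner would weight edges (e.g. Dijkstra or A*) and attach road names and distances to the resulting cells before sending the navigation data to the terminal.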
It should be noted that the server can send the navigation data to the user directly, or through the cloud server; that is, the server can send the navigation data to the cloud server, which then forwards it to the terminal the user is using. Planning the navigation route on the server corresponding to the target area where the user is currently located can reduce the load on the cloud server and improve the efficiency with which navigation requests are processed, shortening the user's waiting time and improving the user experience.
With the navigation system provided by the embodiments of the present application, after the cloud server receives a navigation request sent by a user, it can first determine whether the user is currently located in a target area. If it determines that the user is, the cloud server can send the navigation request to the server corresponding to that target area. The server can then select, according to the identity in the navigation request, the corresponding user information from a pre-stored set of user information, where the user information may include the user's identity and appearance features; receive images currently captured by the camera devices in the target area; determine, from the images and the selected appearance features, the user's current position within the target area; and finally generate navigation data from the current position and the destination and send it to the user. This helps to improve the accuracy of positioning and navigation.
Referring further to Fig. 3, a timing diagram of another embodiment of the navigation system provided by the present application is illustrated.
As shown in Fig. 3, in step 301 the server receives a face image and a body image captured by the camera device located at the entrance of the target area.
In the present embodiment, the server (such as the server 104 shown in Fig. 1) can receive, over a wired or wireless connection, in real time or periodically, the face images and body images captured by the camera device located at the entrance of the target area (such as the camera devices 107 and 108 shown in Fig. 1). For store A, for example, two camera devices at different heights can be set up at the entrance, outdoors or indoors, to capture face images and body images respectively.
In step 302, the server extracts face features and appearance features, and sends the face features to the cloud server.
In the present embodiment, the server can extract face features from the user's face image using existing facial-landmark extraction techniques, and can extract the user's appearance features from the body image. The appearance features here can be the same as those described above and are not described again.
In step 303, the cloud server matches the face features against a pre-stored set of sample face features.
In the present embodiment, the cloud server (such as the cloud server 105 shown in Fig. 1) can match the face features sent by the server against the sample face features in the pre-stored set. The sample face features mainly refer to frontal face features. Here, the cloud server can obtain in advance the user data of all registered users (the same user data as above), from whose standard face data the set of sample face features can be generated.
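The matching in step 303 can be sketched as a nearest-neighbor search over the sample set. The sketch below uses cosine similarity with a threshold; the representation of features as vectors and the threshold value of 0.9 are assumptions for illustration, not details from the patent:

```python
def cosine(a, b):
    """Cosine similarity between two nonzero feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)


def match_face(sample_set, feature, threshold=0.9):
    """Return (user_id, sample_feature) for the best match in the
    pre-stored sample set, or None when nothing clears the threshold."""
    user_id, sample = max(sample_set.items(),
                          key=lambda kv: cosine(kv[1], feature))
    if cosine(sample, feature) < threshold:
        return None
    return user_id, sample
```

On a match, the returned identity and sample features are what the cloud server would forward to the store server in step 304.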
In step 304, the cloud server sends the sample face features and identity of the target user to the server.
In the present embodiment, according to the matching result of step 303, the cloud server can send the target user's sample face features and identity to the server, where the target user is the user corresponding to the sample face features in the set that match the received face features.
In step 305, the Cloud Server generates a corresponding relationship between the identity of the target user and the information of the target position region corresponding to the server.
In the present embodiment, according to the matching result in step 303, the Cloud Server may establish a corresponding relationship between the identity of the target user and the information of the target position region, where the target position region is the one corresponding to the server that sent the above face feature.
For example, a user information table may be stored in the Cloud Server, which indicates the corresponding relationship between the identity of a user, the sample face feature, and the information of a shop. In this way, when the Cloud Server sends the sample face feature and identity of the target user to the server of shop A, the Cloud Server may write the information of shop A (such as the identifier, name or business license number of shop A) into the shop information column corresponding to the identity of the target user in the user information table. At this point, the table indicates that the user is currently located in shop A.
In this way, the Cloud Server may determine whether the user is currently located at a target position region through the following steps: the Cloud Server may determine whether there is information of a target position region corresponding to the identity of the user; if so, the Cloud Server determines that the user is currently located at that target position region. That is, the Cloud Server may check whether the shop information column of the user information table is currently empty (NULL); if it is not empty, the user is currently located in the shop indicated by the shop information.
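The user information table and the NULL-column presence check described above can be sketched as below. The column names and record shapes are assumptions for illustration; only the identity/shop-information correspondence is taken from the text.

```python
# Sketch of the Cloud Server's user information table (step 305) and the
# presence check derived from it. Field names are hypothetical.

user_table = {
    # identity -> {"sample_face": feature vector, "shop_info": None or record}
    "user-1": {"sample_face": [0.1, 0.2], "shop_info": None},
}

def enter_region(identity, shop_record):
    """Record that a matched user has entered a shop (step 305)."""
    user_table[identity]["shop_info"] = shop_record

def is_in_target_region(identity):
    """The check described above: a non-empty shop column means the user
    is currently located in the shop indicated by that column."""
    row = user_table.get(identity)
    return row is not None and row["shop_info"] is not None

enter_region("user-1", {"id": "shop-A", "name": "Shop A"})
print(is_in_target_region("user-1"))  # True
print(is_in_target_region("user-2"))  # False
```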
In step 306, the server obtains the external physical characteristic of the user corresponding to the face feature matched with the sample face feature of the target user, and generates a user information set according to the identity of the target user and the obtained external physical characteristic.
In the present embodiment, after receiving the identity and the sample face feature of the target user sent by the Cloud Server, the server may first obtain, according to the sample face feature, the external physical characteristic of the user corresponding to the matched face feature, and may then generate the user information set according to the identity of the target user and the obtained external physical characteristic.
It should be noted that if the server does not receive an identity of the target user from the Cloud Server, the user corresponding to the face feature may be an unregistered user. In that case, the server may delete the data related to that user, such as the facial image and the external physical characteristic.
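Step 306, including the unregistered-user cleanup, might look roughly as follows on the shop server. The data shapes and the keying of pending entries by a face-feature identifier are assumptions; the patent only names the fields involved.

```python
# Sketch of step 306 on the shop server: pair the identity returned by the
# Cloud Server with the locally extracted external physical characteristic.

pending = {}          # face-feature key -> external physical characteristic
user_info_set = {}    # identity -> user information record

def on_entry(face_key, body_feature):
    """Hold the extracted external physical characteristic until the
    Cloud Server replies with a matching identity."""
    pending[face_key] = body_feature

def on_cloud_reply(face_key, identity):
    """identity is None when the Cloud Server found no registered match;
    in that case the related data is simply dropped."""
    body = pending.pop(face_key, None)
    if identity is None or body is None:
        return
    user_info_set[identity] = {"identity": identity, "body": body}

on_entry("f1", {"height_ratio": 0.42})
on_cloud_reply("f1", "user-1")
on_entry("f2", {"height_ratio": 0.37})
on_cloud_reply("f2", None)        # unregistered user: data deleted
print(sorted(user_info_set))      # ['user-1']
print(pending)                    # {}
```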
In the navigation system provided in this embodiment, when a user enters the target position region, the photographic devices located at the entrance acquire the facial image and humanoid image of the user. The server corresponding to the target position region then extracts the face feature and external physical characteristic of the user, and, according to the face feature, obtains the identity of the user from the Cloud Server and associates it with the external physical characteristic. In this way, the current location of the user in the target position region may be determined in real time using the photographic devices of the target position region, which improves the accuracy of positioning. Meanwhile, after receiving the face feature sent by the server, the Cloud Server can confirm that the user is currently located at the target position region. That is, the current location of the user can be determined without using the terminal of the user, which reduces the configuration requirements and energy consumption of the terminal. In addition, the server is mainly used for face feature extraction and user positioning, while the Cloud Server is mainly used for face feature recognition. Through this load balancing, the amount of data to be processed by the server and the Cloud Server can be reduced, thereby improving processing efficiency.
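The division of labor described above can be made concrete as the two messages exchanged between the shop server and the Cloud Server. The message names and fields below are illustrative assumptions, not part of the claimed protocol.

```python
# Hypothetical message types for the server/Cloud-Server split:
# extraction and positioning stay on the shop server, recognition on the
# Cloud Server.
from dataclasses import dataclass, field

@dataclass
class FaceFeatureUpload:
    """Shop server -> Cloud Server (step 302)."""
    shop_id: str
    face_feature: list = field(default_factory=list)

@dataclass
class MatchResult:
    """Cloud Server -> shop server (step 304)."""
    identity: str
    sample_face_feature: list = field(default_factory=list)

up = FaceFeatureUpload(shop_id="shop-A", face_feature=[0.1, 0.2])
down = MatchResult(identity="user-1", sample_face_feature=[0.1, 0.2])
print(up.shop_id, down.identity)  # shop-A user-1
```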
With continued reference to FIG. 4, a timing diagram of a further embodiment of the navigation system provided by the present application is illustrated.
As shown in FIG. 4, in step 401, the server receives a facial image acquired by the photographic device located at the exit of the target position region, and matches the facial image with the user information set.
In the present embodiment, the user information in the above user information set may further include the sample face feature of the user. In this way, when the user is about to leave the target position region, the server (such as the server 104 shown in FIG. 1) may receive the facial image acquired by the photographic device (such as the photographic devices 107 and 108 shown in FIG. 1) located at the exit of the target position region. The server may then extract the face feature of the user and match it against the sample face features in the user information of the user information set.
In step 402, the server removes the user information matched with the facial image from the user information set.
In the present embodiment, according to the matching result of step 401, the server may remove from the user information set the user information matched with the extracted face feature. That is, when a user leaves the target position region, the server corresponding to the target position region clears the information related to that user, which reduces the occupancy of storage space and improves the operating efficiency of the server.
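Steps 401-402 amount to a match-and-evict operation over the user information set, which might be sketched as follows. The similarity function and threshold are assumptions; any face-matching criterion could be substituted.

```python
# Sketch of steps 401-402: match the exit-camera face feature against the
# stored sample face features and evict the matching entry.

def evict_on_exit(user_info_set, exit_feature, similar, threshold=0.9):
    """Remove the entry whose sample face feature matches the exit-camera
    feature, returning its identity (to be reported to the Cloud Server,
    step 403), or None when nothing matches."""
    for identity, info in list(user_info_set.items()):
        if similar(exit_feature, info["sample_face"]) > threshold:
            del user_info_set[identity]
            return identity
    return None

infos = {"user-1": {"sample_face": [1.0, 0.0]}}
same = lambda a, b: 1.0 if a == b else 0.0   # toy similarity for the demo
print(evict_on_exit(infos, [1.0, 0.0], same))  # user-1
print(infos)                                    # {}
```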
In step 403, the server sends the identity in the removed user information to the Cloud Server.
In the present embodiment, since the user has left the target position region, the server may send the identity in the removed user information to the Cloud Server (such as the Cloud Server 105 shown in FIG. 1), so that the Cloud Server confirms that the user is no longer in the target position region.
It can be understood that the server may also send the facial image (or the extracted face feature) in step 401, or the sample face feature in the removed user information, to the Cloud Server, so that the Cloud Server confirms that the corresponding user is no longer in the target position region.
In step 404, the Cloud Server removes the information of the target position region corresponding to the identity sent by the server.
In the present embodiment, after receiving the identity in the removed user information sent by the server, the Cloud Server may clear the information of the target position region corresponding to that identity stored thereon. For example, the Cloud Server may clear the relevant information in the shop information column corresponding to the identity in the user information table.
In the navigation system provided in this embodiment, when a user is about to leave the target position region, the photographic device located at the exit acquires the facial image of the user. The server may then remove the user information related to the user according to the facial image, so as to clear its stored content in time and improve operating efficiency. In addition, by sending the identity of the user to the Cloud Server, the Cloud Server is made aware that the user has left the target position region, so that it can update its stored content in time.
Referring to FIG. 5, a process 500 of one embodiment of the navigation method provided by the present application is illustrated. The process 500 of the navigation method includes the following steps:
Step 501, receiving a navigation request sent by a user.
In the present embodiment, the electronic equipment on which the navigation method runs (such as the server 104 shown in FIG. 1) may receive, through a wired or wireless connection, the navigation request sent by the user. The navigation request may include a destination location and the identity of the user. The identity here may be the same as the identity in the above embodiments, and details are not repeated here.
Optionally, the user may send the navigation request using a navigation application presented on a terminal (such as the terminals 101 and 102 shown in FIG. 1). In this case, after receiving the navigation request, the Cloud Server supporting the navigation application (such as the Cloud Server 105 shown in FIG. 1) may forward it to the electronic equipment.
Further, after receiving the navigation request, the Cloud Server may first confirm whether the user sending the navigation request is currently located at the target position region. If it is confirmed that the user is currently located at the target position region, the navigation request may be sent to the electronic equipment corresponding to the target position region. For the detailed process, reference may be made to the related description in the above embodiments, and details are not repeated here.
Step 502, choosing, according to the identity, user information corresponding to the identity from a pre-stored user information set.
In the present embodiment, the electronic equipment may choose, according to the identity in the navigation request, the user information corresponding to the identity from the pre-stored user information set. The user information includes the identity and the external physical characteristic of the user. For the detailed process, reference may be made to the related description in the above embodiments, and details are not repeated here.
Step 503, receiving an image currently acquired by the photographic device located at the target position region.
In the present embodiment, the electronic equipment may receive, through a wired or wireless connection, the image currently acquired by the photographic device (such as the photographic devices 107 and 108 shown in FIG. 1) located at the target position region.
Step 504, determining, according to the image and the chosen external physical characteristic, the current location of the user in the target position region.
In the present embodiment, the electronic equipment may confirm the current location of the user in the target position region according to the external physical characteristic chosen in step 502 and the image received in step 503. For the detailed process, reference may be made to the related description in the above embodiments, and details are not repeated here.
Step 505, generating navigation data according to the current location and the destination location, and sending it to the user.
In the present embodiment, the electronic equipment may perform path planning according to the destination location in the navigation request and the current location determined in step 504, so as to generate the navigation data, and then send the navigation data to the user. For the detailed process, reference may be made to the related description in the above embodiments, and details are not repeated here.
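Steps 501-505 taken together can be sketched as a single request handler. The appearance-based locator and the returned route shape below are stand-ins (the patent leaves both the matching rule and the path planner unspecified).

```python
# End-to-end sketch of steps 501-505 on the electronic equipment.
# Data shapes and the equality-based appearance match are assumptions.

user_info_set = {"user-1": {"body": {"shirt": "red"}}}

def locate(body_feature, image_objects):
    """Step 504 stand-in: find the detected person whose appearance
    matches the stored external physical characteristic."""
    for obj in image_objects:
        if obj["appearance"] == body_feature:
            return obj["position"]
    return None

def handle_navigation_request(request, image_objects):
    info = user_info_set.get(request["identity"])           # step 502
    if info is None:
        return None
    current = locate(info["body"], image_objects)           # steps 503-504
    if current is None:
        return None
    # step 505: a real system would run a path planner here.
    return {"from": current, "to": request["destination"]}

detections = [{"appearance": {"shirt": "red"}, "position": (3, 4)}]
req = {"identity": "user-1", "destination": (10, 2)}
print(handle_navigation_request(req, detections))
# {'from': (3, 4), 'to': (10, 2)}
```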
In some optional implementations of the present embodiment, the method may further include: receiving a facial image and a humanoid image acquired by the photographic device located at the entrance of the target position region, and extracting a face feature and an external physical characteristic; obtaining the identity of the user corresponding to the face feature; and generating a user information set according to the obtained identity and the external physical characteristic.
Optionally, the user information in the user information set may further include the sample face feature of the user; and the method may further include: receiving a facial image acquired by the photographic device located at the exit of the target position region, and matching the facial image with the user information set; and removing the user information matched with the facial image from the user information set.
In the navigation method provided by the embodiments of the present application, by obtaining the identity and the external physical characteristic of the user, the images acquired by the photographic device of the target position region where the user is currently located may be used to determine the current specific location of the user in that region. This improves the accuracy of positioning. At the same time, the need to perform positioning with the terminal is reduced, which in turn reduces the energy consumption and hardware configuration requirements of the terminal.
With further reference to FIG. 6, as an implementation of the methods shown in the above figures, the present application provides one embodiment of a navigation device. The device embodiment corresponds to the method embodiment shown in FIG. 5, and the device may specifically be applied in various electronic equipment.
As shown in FIG. 6, the navigation device 600 of the present embodiment may include: a first receiving unit 601, configured to receive a navigation request sent by a user, the navigation request including a destination location and the identity of the user; a selection unit 602, configured to choose, according to the identity, user information corresponding to the identity from a pre-stored user information set, the user information including the identity and the external physical characteristic of the user; a second receiving unit 603, configured to receive an image currently acquired by the photographic device located at the target position region; a determination unit 604, configured to determine, according to the image and the chosen external physical characteristic, the current location of the user in the target position region; and a generation unit 605, configured to generate navigation data according to the current location and the destination location, and send it to the user.
In the present embodiment, for the specific implementation and beneficial effects of the first receiving unit 601, the selection unit 602, the second receiving unit 603, the determination unit 604 and the generation unit 605, reference may be made to the related descriptions of step 501, step 502, step 503, step 504 and step 505 in the embodiment shown in FIG. 5, respectively, and details are not repeated here.
In some optional implementations of the present embodiment, the device may further include: a fourth receiving unit (not shown), configured to receive a facial image and a humanoid image acquired by the photographic device located at the entrance of the target position region, and to extract a face feature and an external physical characteristic; an acquiring unit (not shown), configured to obtain the identity of the user corresponding to the face feature; and an information set generation unit (not shown), configured to generate a user information set according to the obtained identity and the external physical characteristic.
Optionally, the user information in the user information set may further include the sample face feature of the user. The device 600 may further include: a third receiving unit (not shown), configured to receive a facial image acquired by the photographic device located at the exit of the target position region, and to match the facial image with the user information set; and a clearing unit (not shown), configured to remove the user information matched with the facial image from the user information set.
Referring now to FIG. 7, a structural schematic diagram of a computer system 700 of electronic equipment suitable for implementing the embodiments of the present application is illustrated. The electronic equipment shown in FIG. 7 is only an example, and should not impose any restriction on the function and scope of use of the embodiments of the present application.
As shown in FIG. 7, the computer system 700 includes a central processing unit (CPU) 701, which may execute various appropriate actions and processing according to a program stored in a read-only memory (ROM) 702 or a program loaded from a storage portion 708 into a random access memory (RAM) 703. The RAM 703 also stores various programs and data required by the operations of the system 700. The CPU 701, the ROM 702 and the RAM 703 are connected with each other through a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
The following components are connected to the I/O interface 705: an input portion 706 including a touch screen, a keyboard, a photographic device, etc.; an output portion 707 including a cathode-ray tube (CRT), a liquid crystal display (LCD), a loudspeaker, etc.; a storage portion 708 including a hard disk, etc.; and a communication portion 709 including a network interface card such as a LAN card, a modem, etc. The communication portion 709 performs communication processing via a network such as the Internet. A driver 710 is also connected to the I/O interface 705 as needed. A removable medium 711, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, is mounted on the driver 710 as needed, so that a computer program read therefrom is installed into the storage portion 708 as needed.
In particular, according to the embodiments of the present disclosure, the process described above with reference to the flow chart may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product, which comprises a computer program carried on a computer-readable medium, the computer program comprising program code for executing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 709, and/or installed from the removable medium 711. When the computer program is executed by the central processing unit (CPU) 701, the above functions defined in the method of the present application are executed.
It should be noted that the computer-readable medium of the present application may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any appropriate combination of the above. In the present application, the computer-readable storage medium may be any tangible medium that contains or stores a program which can be used by or in combination with an instruction execution system, apparatus or device.
In the present application, the computer-readable signal medium may include a data signal propagated in a baseband or as a part of a carrier wave, in which computer-readable program code is carried. The propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal or any appropriate combination of the above. The computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium, and it can send, propagate or transmit a program for use by or in combination with an instruction execution system, apparatus or device. The program code contained on the computer-readable medium may be transmitted with any suitable medium, including but not limited to: wireless, wire, optical cable, RF, or any appropriate combination of the above.
The flow charts and block diagrams in the accompanying drawings illustrate the architectures, functions and operations that may be implemented by the systems, methods and computer program products according to the various embodiments of the present application. In this regard, each box in the flow charts or block diagrams may represent a module, a program segment, or a portion of code, which comprises one or more executable instructions for implementing the specified logical functions. It should also be noted that, in some alternative implementations, the functions marked in the boxes may occur in an order different from that marked in the drawings. For example, two boxes shown in succession may in fact be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functions involved. It should further be noted that each box in the block diagrams and/or flow charts, and combinations of boxes in the block diagrams and/or flow charts, may be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by means of software or by means of hardware. The described units may also be provided in a processor; for example, a processor may be described as comprising a first receiving unit, a selection unit, a second receiving unit, a determination unit and a generation unit. The names of these units do not in some cases constitute a limitation on the units themselves; for example, the first receiving unit may also be described as "a unit for receiving a navigation request sent by a user".
As another aspect, the present application also provides a computer-readable medium, which may be included in the electronic equipment described in the above embodiments, or may exist alone without being assembled into the electronic equipment. The computer-readable medium carries one or more programs which, when executed by the electronic equipment, cause the electronic equipment to: receive a navigation request sent by a user, the navigation request including a destination location and the identity of the user; choose, according to the identity, user information corresponding to the identity from a pre-stored user information set, the user information including the identity and the external physical characteristic of the user; receive an image currently acquired by the photographic device located at the target position region; determine, according to the image and the chosen external physical characteristic, the current location of the user in the target position region; and generate navigation data according to the current location and the destination location, and send it to the user.
The above description is only a preferred embodiment of the present application and an explanation of the technical principles applied. Those skilled in the art should appreciate that the scope of the invention involved in the present application is not limited to the technical solutions formed by the specific combinations of the above technical features; it should also cover, without departing from the above inventive concept, other technical solutions formed by any combination of the above technical features or their equivalent features, for example, technical solutions formed by mutually replacing the above features with (but not limited to) the technical features with similar functions disclosed herein.
Claims (14)
1. A navigation system, comprising: a Cloud Server, a server and a photographic device;
the Cloud Server is configured to receive a navigation request sent by a user, wherein the navigation request includes a destination location and the identity of the user; determine whether the user is currently located at a target position region; and, in response to determining that the user is currently located at the target position region, send the navigation request to a server corresponding to the target position region;
the server is configured to choose, according to the identity, user information corresponding to the identity from a pre-stored user information set, wherein the user information includes the identity and an external physical characteristic of the user; receive an image currently acquired by the photographic device located at the target position region; determine, according to the image and the chosen external physical characteristic, a current location of the user in the target position region; and generate navigation data according to the current location and the destination location, and send the navigation data to the user.
2. The system according to claim 1, wherein the server is further configured to:
receive a facial image and a humanoid image acquired by the photographic device located at an entrance of the target position region; and
extract a face characteristic and an external physical characteristic, and send the face characteristic to the Cloud Server.
3. The system according to claim 2, wherein the Cloud Server is further configured to:
match the face characteristic with a pre-stored sample face characteristic set;
send a sample face characteristic and an identity of a target user to the server, wherein the target user is the user corresponding to the sample face characteristic in the sample face characteristic set that matches the face characteristic; and
generate a corresponding relationship between the identity of the target user and information of the target position region corresponding to the server.
4. The system according to claim 3, wherein the determining whether the user is currently located at a target position region comprises:
determining whether there is information of a target position region corresponding to the identity of the user; and
if so, determining that the user is currently located at the target position region.
5. The system according to claim 4, wherein the server is further configured to:
obtain the external physical characteristic of the user corresponding to the face characteristic matched with the sample face characteristic of the target user; and
generate the user information set according to the identity of the target user and the obtained external physical characteristic.
6. The system according to claim 5, wherein the user information in the user information set further includes a sample face characteristic of the user; and
the server is further configured to:
receive a facial image acquired by the photographic device located at an exit of the target position region, and match the facial image with the user information set;
remove the user information matched with the facial image from the user information set; and
send the identity in the removed user information to the Cloud Server.
7. The system according to claim 6, wherein the Cloud Server is further configured to:
remove the information of the target position region corresponding to the identity sent by the server.
8. A navigation method, comprising:
receiving a navigation request sent by a user, wherein the navigation request includes a destination location and the identity of the user;
choosing, according to the identity, user information corresponding to the identity from a pre-stored user information set, wherein the user information includes the identity and an external physical characteristic of the user;
receiving an image currently acquired by a photographic device located at a target position region;
determining, according to the image and the chosen external physical characteristic, a current location of the user in the target position region; and
generating navigation data according to the current location and the destination location, and sending the navigation data to the user.
9. The method according to claim 8, wherein the method further comprises:
receiving a facial image and a humanoid image acquired by a photographic device located at an entrance of the target position region, and extracting a face characteristic and the external physical characteristic;
obtaining the identity of the user corresponding to the face characteristic; and
generating the user information set according to the obtained identity and the external physical characteristic.
10. The method according to claim 9, wherein the user information in the user information set further includes a sample face characteristic of the user; and
the method further comprises:
receiving a facial image acquired by the photographic device located at an exit of the target position region, and matching the facial image with the user information set; and
removing the user information matched with the facial image from the user information set.
11. A navigation device, comprising:
a first receiving unit, configured to receive a navigation request sent by a user, wherein the navigation request includes a destination location and the identity of the user;
a selection unit, configured to choose, according to the identity, user information corresponding to the identity from a pre-stored user information set, wherein the user information includes the identity and an external physical characteristic of the user;
a second receiving unit, configured to receive an image currently acquired by a photographic device located at a target position region;
a determination unit, configured to determine, according to the image and the chosen external physical characteristic, a current location of the user in the target position region; and
a generation unit, configured to generate navigation data according to the current location and the destination location, and send the navigation data to the user.
12. The device according to claim 11, wherein the user information in the user information set further includes a sample face characteristic of the user; and
the device further comprises:
a third receiving unit, configured to receive a facial image acquired by the photographic device located at an exit of the target position region, and to match the facial image with the user information set; and
a clearing unit, configured to remove the user information matched with the facial image from the user information set.
13. An electronic equipment, comprising:
one or more processors;
a photographic device for acquiring images; and
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 8-10.
14. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method according to any one of claims 8-10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711081993.XA CN109752001B (en) | 2017-11-07 | 2017-11-07 | Navigation system, method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711081993.XA CN109752001B (en) | 2017-11-07 | 2017-11-07 | Navigation system, method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109752001A true CN109752001A (en) | 2019-05-14 |
CN109752001B CN109752001B (en) | 2021-07-06 |
Family
ID=66399601
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711081993.XA Active CN109752001B (en) | 2017-11-07 | 2017-11-07 | Navigation system, method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109752001B (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN112149454A (*) | 2019-06-26 | 2020-12-29 | 杭州海康威视数字技术股份有限公司 | Behavior recognition method, device and equipment
CN111678519A (*) | 2020-06-05 | 2020-09-18 | 北京都是科技有限公司 | Intelligent navigation method, device and storage medium
CN112135242A (*) | 2020-08-11 | 2020-12-25 | 科莱因(苏州)智能科技有限公司 | Building visitor navigation method based on 5G and face recognition
CN112135242B (*) | 2020-08-11 | 2023-05-02 | 科莱因(苏州)智能科技有限公司 | Building visitor navigation method based on 5G and face recognition
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104422439A (en) * | 2013-08-21 | 2015-03-18 | 希姆通信息技术(上海)有限公司 | Navigation method, apparatus, server, navigation system and use method of system |
CN105371850A (en) * | 2015-11-17 | 2016-03-02 | 广东欧珀移动通信有限公司 | Route navigation method and mobile terminal |
CN105426476A (en) * | 2015-11-17 | 2016-03-23 | 广东欧珀移动通信有限公司 | Method for generating navigation route and terminal |
CN106370174A (en) * | 2015-07-23 | 2017-02-01 | 腾讯科技(深圳)有限公司 | Position navigation method and position navigation apparatus based on enterprise communication software |
CN106871898A (*) | 2016-12-30 | 2017-06-20 | 山东中架工人信息技术股份有限公司 | RIM stereoscopic 3D micro-navigation system and method for generating navigation |
CN106991839A (en) * | 2016-01-20 | 2017-07-28 | 罗伯特·博世有限公司 | Pedestrian navigation method and corresponding central computation unit and portable set |
CN107314769A (*) | 2017-06-19 | 2017-11-03 | 成都领创先科技有限公司 | Indoor occupant positioning system with strong security |
Also Published As
Publication number | Publication date |
---|---|
CN109752001B (en) | 2021-07-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11042751B2 (en) | Augmented reality assisted pickup | |
EP2975555B1 (en) | Method and apparatus for displaying a point of interest | |
KR102222250B1 (en) | Method and Apparatus for Providing Route Guidance using Reference Points | |
US10527446B2 (en) | System and method for determining location | |
WO2013036003A2 (en) | Method and system for collectively managing coupons using a mobile communication terminal | |
CN113282845A (en) | Interactive task pushing method and device | |
US9172767B2 (en) | Mobile terminal, data distribution server, data distribution system, and data distribution method | |
EP2988473B1 (en) | Augmented reality content screening method, apparatus, and system |
CN109374002A (en) | Air navigation aid and system, computer readable storage medium | |
CN109752001A (en) | Navigation system, method and apparatus | |
CN109872392A (en) | Man-machine interaction method and device based on high-precision map | |
CN109029466A (en) | indoor navigation method and device | |
CN109145226A (en) | Content delivery method and device | |
CN103475689A (en) | Apparatus and method for providing augmented reality service | |
CN109978220A (en) | For providing system, equipment, method and the medium of guidance information | |
CN104061925A (en) | Indoor navigation system based on intelligent glasses | |
CN111937026A (en) | Analysis system | |
CN110413869A (en) | Method and apparatus for pushed information | |
JP6442827B2 (en) | Information providing apparatus, information providing program, information providing method, and information providing system | |
CN108107457A (en) | For obtaining the method and apparatus of location information | |
CN107084728A (en) | Method and apparatus for detecting numerical map | |
CN110719324A (en) | Information pushing method and equipment | |
WO2019124851A1 (en) | Method and system for cloud sourcing geofencing-based content | |
CN110879068A (en) | Indoor map navigation method and device | |
JP6133967B1 (en) | Information processing system, information processing device, information processing method, information processing program, portable terminal device, and control program therefor |
Legal Events
Date | Code | Title | Description
---|---|---|---
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||