CN110470312B - Navigation method based on optical label network and corresponding computing equipment - Google Patents


Info

Publication number
CN110470312B
Authority
CN
China
Prior art keywords
navigation
optical label
navigation object
optical
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810435184.2A
Other languages
Chinese (zh)
Other versions
CN110470312A (en)
Inventor
方俊
牛旭恒
李江亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Whyhow Information Technology Co Ltd
Original Assignee
Beijing Whyhow Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Whyhow Information Technology Co Ltd filed Critical Beijing Whyhow Information Technology Co Ltd
Priority to CN201810435184.2A priority Critical patent/CN110470312B/en
Priority to PCT/CN2019/085995 priority patent/WO2019214640A1/en
Priority to TW108116062A priority patent/TWI733107B/en
Publication of CN110470312A publication Critical patent/CN110470312A/en
Application granted granted Critical
Publication of CN110470312B publication Critical patent/CN110470312B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W64/00Locating users or terminals or network equipment for network management purposes, e.g. mobility management

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Navigation (AREA)

Abstract

A navigation method based on an optical label network, wherein the optical label network comprises a server and a plurality of optical labels, each of which can transmit its identification information, and the server stores the identification information and position information of the optical labels. The method comprises: determining a starting point and a destination point of a navigation object; providing a planned travel route for the navigation object based on the starting point and the destination point in combination with the optical label network, wherein one or more optical labels are deployed along the travel route; obtaining, during navigation, position information of the navigation object derived from the position information of an optical label; and continuing to provide navigation for the navigation object based on the position information of the navigation object.

Description

Navigation method based on optical label network and corresponding computing equipment
Technical Field
The invention relates to the field of optical information technology and location service, in particular to a navigation method based on an optical label network.
Background
Bar codes and two-dimensional codes have been widely used to encode information. When these codes are scanned by a specific device or software, the corresponding information is recognized. However, their identification distance is very limited. For example, when a two-dimensional code is scanned with a cell phone camera, the phone must generally be held within a relatively short distance, typically only about 15 times the width of the code. Long-distance identification (e.g., at a distance equal to 200 times the width of a two-dimensional code) is therefore generally not achievable with bar codes and two-dimensional codes unless very large codes are custom-made, which increases cost and is in many cases impractical because of various other limitations.
Optical labels, which may also be referred to as optical communication devices, convey information by emitting different lights. They offer a long identification distance, loose requirements on visible light conditions, strong directivity, and locatability, and the information they convey can change dynamically over time, thereby providing a large information capacity (see, for example, the optical communication device described in Chinese patent publication CN105740936A). Compared with the traditional two-dimensional code, the optical label therefore has a stronger information interaction capability and a longer identification distance, providing great convenience to users and merchants.
With the application and popularization of optical labels, how to jointly use the many optical labels deployed in the surrounding environment to provide services such as location and information inquiry, navigation, and positioning is of great research and commercial value.
Disclosure of Invention
The invention aims to provide a navigation method based on an optical label network.
One aspect of the present invention provides a navigation method based on an optical label network, the optical label network including a server and a plurality of optical labels, each of the plurality of optical labels being capable of communicating identification information thereof, the server being configured to store the identification information and location information of the optical labels, the method including:
determining a starting point and a destination point of a navigation object;
providing a planned travel route for the navigation object based on the starting point and the destination point and in combination with the optical label network, wherein the travel route is provided with one or more optical labels along the travel route;
obtaining position information of the navigation object obtained based on the position information of the optical label in the navigation process; and
continuing to provide navigation for the navigation object based on the position information of the navigation object.
Preferably, the position information of the navigation object is obtained by:
the navigation object collects information transmitted by the optical label through image collection equipment carried by the navigation object so as to obtain identification information of the optical label;
the navigation object obtains the position information of the optical label from the server through the identification information of the optical label;
the navigation object determines the relative position relation between the navigation object and the optical label; and
obtaining the position information of the navigation object based on the relative position relationship between the navigation object and the optical label and the position information of the optical label.
Preferably, the navigation object determines its relative positional relationship to the optical label by reverse positioning.
Preferably, the continuing to provide navigation for the navigation object based on the position information of the navigation object comprises:
determining whether the navigation object has deviated from the planned travel route based on the position information of the navigation object;
if there is no deviation, continuing navigation based on the planned travel route;
if a deviation is detected, providing a planned travel route for the navigation object again based on the position information of the navigation object and the destination point, in combination with the optical label network.
Preferably, the providing a planned travel route for the navigation object based on the start point and the destination point and in combination with the optical label network includes:
determining a feasible route using the starting point and the destination point; and
providing one or more planned travel routes for the navigation object based on the deployment of optical labels on each feasible route.
Preferably, the starting point is obtained by:
the navigation object collects information transmitted by the optical label through image collection equipment carried by the navigation object so as to obtain identification information of the optical label;
the navigation object obtains the position information of the optical label from the server through the identification information of the optical label;
the navigation object determines the relative position relation between the navigation object and the optical label; and
obtaining the position information of the navigation object, as the starting point, based on the relative position relationship between the navigation object and the optical label and the position information of the optical label.
Preferably, the navigation method further comprises: indicating how the navigation object travels from one optical label to the next based on the relative positional relationships between the optical labels along the planned travel route.
Preferably, the navigation method further comprises: associating the optical label with the store or building where it is located, so that an associated prompt can be given to the navigation object during navigation.
Preferably, the navigation method further comprises: combining one or more of GPS navigation, inertial navigation, or visual navigation with the navigation process.
Preferably, the navigation object is a user holding the image capturing apparatus or a machine capable of autonomous movement in which the image capturing apparatus is installed.
Another aspect of the invention provides a computing device comprising a processor and a memory, the memory having stored therein a computer program which, when executed by the processor, is operable to implement the navigation method described above.
A further aspect of the invention provides a storage medium in which a computer program is stored, which computer program, when being executed, is adapted to carry out the above-mentioned navigation method.
Drawings
Embodiments of the invention are further described below with reference to the accompanying drawings, in which:
FIG. 1 is a schematic diagram of an optical label network according to one embodiment of the present invention;
FIG. 2 is a flowchart illustrating a navigation method based on an optical label network according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail by embodiments with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In embodiments of the present invention, the optical label may be any optical communication device capable of transmitting different information by emitting different light. In one embodiment, an optical label may include at least one light source and a controller for controlling different lights emitted by the light source to convey different information. For example, the controller may cause the light sources to emit different light by changing a property of the light emitted by the light sources. The property of the light may be any property that can be perceived by an optical imaging device (e.g., a CMOS or CCD imaging device); for example, it may be a property that is perceptible to the human eye, such as intensity, color, wavelength, etc., or it may be another property that is not perceptible to the human eye, such as an intensity, color, or wavelength change of electromagnetic wavelengths outside the visible range of the human eye, or any combination of the above properties. Thus, the change in the property of the light may be a change in a single property or a change in a combination of two or more properties. When selecting the intensity of the light as the property, this can be achieved simply by selecting the light source to be switched on or off.
Various forms of light sources may be used in the optical label, so long as certain properties thereof are perceptible by the optical imaging device. Various common optical devices may be included in the light source, such as light guide plates, diffuser plates, diffusers, and the like. For example, the light source may be one LED lamp, an array of a plurality of LED lamps, a display screen, or a part thereof, and even an irradiation area of light (for example, an irradiation area of light on a wall) may be used as the light source. The shape of the light source may be various shapes such as a circle, sphere, square, rectangle, bar, etc.
In one embodiment, the controller of the optical label may control the properties of the light emitted by each light source in order to convey information. For example, a "0" or a "1" of binary digital information may be represented by controlling the turning on and off of each light source, so that a plurality of light sources in the optical label may be used to represent a sequence of binary digital information. As will be appreciated by those skilled in the art, each light source may be used to represent not only a binary number, but also ternary or higher data. For example, each light source may be enabled to represent ternary or higher data by setting the intensity of light emitted by the light source to be selected from three or more levels, or by setting the color of light emitted by the light source to be selected from three or more colors, or even by employing a combination of intensity and color. Therefore, compared with the traditional two-dimensional code, the optical label can remarkably improve the data coding density.
In yet another embodiment, the controller of the optical label may control the light source to change the properties of the light it emits at a certain frequency, so that the optical label of the present invention can represent different data information, e.g., different sequences of binary digital information, at different times. In this way, when the optical label of the present invention is continuously photographed using an optical imaging device (e.g., at a rate of 30 frames/second), each frame of the image can be used to represent a set of information sequences, thereby further significantly improving the data encoding density compared with the traditional static two-dimensional code.
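As an illustrative aid only (not part of the described device), the following Python sketch shows how a controller could map a sequence of binary digits onto per-frame on/off states of a single light source, assuming one light-property change per imaging frame; the function name and the 30 frames/second rate are example assumptions.

```python
# Illustrative sketch: mapping a bit string to per-frame ON/OFF states of a
# single light source, assuming one property change per imaging frame.
# The 30 frames/second rate and the function name are assumptions.

def bits_to_frame_states(bits: str, frame_rate: int = 30):
    """Return (duration_seconds, states), where states[i] is True if the
    light source should be ON during frame i."""
    states = [bit == "1" for bit in bits]
    return len(states) / frame_rate, states

duration, states = bits_to_frame_states("101100111000")
print(duration)    # 0.4 seconds at 30 frames/second
print(states[:4])  # [True, False, True, True]
```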
In embodiments of the present application, the optical label may be imaged by an optical imaging device or image acquisition device commonly known in the art, and the information to be transferred (for example, a sequence of binary 1s and 0s) is determined from each frame of the image, thereby realizing information transfer from the optical label to the optical imaging device. The optical imaging device or image acquisition device may comprise an image acquisition element, a processor, a memory, and so on; it may be, for example, a mobile terminal with a shooting function, such as a mobile phone, a tablet computer, or smart glasses, which includes an image capturing apparatus and an image processing module. The user finds the optical label with the naked eye within line-of-sight distance, points the imaging sensor of the mobile terminal at the optical label, and scans it to capture and interpret the transmitted information. When the controller of the optical label changes the attribute of the emitted light at a certain frequency, the image capturing frequency of the mobile terminal may be set to at least twice the attribute change frequency of the light source. The recognition and decoding process is completed by performing a decoding operation on the acquired image frames. In one embodiment, to avoid duplication or omission of image frames, a sequence number, check bits, a timestamp, and the like may be included in the information conveyed by the optical label. If desired, a start frame or an end frame (or both) may be provided among the image frames to indicate the start or end position of a complete cycle, and such frames may be set to display a particular data combination, for example all 0s or all 1s, or any particular combination that cannot be confused with the information actually displayed.
Taking a CMOS imaging device as an example, when continuous multi-frame images of the light source are captured by the CMOS imaging device, the controller sets the switching interval between the operating modes of the light source equal to the imaging duration of one complete frame of the CMOS imaging device, thereby achieving frame synchronization between the light source and the imaging device. Assuming that each light source transmits 1 bit of information per frame, each light source can deliver 30 bits of information per second at a shooting speed of 30 frames/second, giving a coding space of up to 2^30. The information may include, for example, a start frame marker (frame header), the ID of the optical label, a password, a verification code, website address information, a timestamp, or various combinations thereof. A data packet structure can be formed by arranging these items in a structured order. Each time a complete packet structure is received, a complete set of data (one packet) is considered to have been obtained, which can then be read and verified. Table 1 gives an example data packet structure according to one embodiment of the invention:
TABLE 1
Frame header | Attribute field (optional) | Data field | Check bit | Frame end
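Purely as an illustrative sketch of how a receiver could assemble the per-frame bits into the packet structure of Table 1, the following fragment searches a received bit stream for a frame header and verifies a single parity check bit; the all-ones header pattern, the field widths and the parity convention are assumptions, not part of the described embodiment.

```python
# Decoding sketch for the Table 1 packet layout. Assumptions (illustration
# only): an all-ones header marks the frame start, the data field is 16 bits
# wide, and a single even-parity check bit protects the data field.

HEADER = [1, 1, 1, 1, 1, 1, 1, 1]  # assumed start-frame marker
DATA_BITS = 16                     # assumed data-field width

def try_decode(bitstream):
    """Scan a list of 0/1 integers for one packet; return the data bits if
    the parity check passes, otherwise None."""
    n = len(HEADER)
    for i in range(len(bitstream) - n - DATA_BITS):
        if bitstream[i:i + n] == HEADER:
            data = bitstream[i + n : i + n + DATA_BITS]
            check = bitstream[i + n + DATA_BITS]
            if sum(data) % 2 == check:   # parity verification
                return data
    return None
```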
Compared with the traditional two-dimensional code, the optical label transmits information by emitting different lights and has the advantages of a long identification distance, loose requirements on visible light conditions, strong directivity, and locatability, while the information it conveys can change rapidly over time, so that a large information capacity can be provided. The optical label therefore has a stronger information interaction capability, providing great convenience to users and merchants.
In order to provide corresponding services to users and merchants based on optical labels, each optical label may be assigned identification information (ID) that uniquely identifies the optical label for its manufacturer, manager, user, and so on. In general, the identification information can be published by the optical label itself; the user can acquire the information (e.g., the identification information) transmitted by the optical label by capturing an image of the optical label with, for example, an image acquisition device or an imaging device built into a mobile phone, and can then access the corresponding service based on that information, for example accessing a web page associated with the identification information, or obtaining other information associated with it (e.g., the position information of the optical label corresponding to the identification information).
Referring now to FIG. 1, a schematic diagram of an optical label network according to one embodiment of the present invention is shown. The optical label network includes a plurality of optical labels and at least one server. Each optical label typically includes at least one light source and a controller that controls the light source to emit light in different patterns so as to convey different information, which may include, for example, the identification information of the optical label. An optical label may be a fixed optical label or a mobile optical label. A fixed optical label generally refers to an optical label whose position remains substantially unchanged, for example an optical label mounted above a store entrance or on a building. A mobile optical label generally refers to an optical label whose position may change at any time, for example an optical label mounted on a moving object such as a car, or an optical label worn by a person. As shown in FIG. 1, the information associated with each optical label may be stored on the server. For example, the server may maintain the identification information (ID) and position information of each optical label, and optionally other information, such as whether the optical label is fixed or mobile, service information associated with the optical label, and other descriptive information or attributes, such as the physical size and orientation of the optical label. A server may be a software program running on a computing device, or a cluster of computing devices. The optical label may be offline, i.e., it does not need to communicate with the server; of course, an online optical label capable of communicating with the server is also possible.
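As a sketch of the kind of per-label record the server might maintain, consider the structure below; the field names are hypothetical, and only the kinds of information (ID, position, fixed/mobile status, physical size, orientation) come from the description above.

```python
# Hypothetical per-label record as the server might store it. Field names are
# assumptions; the kinds of information mirror the description (ID, absolute
# and/or relative position, fixed/mobile flag, physical size, orientation).
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class OpticalLabelRecord:
    label_id: str
    absolute_position: Optional[Tuple[float, float, float]] = None   # world coordinates
    relative_positions: List[Tuple[float, float, float, str]] = field(default_factory=list)  # (x, y, z, refID)
    is_fixed: bool = True
    physical_size_m: Optional[float] = None   # used for imaging-formula ranging
    orientation_deg: Optional[float] = None   # facing direction of the label
```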
The position information of an optical label may comprise an absolute position and/or a relative position. The absolute position is the actual position of the optical label in the physical world, which may be indicated, for example, by geographical coordinate information. The relative position of an optical label is its position relative to another optical label. In one example, the relative position may be represented by the spatial displacement of the optical label relative to another optical label, that is, by the position of the optical label in a coordinate system whose origin is at that other optical label (hereinafter also referred to as a reference optical label). For example, the relative position may be represented as (x, y, z: refID), where refID is the identification information of the reference optical label serving as the origin of the coordinate system, and x, y, z are the displacements in the three directions relative to that origin. Preferably, each optical label may have one or more relative positions. For each optical label, its absolute position, its relative position, or both may be stored in the server. When the absolute positions of several optical labels are known, the relative positions between them can be derived; similarly, when the relative position of two optical labels and the absolute position of one of them are known, the absolute position of the other can be derived. The absolute position of an optical label may also be obtained by recursively traversing relative positions. For example, for a given optical label, if the absolute position of one of its reference optical labels has been determined, the absolute position of the optical label can be obtained from its relative position to that reference optical label and the reference optical label's own absolute position. If none of the reference optical labels corresponding to the optical label has a determined absolute position, the relative positions of each reference optical label are traversed in turn, taking each reference optical label as a new starting point; if the absolute position of a reference optical label corresponding to one of those relative positions is known, the absolute position of the starting-point reference optical label can be obtained from that relative position and the known absolute position, and from it the absolute position of the original optical label. This process may be repeated until a determined absolute position is obtained.
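The recursive derivation described above can be sketched as follows (illustrative only, reusing the hypothetical OpticalLabelRecord structure sketched earlier): a label's absolute position is either stored directly or obtained by following a chain of (displacement, reference label) entries until a label with a known absolute position is reached.

```python
# Sketch of the recursive resolution of absolute positions from relative
# positions. `records` maps label_id -> OpticalLabelRecord (hypothetical
# structure sketched above). Returns an (x, y, z) tuple or None.

def resolve_absolute(label_id, records, _visiting=None):
    _visiting = _visiting if _visiting is not None else set()
    if label_id in _visiting:            # guard against reference cycles
        return None
    _visiting.add(label_id)

    rec = records.get(label_id)
    if rec is None:
        return None
    if rec.absolute_position is not None:
        return rec.absolute_position
    for dx, dy, dz, ref_id in rec.relative_positions:
        ref_abs = resolve_absolute(ref_id, records, _visiting)
        if ref_abs is not None:          # reference resolved: add the displacement
            rx, ry, rz = ref_abs
            return (rx + dx, ry + dy, rz + dz)
    return None
```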
In some embodiments, certain optical labels in the optical label network may be designated as reference optical labels, whose associated absolute positions contain precise physical position information. For example, a number of reference optical labels with fixed positions may be laid out in advance, or the physical position information of some optical labels may be acquired by a precise positioning device and those labels designated as reference optical labels. In this way, the remaining optical labels in the optical label network can use these reference optical labels as direct or indirect references and obtain their absolute positions through the recursive process described above.
After the optical label network is constructed, the corresponding service can be obtained by scanning any optical label in the optical label network. For example, a user may scan the optical label through a mobile phone to collect information issued by the optical label and identify identification information of the optical label therefrom, and then may obtain location information of the optical label from a server through the identification information, where the location information may be used for precise positioning and navigation. In positioning and navigation based on optical labels, fixed optical labels are preferably used.
In one embodiment, when the user scans the optical label with the mobile phone, the relative position relationship between the user (more precisely, the mobile phone of the user) and the optical label can be determined by reverse positioning. Based on the relative position relationship between the user and the optical label and the position information of the optical label, the position information of the user can be determined, so that the positioning and navigation of the user can be realized.
Various reverse positioning techniques known in the art may be used to determine the relative positional relationship between the user and the optical label. In one embodiment, the user captures an image of the optical label with an imaging device built into a mobile phone carried on the person, obtains the relative distance between the user and the optical label from the captured image, obtains the current orientation of the phone from a built-in sensor, and derives the relative direction between the user and the optical label from that orientation (preferably refined by further taking into account the position of the optical label within the image); the relative positional relationship between the user and the optical label is then obtained from this relative distance and relative direction. Many imaging devices currently on the market are equipped with a binocular camera or a depth camera; capturing images of the optical label with such a device makes it easy to obtain the relative distance between the imaging device and the optical label. In another embodiment, the physical size of the optical label may be stored in the server in order to determine the relative distance between the user and the optical label. After the user recognizes the identification information of the optical label, the physical size can be obtained from the server using that identification information. From the physical size of the optical label and its imaging size on the user's phone, the relative distance between the user and the optical label can be calculated using the imaging formula (the larger the image, the closer the distance; the smaller the image, the farther the distance). In yet another embodiment, the orientation of the optical label may be stored in the server in order to determine the relative direction between the user and the optical label. After the user recognizes the identification information, the orientation can be obtained from the server, and the relative direction between the user and the optical label can then be calculated from the orientation of the optical label and the perspective distortion of its image on the user's phone.
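The imaging-formula ranging mentioned above (the larger the image, the closer the label) can be sketched with the standard pinhole relation; the focal length and sizes below are hypothetical example values, not parameters of the described system.

```python
# Ranging sketch using the pinhole imaging relation:
#   distance ~ focal_length * physical_size / image_size
# All numeric values are hypothetical examples.

def distance_from_image(physical_size_m, image_size_px, focal_length_px):
    """Estimate the label-to-camera distance (metres) from the label's known
    physical size and its measured size in the captured image."""
    return focal_length_px * physical_size_m / image_size_px

# Example: a 0.30 m wide label imaged 45 px wide by a camera whose focal
# length corresponds to 3000 px
print(distance_from_image(0.30, 45, 3000))   # -> 20.0 metres
```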
In another embodiment, at least two optical labels may be used in reverse positioning, and the position information of the user may be determined by triangulation based on the position information of each optical label and the relative distance of the user from each optical label. It should be noted that when using two optical labels for triangulation, two candidate positions are typically obtained. In this case, it may be necessary to select from these two candidate positions. In one embodiment, one of the candidate locations may be selected in conjunction with positioning information (e.g., GPS information) of the user's handset itself. For example, a candidate location may be selected that is closer to the GPS information. In another embodiment, the orientation information of each optical label or the orientation information of the mobile phone may be further considered to select one of the candidate positions. It will be appreciated that in the case of three or more optical labels, only one candidate position will theoretically be obtained.
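A planar sketch of this two-label case is given below: intersecting the two distance circles generally yields the two candidate positions mentioned above, and a coarse positioning hint (e.g., GPS) then selects one of them. This is an illustrative simplification in a local 2D metric frame, not the full positioning procedure.

```python
# Two-label trilateration sketch in a local 2D frame: each optical label
# contributes a circle (centre = label position, radius = measured distance);
# the intersection gives up to two candidate positions, and a coarse GPS fix
# picks the more plausible one.
import math

def two_label_candidates(p1, r1, p2, r2):
    (x1, y1), (x2, y2) = p1, p2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []                                    # circles do not intersect
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))
    mx, my = x1 + a * (x2 - x1) / d, y1 + a * (y2 - y1) / d
    return [(mx + h * (y2 - y1) / d, my - h * (x2 - x1) / d),
            (mx - h * (y2 - y1) / d, my + h * (x2 - x1) / d)]

def pick_candidate(candidates, gps_hint):
    """Choose the candidate closest to a coarse positioning hint (e.g. GPS)."""
    return min(candidates, key=lambda c: math.hypot(c[0] - gps_hint[0],
                                                    c[1] - gps_hint[1]))
```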
Since the optical label itself has accurate position information, reverse positioning based on optical labels can yield user position information that is more accurate than conventional GPS positioning; in particular, when the user is close to the optical label (for example, when a user walking through a commercial block performs reverse positioning by scanning optical labels arranged on the shops or buildings on either side), the error can be very small.
It should be noted that, although the above description is given by taking a user holding a mobile phone as an example, it is understood that the object performing the reverse positioning or navigation may not be the user, but may be a machine capable of autonomous movement, such as a drone, an unmanned automobile, a robot, and the like. The machine capable of moving autonomously can be provided with image acquisition equipment and can interact with the optical label in a manner similar to a mobile phone, so that the position information of the machine can be obtained.
Based on the optical label network and the reverse positioning it enables, a navigation method based on the optical label network can be realized. Such a navigation method according to an embodiment of the present invention is shown in FIG. 2 and includes: step S1, determining the starting point and the destination point of the navigation object; step S2, providing a planned travel route for the navigation object based on the starting point and the destination point in combination with the optical label network, wherein one or more optical labels are deployed along the travel route; step S3, obtaining, during navigation, position information of the navigation object derived from the position information of an optical label; and step S4, continuing to provide navigation for the navigation object based on the position information of the navigation object.
The individual steps in the above-described method are described in detail below.
In step S1, a start point and a destination point of the navigation object are determined.
The user may manually set the navigation start point (e.g., by manual input or by selecting it on a map), or a current position obtained through a device sensor (e.g., the current position provided by the GPS module in the handset) may be used as the default start point. In one embodiment, the navigation object may obtain its current position, to be used as the navigation start point, by scanning a nearby optical label and combining the position information of that optical label with the reverse positioning result.
Similarly, the user may manually set the navigation destination (e.g., by manual input or by selecting it on a map). In one embodiment, the user may select a destination optical label on the optical label network map as the navigation destination.
As mentioned before, the navigation object may be a user holding the image capturing device or an autonomously movable machine, e.g. a drone, on which the image capturing device is mounted.
In step S2, a planned travel route is provided for the navigation object based on the starting point and the destination point in combination with the optical label network.
After the navigation start point and the navigation destination point are obtained, a planned travel route with one or more optical labels along the route can be provided for the navigation object based on the deployment situation of the optical labels in the optical label network. For example, one or more recommended planned travel routes may be provided for the navigation object based on the deployment of optical labels on each route after several feasible routes have been determined using the starting and destination points. Under otherwise identical conditions, it is preferable to recommend a travel route along which more optical labels are deployed, so as to continuously navigate the navigation object through the optical labels along the route during the travel process.
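The preference expressed above can be sketched as a simple ranking: among the feasible routes, recommend first the route with more optical labels deployed along it, using route length only as a tie-breaker. The data layout and the tie-breaking rule below are assumptions for illustration.

```python
# Route-selection sketch for step S2: prefer, other things being equal, the
# feasible route with more optical labels along it. The dictionary layout and
# the length tie-breaker are illustrative assumptions.

def rank_routes(routes):
    """routes: list of dicts like {'name': 'A', 'length_m': 820, 'label_ids': [...]}.
    Returns routes sorted best-first: more labels first, shorter length next."""
    return sorted(routes, key=lambda r: (-len(r["label_ids"]), r["length_m"]))

routes = [
    {"name": "A", "length_m": 820, "label_ids": ["OL-01", "OL-02"]},
    {"name": "B", "length_m": 860, "label_ids": ["OL-03", "OL-04", "OL-05", "OL-06"]},
]
print(rank_routes(routes)[0]["name"])   # -> "B" (more optical labels along the way)
```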
In step S3, position information of the navigation object, derived from the position information of an optical label, is obtained during navigation.
While the navigation object is traveling, it can scan optical labels along the way to collect the information they publish, identify the identification information of the optical label from that information, and then obtain the position information of the optical label from the server using the identification information. In addition, when the navigation object scans an optical label, its relative positional relationship to that optical label can be determined by reverse positioning. The position information of the navigation object can then be determined from this relative positional relationship and the position information of the optical label. The navigation system or navigation software obtains the position information of the navigation object so that subsequent navigation operations can be performed based on it. It will be appreciated that an optical label scanned by the navigation object during travel need not be one along the originally planned travel route; for example, the user may have deviated from the planned route while traveling. Likewise, the navigation object does not have to scan every optical label along the planned travel route; it can scan selectively based on actual needs, for example scanning a nearby optical label when arriving at an intersection.
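Step S3 ends with a simple composition, sketched below: the navigation object's position is the scanned label's absolute position (from the server) plus the relative displacement obtained by reverse positioning, assuming both are expressed in the same local Cartesian frame (converting the measurement into that frame is outside this sketch).

```python
# Step S3 sketch: compose the navigation object's position from the label's
# absolute position and the relative displacement found by reverse positioning.
# Both tuples are assumed to be expressed in the same (x, y, z) frame.

def navigation_object_position(label_position, relative_displacement):
    lx, ly, lz = label_position
    dx, dy, dz = relative_displacement
    return (lx + dx, ly + dy, lz + dz)

# Hypothetical example: label at (120.0, 35.0, 3.0); the object measured
# 5.7 m west, 5.7 m south and 3.0 m below the label
print(navigation_object_position((120.0, 35.0, 3.0), (-5.7, -5.7, -3.0)))
# -> approximately (114.3, 29.3, 0.0)
```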
In step S4, navigation continues to be provided for the navigation object based on the position information of the navigation object.
After obtaining the position information of the navigation object, the navigation system or navigation software can continue to provide navigation for it. For example, it may determine, based on the position information of the navigation object, whether the object has deviated from the originally planned travel route; if not, navigation continues based on the originally planned route; if it has deviated, a planned travel route is provided to the navigation object again based on its current position information, in combination with the optical label network.
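A minimal sketch of the deviation check in step S4 follows: the navigation object is considered off-route when its position is farther than an assumed tolerance from every segment of the planned route polyline, after which re-planning is triggered. The 15 m tolerance and the replan call are illustrative assumptions.

```python
# Step S4 sketch: deviation detection against the planned route polyline.
# The 15 m tolerance is an assumed value; `replan` stands for the route
# re-planning of step S2 and is only named here for illustration.
import math

def point_segment_distance(p, a, b):
    (px, py), (ax, ay), (bx, by) = p, a, b
    abx, aby = bx - ax, by - ay
    denom = abx * abx + aby * aby
    t = 0.0 if denom == 0 else max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / denom))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def has_deviated(position, route_points, tolerance_m=15.0):
    """True when the position is farther than tolerance_m from every route segment."""
    return all(point_segment_distance(position, a, b) > tolerance_m
               for a, b in zip(route_points, route_points[1:]))

# if has_deviated(object_xy, planned_route_xy):
#     planned_route_xy = replan(object_xy, destination_xy)   # hypothetical call
```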
In one embodiment, the navigation method further includes: indicating how the navigation object travels from one optical label to the next based on the relative positional relationships between the optical labels along the planned travel route. For example, the relative distances and relative directions between the optical labels along the planned travel route may be determined and, optionally combined with actual road conditions, used to give travel instructions to the navigation object; these may be graphical instructions, voice instructions, and so on, in the navigation software or navigation system. In one embodiment, the optical label deployment map can be overlaid on the ordinary map in the navigation software or navigation system so that the navigation object can view it conveniently. In another embodiment, the optical label may be associated in the navigation software or navigation system with the store or building where it is located so as to give an associated prompt to the navigation object, for example the voice prompt: "Proceed forward for 50 meters to reach the next optical label, at the store."
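As an illustration of turning the relative positions of two consecutive labels into such a prompt, the sketch below computes the distance and compass bearing from the current label to the next and formats a text instruction; the wording and the optional store name are assumptions.

```python
# Instruction-generation sketch: distance and compass bearing from the current
# optical label to the next one on the route, optionally naming the store
# associated with the next label. Wording and lookup are illustrative only.
import math

def travel_instruction(current_label_xy, next_label_xy, next_store=None):
    dx = next_label_xy[0] - current_label_xy[0]
    dy = next_label_xy[1] - current_label_xy[1]
    distance = math.hypot(dx, dy)
    bearing = (math.degrees(math.atan2(dx, dy)) + 360) % 360  # 0 deg = north, clockwise
    text = f"Proceed about {distance:.0f} m on bearing {bearing:.0f} deg"
    if next_store:
        text += f" to the optical label at {next_store}"
    return text

print(travel_instruction((0.0, 0.0), (0.0, 50.0), "the store"))
# -> "Proceed about 50 m on bearing 0 deg to the optical label at the store"
```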
The navigation method of the present invention can provide higher accuracy than ordinary GPS navigation and can be used where GPS signals are weak or absent. One scenario in which it is particularly useful is navigation in a busy commercial block or shopping mall: while walking, the user can achieve precise navigation through the optical labels installed at shop entrances or on buildings, whereas GPS navigation often cannot meet the precision required in such situations.
In addition, conventional GPS navigation cannot provide altitude information, which is inconvenient in navigation scenarios that require it (e.g., navigation in a large mall with several floors). The navigation method based on the optical label network of the present application can use the optical labels to provide the user's altitude or floor information, thereby solving this problem.
Another scenario for which the navigation method of the invention is particularly suitable is the navigation of a machine capable of autonomous movement (e.g., a drone), which generally requires relatively high accuracy. When navigating a drone, a certain number of optical labels can be deployed along the drone's planned flight route to navigate its flight. In one embodiment, a number of optical labels may be deployed between city buildings. For example, optical labels may be placed at appropriate intervals along one or both sides of a street, preferably above the street lights and trees, so that they convey information toward the sky for collection by the image capture device mounted on the drone; an optical label may also be mounted directly on top of a street light. In this way, the drone can be navigated precisely through the optical label network, overcoming the poor accuracy of GPS navigation.
In one embodiment, the navigation method based on the optical label network of the present invention may be combined with other existing navigation methods, such as GPS navigation, inertial navigation, or visual navigation. For example, while the navigation object travels from one optical label to the next, GPS navigation and/or inertial navigation and/or visual navigation can be used for assistance, so that the advantages of the various navigation modes are combined and a high-quality navigation service is provided to the user.
In embodiments of the present invention, any optical label (or light source) that can be used to convey information may be used. For example, the present invention may be applied to a light source that transmits information through different stripes based on a rolling shutter effect of CMOS, may also be applied to an optical label as described in patent CN105740936A, may also be applied to various optical labels that can recognize transmitted information through a CCD photo sensor, or may also be applied to an array of optical labels (or light sources).
Reference in the specification to "various embodiments," "some embodiments," "one embodiment," "an embodiment," or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases "in some embodiments," "in one embodiment," "in an embodiment," or the like in various places throughout this specification do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Thus, a particular feature, structure, or characteristic illustrated or described in connection with one embodiment may be combined, in whole or in part, with the features, structures, or characteristics of one or more other embodiments without limitation, provided the combination is not illogical or inoperative.
In the present application, some illustrative operational steps are described in a certain order for clarity of explanation, but those skilled in the art will appreciate that not every one of these steps is essential, and some of them may be omitted or replaced by others. Nor is it necessary that the operations be performed sequentially in the order shown; some of them may be performed in a different order, or in parallel, as needed, provided the new arrangement is not illogical or non-functional.
Although the present invention has been described by way of preferred embodiments, the present invention is not limited to the embodiments described herein, and various changes and modifications may be made without departing from the scope of the present invention.

Claims (11)

1. A navigation method based on an optical label network, the optical label network comprising a server and a plurality of optical labels, each of the plurality of optical labels being operable to communicate its identification information, the server being operable to store the identification information, location information and orientation information of the optical label, wherein the orientation information is used to determine a relative direction of a navigation object and the optical label, the method comprising:
determining a starting point and a destination point of a navigation object, wherein the current position information of the navigation object is used as the starting point;
providing a planned travel route for the navigation object based on the starting point and the destination point and in combination with the optical label network, wherein the travel route is provided with a plurality of optical labels along the way;
obtaining position information of the navigation object obtained based on position information of an optical label and relative position relation of the navigation object relative to the optical label in a navigation process, wherein the relative position relation comprises relative distance and relative direction; and
continuing to provide navigation for the navigation object based on the position information of the navigation object;
wherein the method further comprises:
indicating how the navigation object travels from one optical label to the next based on the relative positional relationships between the optical labels along the planned travel route.
2. The navigation method according to claim 1, wherein the position information of the navigation object is obtained by:
the navigation object collects information transmitted by the optical label through image collection equipment carried by the navigation object so as to obtain identification information of the optical label;
the navigation object obtains the position information and the orientation information of the optical label from the server through the identification information of the optical label;
the navigation object determines the relative position relation between the navigation object and the optical label; and
obtaining the position information of the navigation object based on the relative position relationship between the navigation object and the optical label and the position information of the optical label.
3. A navigation method according to claim 2, wherein said navigation object determines its relative positional relationship to the optical label by reverse localization.
4. The navigation method of any one of claims 1-3, wherein the continuing to provide navigation for the navigation object based on the location information of the navigation object comprises:
determining whether the navigation object has deviated from the planned travel route based on the position information of the navigation object;
if there is no deviation, continuing navigation based on the planned travel route;
if a deviation is detected, providing a planned travel route for the navigation object again based on the position information of the navigation object and the destination point, in combination with the optical label network.
5. The navigation method according to any one of claims 1-3, wherein said providing a planned travel route for the navigation object based on the start point and the destination point in conjunction with the optical label network comprises:
determining a feasible route using the starting point and the destination point; and
providing one or more planned travel routes for the navigation object based on the deployment of optical labels on each feasible route.
6. A navigation method according to any one of claims 1-3, wherein the starting point is obtained by:
the navigation object collects information transmitted by the optical label through image collection equipment carried by the navigation object so as to obtain identification information of the optical label;
the navigation object obtains the position information of the optical label from the server through the identification information of the optical label;
the navigation object determines the relative position relation between the navigation object and the optical label; and
obtaining the position information of the navigation object, as the starting point, based on the relative position relationship between the navigation object and the optical label and the position information of the optical label.
7. The navigation method according to any one of claims 1-3, further comprising:
associating the optical label with the store or building where it is located, so that an associated prompt can be given to the navigation object during navigation.
8. The navigation method according to any one of claims 1-3, further comprising:
one or more of the following navigation modes are combined in the navigation process:
GPS navigation;
inertial navigation; or
visual navigation.
9. The navigation method according to any one of claims 1 to 3, wherein the navigation object is a user holding the image capturing apparatus or an autonomously movable machine on which the image capturing apparatus is mounted.
10. A computing device comprising a processor and a memory, the memory having stored therein a computer program which, when executed by the processor, is operable to implement the navigation method of any one of claims 1-9.
11. A storage medium having stored therein a computer program which, when executed, is operable to implement the navigation method of any one of claims 1-9.
CN201810435184.2A 2018-05-09 2018-05-09 Navigation method based on optical label network and corresponding computing equipment Active CN110470312B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201810435184.2A CN110470312B (en) 2018-05-09 2018-05-09 Navigation method based on optical label network and corresponding computing equipment
PCT/CN2019/085995 WO2019214640A1 (en) 2018-05-09 2019-05-08 Optical label network-based navigation method and corresponding computing device
TW108116062A TWI733107B (en) 2018-05-09 2019-05-09 Navigation method based on optical tag network, corresponding computing equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810435184.2A CN110470312B (en) 2018-05-09 2018-05-09 Navigation method based on optical label network and corresponding computing equipment

Publications (2)

Publication Number Publication Date
CN110470312A CN110470312A (en) 2019-11-19
CN110470312B (en) 2021-11-16

Family

ID=68467123

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810435184.2A Active CN110470312B (en) 2018-05-09 2018-05-09 Navigation method based on optical label network and corresponding computing equipment

Country Status (3)

Country Link
CN (1) CN110470312B (en)
TW (1) TWI733107B (en)
WO (1) WO2019214640A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112558008B (en) * 2019-09-26 2024-03-12 北京外号信息技术有限公司 Navigation method, system, equipment and medium based on optical communication device
CN113610195A (en) * 2020-01-12 2021-11-05 异起(上海)智能科技有限公司 Method and device for marking scene
CN113618746B (en) * 2021-10-13 2022-01-18 季华实验室 Robot navigation control method and device, electronic equipment and storage medium
CN114061588A (en) * 2021-11-16 2022-02-18 郑州轻工业大学 Route determining method and system based on optical communication
CN114333410B (en) * 2021-12-30 2023-03-28 中国联合网络通信集团有限公司 Vehicle searching method, device, terminal, server and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102506854A (en) * 2011-11-16 2012-06-20 华南理工大学 Wireless region traffic pointing system
US8210435B2 (en) * 2008-01-14 2012-07-03 Sky-Trax, Inc. Optical position marker apparatus
CN102901948A (en) * 2012-11-05 2013-01-30 北京半导体照明科技促进中心 Indoor positioning device and system
CN103823204A (en) * 2014-03-10 2014-05-28 北京理工大学 Indoor positioning method based on visible light label
CN105890615A (en) * 2016-04-11 2016-08-24 深圳市轱辘软件开发有限公司 Navigation method and device
CN106802156A (en) * 2017-03-07 2017-06-06 邢霞 A kind of code tag, air navigation aid and map
CN107734449A (en) * 2017-11-09 2018-02-23 陕西外号信息技术有限公司 A kind of outdoor assisted location method, system and equipment based on optical label

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006008807A1 (en) * 2004-07-20 2006-01-26 Navitime Japan Co., Ltd. Route seeking device, route seeking method, and program
CN104567875A (en) * 2014-12-26 2015-04-29 北京理工大学 Indoor hybrid positioning system and method based on inertial positioning and VLC for mobile phone
CN106443583B (en) * 2016-08-30 2020-09-01 西安小光子网络科技有限公司 Regional optical label rapid deployment method based on positioning
CN106441289B (en) * 2016-09-09 2018-09-14 中国地质大学(武汉) A kind of LED indoor 2D localization methods of combination accelerometer
WO2018067544A2 (en) * 2016-10-04 2018-04-12 Wal-Mart Stores, Inc. Systems and methods for autonomous drone navigation
CN107907897B (en) * 2017-10-31 2020-06-02 深圳市汇智伟业信息技术有限公司 Intelligent tunnel navigation device and navigation system based on LIFI

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8210435B2 (en) * 2008-01-14 2012-07-03 Sky-Trax, Inc. Optical position marker apparatus
CN102506854A (en) * 2011-11-16 2012-06-20 华南理工大学 Wireless region traffic pointing system
CN102901948A (en) * 2012-11-05 2013-01-30 北京半导体照明科技促进中心 Indoor positioning device and system
CN103823204A (en) * 2014-03-10 2014-05-28 北京理工大学 Indoor positioning method based on visible light label
CN105890615A (en) * 2016-04-11 2016-08-24 深圳市轱辘软件开发有限公司 Navigation method and device
CN106802156A (en) * 2017-03-07 2017-06-06 邢霞 A kind of code tag, air navigation aid and map
CN107734449A (en) * 2017-11-09 2018-02-23 陕西外号信息技术有限公司 A kind of outdoor assisted location method, system and equipment based on optical label

Also Published As

Publication number Publication date
WO2019214640A1 (en) 2019-11-14
TW201947190A (en) 2019-12-16
TWI733107B (en) 2021-07-11
CN110470312A (en) 2019-11-19

Similar Documents

Publication Publication Date Title
CN110470312B (en) Navigation method based on optical label network and corresponding computing equipment
CN111157009A (en) Indoor positioning navigation creating method and system based on mobile terminal and AR (augmented reality) intelligence
CN108496057B (en) Based on illumination source positioning
CN103162683B (en) For performing the apparatus and method of map match
CN105973236A (en) Indoor positioning or navigation method and device, and map database generation method
CN110471403B (en) Method for guiding an autonomously movable machine by means of an optical communication device
CN103398717A (en) Panoramic map database acquisition system and vision-based positioning and navigating method
CN102829775A (en) Indoor navigation method, systems and equipment
CN109470250A (en) A kind of indoor navigation method and system
CN111026107B (en) Method and system for determining the position of a movable object
KR20160027605A (en) Method for locating indoor position of user device and device for the same
CN109282814A (en) Localization method, device and system, positioning system distribution method, storage medium
WO2021057886A1 (en) Navigation method and system based on optical communication apparatus, and device, and medium
JP2017134834A (en) Computer vision positioning system and method
CN112528699B (en) Method and system for obtaining identification information of devices or users thereof in a scene
CN107036602B (en) Indoor autonomous navigation system and method of hybrid unmanned aerial vehicle based on environment information code
KR20190029412A (en) Method for Providing Off-line Shop Information in Network, and Managing Server Used Therein
CN112689234A (en) Indoor vehicle positioning method and device, computer equipment and storage medium
KR102516450B1 (en) Method of generating map and visual localization system using the map
CN109612455A (en) A kind of indoor orientation method and system
CN112581630B (en) User interaction method and system
KR20180106189A (en) Apparatus of detecting indoor position using lacation map image and method thereof
Jonas et al. IMAGO: Image-guided navigation for visually impaired people
KR101999457B1 (en) Method and apparatus for estimating location using optical camera communication
WO2019214645A1 (en) Method for constructing optical label network, and corresponding optical label network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40013133

Country of ref document: HK

GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20191119

Assignee: Shanghai Guangshi fusion Intelligent Technology Co.,Ltd.

Assignor: BEIJING WHYHOW INFORMATION TECHNOLOGY Co.,Ltd.

Contract record no.: X2022110000047

Denomination of invention: Navigation method and corresponding computing equipment based on optical label network

Granted publication date: 20211116

License type: Common License

Record date: 20221012