Disclosure of Invention
The invention aims to provide a navigation method based on an optical label network.
One aspect of the present invention provides a navigation method based on an optical label network, the optical label network including a server and a plurality of optical labels, each of the plurality of optical labels being capable of communicating identification information thereof, the server being configured to store the identification information and location information of the optical labels, the method including:
determining a starting point and a destination point of a navigation object;
providing a planned travel route for the navigation object based on the starting point and the destination point and in combination with the optical label network, wherein one or more optical labels are provided along the travel route;
obtaining, during navigation, position information of the navigation object determined based on the position information of the optical label; and
continuing to provide navigation for the navigation object based on the position information of the navigation object.
Preferably, the position information of the navigation object is obtained by:
the navigation object collects information transmitted by the optical label through image collection equipment carried by the navigation object so as to obtain identification information of the optical label;
the navigation object obtains the position information of the optical label from the server through the identification information of the optical label;
the navigation object determines the relative positional relationship between itself and the optical label; and
obtaining the position information of the navigation object based on the relative positional relationship between the navigation object and the optical label and the position information of the optical label.
Preferably, the navigation object determines its relative positional relationship to the optical label by reverse positioning.
Preferably, the continuing to provide navigation for the navigation object based on the position information of the navigation object comprises:
determining whether the navigation object has deviated from the planned travel route based on the position information of the navigation object;
if there is no deviation, continuing navigation based on the planned travel route;
if a deviation is detected, providing a new planned travel route for the navigation object based on the position information of the navigation object and the destination point, in combination with the optical label network.
Preferably, the providing a planned travel route for the navigation object based on the start point and the destination point and in combination with the optical label network includes:
determining a feasible route using the starting point and the destination point; and
providing one or more planned travel routes for the navigation object based on the deployment of optical labels along each feasible route.
Preferably, the starting point is obtained by:
the navigation object collects information transmitted by the optical label through image collection equipment carried by the navigation object so as to obtain identification information of the optical label;
the navigation object obtains the position information of the optical label from the server through the identification information of the optical label;
the navigation object determines the relative positional relationship between itself and the optical label; and
obtaining the position information of the navigation object as the starting point, based on the relative positional relationship between the navigation object and the optical label and the position information of the optical label.
Preferably, the navigation method further comprises: indicating how the navigation object travels from one optical label to the next based on the relative positional relationships between a plurality of optical labels along the planned travel route.
Preferably, the navigation method further comprises: associating the optical label with the store or building where it is located, so that an associated prompt can be provided to the navigation object during navigation.
Preferably, the navigation method further comprises: one or more of GPS navigation, inertial navigation, or visual navigation is used in conjunction with the navigation process.
Preferably, the navigation object is a user holding the image capturing apparatus, or a machine capable of autonomous movement on which the image capturing apparatus is installed.
Another aspect of the invention provides a computing device comprising a processor and a memory, the memory having stored therein a computer program which, when executed by the processor, is operable to implement the navigation method described above.
A further aspect of the invention provides a storage medium having a computer program stored therein which, when executed, is operable to implement the navigation method described above.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail by embodiments with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In embodiments of the present invention, the optical label may be any optical communication device capable of transmitting different information by emitting different light. In one embodiment, an optical label may include at least one light source and a controller for controlling the light emitted by the light source so as to convey different information. For example, the controller may cause the light source to emit different light by changing a property of the emitted light. The property may be any property perceptible to an optical imaging device (e.g., a CMOS or CCD imaging device): it may be a property perceptible to the human eye, such as intensity, color, or wavelength, or a property imperceptible to the human eye, such as a change in the intensity, color, or wavelength of electromagnetic radiation outside the visible range, or any combination of these properties. Thus, a change in the property of the light may be a change in a single property or in a combination of two or more properties. When the intensity of the light is selected as the property, the change can be achieved simply by switching the light source on or off.
Various forms of light sources may be used in the optical label, so long as certain properties thereof are perceptible by the optical imaging device. Various common optical devices may be included in the light source, such as light guide plates, diffuser plates, diffusers, and the like. For example, the light source may be one LED lamp, an array of a plurality of LED lamps, a display screen, or a part thereof, and even an irradiation area of light (for example, an irradiation area of light on a wall) may be used as the light source. The shape of the light source may be various shapes such as a circle, sphere, square, rectangle, bar, etc.
In one embodiment, the controller of the optical label may control the properties of the light emitted by each light source in order to convey information. For example, a "0" or a "1" of binary digital information may be represented by controlling the turning on and off of each light source, so that a plurality of light sources in the optical label may be used to represent a sequence of binary digital information. As will be appreciated by those skilled in the art, each light source may be used to represent not only a binary number, but also ternary or higher data. For example, each light source may be enabled to represent ternary or higher data by setting the intensity of light emitted by the light source to be selected from three or more levels, or by setting the color of light emitted by the light source to be selected from three or more colors, or even by employing a combination of intensity and color. Therefore, compared with the traditional two-dimensional code, the optical label can remarkably improve the data coding density.
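The encoding scheme described above can be sketched as follows. This is a minimal, hypothetical illustration: the number of light sources and the function names are assumptions for exposition, not part of the specification. It shows how an identifier can be mapped to per-source on/off states (binary), and how allowing three intensity levels per source enlarges the coding space from 2^n to 3^n.

```python
# Hypothetical sketch: mapping a value onto light-source states.
# NUM_SOURCES and the function names are illustrative assumptions.

NUM_SOURCES = 8  # assumed number of light sources in the optical label

def encode_binary(value: int) -> list[int]:
    """Represent `value` as on/off (1/0) states of NUM_SOURCES light sources."""
    if value >= 2 ** NUM_SOURCES:
        raise ValueError("value exceeds the binary coding space")
    return [(value >> i) & 1 for i in range(NUM_SOURCES - 1, -1, -1)]

def encode_ternary(value: int) -> list[int]:
    """Represent `value` with three intensity levels (0, 1, 2) per source,
    enlarging the coding space from 2**n to 3**n."""
    if value >= 3 ** NUM_SOURCES:
        raise ValueError("value exceeds the ternary coding space")
    digits = []
    for _ in range(NUM_SOURCES):
        digits.append(value % 3)
        value //= 3
    return digits[::-1]

print(encode_binary(0b10110011))  # -> [1, 0, 1, 1, 0, 0, 1, 1]
```

With eight sources, the binary scheme distinguishes 256 values per frame, while the ternary scheme distinguishes 6561, which is the density gain the passage above refers to.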
In yet another embodiment, the controller of the optical label may control the light source to change the properties of the light it emits at a certain frequency, so that the optical label of the present invention can represent different data information, e.g., different sequences of binary digital information, at different times. In this way, when the optical label of the present invention is continuously photographed with an optical imaging device (e.g., at a rate of 30 frames/second), each frame of the image can be used to represent a set of information, further significantly improving the data encoding density compared to the conventional static two-dimensional code.
In embodiments of the present application, the optical label may be imaged by an optical imaging device or image acquisition device commonly known in the art, and the information to be transferred, for example an information sequence of binary data 1 or 0, is determined from each frame of the image, so as to realize information transfer from the optical label to the optical imaging device. The optical imaging device or image acquisition device may comprise an image acquisition element, a processor, a memory, and the like; it may be, for example, a mobile terminal with a shooting function, including a mobile phone, a tablet computer, or smart glasses, which may include an image capturing apparatus and an image processing module. A user finds the optical label with the naked eye within sight distance of it, and scans the optical label by pointing the imaging sensor of the mobile terminal at it, thereby capturing and interpreting the transmitted information. When the controller of the optical label controls the light source to change the property of the emitted light at a certain frequency, the image capturing frequency of the mobile terminal may be set to be greater than or equal to 2 times the property-changing frequency of the light source. The recognition and decoding process is completed by performing a decoding operation on the acquired image frames. In one embodiment, to avoid duplication, omission, etc. of image frames, a sequence number, check bits, a timestamp, and the like may be included in the information conveyed by the optical label.
If desired, a start frame or an end frame, or both, may be provided in the plurality of image frames to indicate the start or end position of a complete cycle of the plurality of image frames, and the start frame or the end frame may be set to display a particular data combination, such as: all 0's or all 1's, or any particular combination that is not identical to the information that may actually be displayed.
Taking a CMOS imaging device as an example, when a continuous multi-frame image of the light source is captured by the CMOS imaging device, the controller switches the operating mode of the light source at a time interval equal to the imaging duration of one complete frame of the CMOS imaging device, thereby achieving frame synchronization between the light source and the imaging device. Assuming that each light source transmits 1 bit of information per frame, each light source can deliver 30 bits of information per second at a shooting speed of 30 frames/second, with a coding space of up to 2^30. The information may include, for example, a start-frame marker (frame header), an ID of an optical label, a password, a verification code, website address information, a timestamp, or various combinations thereof. A data packet structure can be formed by arranging these kinds of information in a defined sequence according to a structuring method. Each time a complete packet structure is received, a complete set of data (one packet) is considered to have been obtained, which can then be read and verified. Table 1 gives an example data packet structure according to one embodiment of the invention:
TABLE 1
Frame header | Attribute field (optional) | Data field | Check bit | Frame end
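The packet handling described above can be sketched in a few lines. This is an illustrative assumption, not the patented format: the header and footer bit patterns, the field widths, and the even-parity check bit are all invented here to make the example concrete, since the specification does not fix them.

```python
# Hypothetical sketch of parsing a packet with the structure of Table 1.
# Marker patterns, field widths, and the parity check are illustrative
# assumptions; the specification does not fix them.

HEADER = [1, 1, 1, 0]   # assumed start-of-frame marker
FOOTER = [0, 1, 1, 1]   # assumed end-of-frame marker

def parse_packet(bits: list[int], data_len: int = 8):
    """Return the data field if the packet is well formed, else None."""
    expected = len(HEADER) + data_len + 1 + len(FOOTER)
    if len(bits) != expected:
        return None                       # wrong length: frame lost or duplicated
    if bits[:4] != HEADER or bits[-4:] != FOOTER:
        return None                       # markers do not match
    data = bits[4:4 + data_len]
    check = bits[4 + data_len]
    if check != sum(data) % 2:            # assumed even-parity check bit
        return None                       # verification failed
    return data
```

A receiver would run this on each candidate bit window; only a window whose markers and check bit agree is accepted as one complete packet.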
Compared with the traditional two-dimensional code, the optical label transmits information by emitting different light, and has the advantages of a long recognition distance, loose requirements on visible-light conditions, strong directivity, and locatability; moreover, the information it transmits can change rapidly over time, so a large information capacity can be provided. The optical label therefore has a stronger information interaction capability, providing great convenience for users and merchants.
In order to provide corresponding services to users and merchants based on optical labels, each optical label may be assigned identification information (ID) that is used by the manufacturer, manager, user, etc. of the optical label to uniquely identify it. Generally, the identification information can be issued by the optical label, and the user can acquire the information (e.g., the identification information) transmitted by the optical label by capturing an image of the optical label with, for example, an image acquisition device or an imaging device built into a mobile phone. A corresponding service can then be accessed based on that information, for example, accessing a web page associated with the identification information of the optical label, or acquiring other information associated with the identification information (e.g., the location information of the corresponding optical label).
Referring now to FIG. 1, shown is a schematic diagram of an optical label network in accordance with one embodiment of the present invention. The optical label network includes a plurality of optical labels and at least one server. Each optical label typically includes at least one light source and a controller that controls the light source to emit light in different patterns to convey different information, which may include, for example, the identification information of the optical label. An optical label may be a fixed optical label or a mobile optical label. A fixed optical label generally refers to an optical label whose position remains substantially unchanged, e.g., an optical label mounted above a store entrance or on a building. A mobile optical label generally refers to an optical label whose position may change at any time, for example, an optical label mounted on a movable object such as a car, or an optical label worn on a person. As shown in FIG. 1, information associated with each optical label may be stored on a server. For example, the identification information (ID) and location information of each optical label may be maintained on the server, optionally together with other information, such as whether the optical label is fixed or mobile, service information associated with the optical label, and other descriptive information or attributes associated with the optical label, such as its physical size and orientation. A server may be a software program running on a computing device, or a cluster of computing devices. The optical label may be offline, i.e., it does not need to communicate with the server; of course, it will be appreciated that an online optical label capable of communicating with the server is also possible.
The position information of an optical label may comprise an absolute position and/or one or more relative positions. An absolute position refers to the actual position of the optical label in the physical world, which may be indicated, for example, by geographical coordinate information. The relative position of an optical label refers to its position relative to another optical label, i.e., its position in a coordinate system whose origin is that other optical label (hereinafter also referred to as a reference optical label). For example, a relative position may be represented as (x, y, z: refID), where refID is the identification information of the reference optical label serving as the origin of the coordinate system, and x, y, z are the displacements in the three coordinate directions relative to that origin. Preferably, each optical label may have one or more relative positions. For each optical label, its absolute position, its relative position(s), or both may be stored in the server. When the absolute positions of a plurality of optical labels are known, the relative positions between them can be deduced. Similarly, when the relative position of two optical labels and the absolute position of one of them are known, the absolute position of the other can be deduced. The absolute position of each optical label may also be obtained by recursively traversing the relative positions of the optical labels.
For example, for a certain optical label, if the absolute position of one of its reference optical labels has been determined, the absolute position of the optical label can be obtained from its relative position with respect to that reference optical label and the absolute position of the reference optical label itself. If none of the reference optical labels corresponding to the optical label has a determined absolute position, the relative positions of each of those reference optical labels are traversed in turn; whenever a reference optical label whose own reference has a known absolute position is found, the absolute position of that reference optical label is obtained from the relative position and the known absolute position, and from it the absolute position of the original optical label is obtained. This process may be repeated until some determined absolute position is reached.
In some embodiments, certain optical labels in the optical label network may be set as reference optical labels whose associated absolute positions contain precise physical position information. For example, a certain number of reference optical labels with fixed positions may be deployed in advance, or the physical position information of some optical labels may be acquired with a precise positioning device and those labels set as reference optical labels. In this way, the remaining optical labels in the optical label network can use these as direct or indirect reference optical labels and then obtain their own absolute positions by the recursive process described above.
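The recursive resolution described above can be sketched as a traversal of the reference graph. The data layout below (a dictionary of known absolute positions and a dictionary of (x, y, z: refID) relative positions) is an illustrative assumption; the essential point is that each resolved position is the reference's absolute position plus the stored displacement, cached for later queries.

```python
# Hypothetical sketch of recursive absolute-position resolution.
# Data shapes are illustrative; positions follow the (x, y, z: refID) idea.

absolute = {"L1": (0.0, 0.0, 0.0)}           # reference label with known absolute position
relative = {                                  # label -> (dx, dy, dz, refID)
    "L2": (10.0, 0.0, 0.0, "L1"),
    "L3": (5.0, 3.0, 0.0, "L2"),
}

def resolve(label_id: str, seen=None):
    """Return the absolute position of label_id, traversing reference labels."""
    if label_id in absolute:
        return absolute[label_id]
    seen = seen or set()
    if label_id in seen:                      # guard against cycles in the reference graph
        return None
    seen.add(label_id)
    dx, dy, dz, ref = relative[label_id]
    base = resolve(ref, seen)                 # recurse toward a known absolute position
    if base is None:
        return None
    pos = (base[0] + dx, base[1] + dy, base[2] + dz)
    absolute[label_id] = pos                  # cache the derived absolute position
    return pos

print(resolve("L3"))  # -> (15.0, 3.0, 0.0)
```

Here L3 is resolved through L2, which is in turn resolved through the reference label L1, mirroring the "direct or indirect reference" chain in the text.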
After the optical label network is constructed, the corresponding service can be obtained by scanning any optical label in the optical label network. For example, a user may scan the optical label through a mobile phone to collect information issued by the optical label and identify identification information of the optical label therefrom, and then may obtain location information of the optical label from a server through the identification information, where the location information may be used for precise positioning and navigation. In positioning and navigation based on optical labels, fixed optical labels are preferably used.
In one embodiment, when the user scans the optical label with the mobile phone, the relative position relationship between the user (more precisely, the mobile phone of the user) and the optical label can be determined by reverse positioning. Based on the relative position relationship between the user and the optical label and the position information of the optical label, the position information of the user can be determined, so that the positioning and navigation of the user can be realized.
Various reverse positioning means known in the art may be used to determine the relative positional relationship between the user and the optical label. In one embodiment, the user may capture images of the optical label using the imaging device built into a mobile phone carried by the user, obtain the relative distance between the user and the optical label from the captured image, obtain the current orientation information of the mobile phone through a built-in sensor, and derive the relative direction between the user and the optical label from that orientation information (preferably, the relative direction can be determined more accurately by further combining the position of the optical label in the image); the relative positional relationship between the user and the optical label can then be obtained from the relative distance and the relative direction. Many imaging devices currently on the market are equipped with a binocular camera or a depth camera; capturing images of the optical label with such a device makes it easy to obtain the relative distance between the imaging device and the optical label. In another embodiment, the physical size information of the optical label may be stored in the server in order to determine the relative distance between the user and the optical label. When the user recognizes the identification information of the optical label, the physical size information can be obtained from the server using that identification information.
Based on the physical size information of the optical label and the imaging size of the optical label on the mobile phone of the user, the relative distance between the user and the optical label can be calculated by using an imaging formula (the larger the imaging is, the closer the distance is; the smaller the imaging is, the farther the distance is). In another embodiment, orientation information of the optical label may be stored in the server in order to determine the relative direction of the user and the optical label. After the user recognizes the identification information of the optical label, the orientation information can be obtained from the server by using the identification information, and then, based on the orientation information of the optical label and the perspective distortion of the image of the optical label on the mobile phone of the user, the relative direction between the user and the optical label can be calculated.
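The imaging-formula estimate above (larger image means closer, smaller image means farther) is the standard pinhole relation distance = f * H / h, where H is the stored physical size of the optical label and h its size on the sensor. The sketch below is a minimal illustration of that relation; the numeric values and parameter names are assumptions, not values from the specification.

```python
# Hedged sketch of the imaging-formula distance estimate: d = f * H / h,
# the pinhole-camera relation. All numeric values are illustrative.

def estimate_distance(focal_length_mm: float,
                      real_height_m: float,
                      image_height_px: float,
                      pixel_size_mm: float) -> float:
    """Distance (m) to the optical label from its imaged size."""
    image_height_mm = image_height_px * pixel_size_mm  # size on the sensor
    return focal_length_mm * real_height_m / image_height_mm

# Example: 4 mm lens, 0.5 m tall label imaged over 100 px of 0.002 mm pixels
d = estimate_distance(4.0, 0.5, 100, 0.002)
print(round(d, 2))  # -> 10.0 (metres)
```

As the passage notes, the same geometry works in reverse: halving the imaged height doubles the estimated distance.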
In another embodiment, at least two optical labels may be used in reverse positioning, and the position information of the user may be determined by triangulation based on the position information of each optical label and the relative distance of the user from each optical label. It should be noted that when using two optical labels for triangulation, two candidate positions are typically obtained. In this case, it may be necessary to select from these two candidate positions. In one embodiment, one of the candidate locations may be selected in conjunction with positioning information (e.g., GPS information) of the user's handset itself. For example, a candidate location may be selected that is closer to the GPS information. In another embodiment, the orientation information of each optical label or the orientation information of the mobile phone may be further considered to select one of the candidate positions. It will be appreciated that in the case of three or more optical labels, only one candidate position will theoretically be obtained.
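The two-label case above can be sketched geometrically: each measured distance defines a circle around an optical label, the two circles typically intersect in two candidate points, and the coarse GPS fix of the handset selects between them. The sketch below is a 2-D simplification offered for illustration; function names and coordinates are assumptions.

```python
# Hypothetical 2-D sketch of triangulation with two optical labels:
# intersect the two distance circles, then pick the candidate nearest
# the handset's coarse GPS fix. Names and values are illustrative.
import math

def two_circle_candidates(p1, r1, p2, r2):
    """Intersection points of circles (p1, r1) and (p2, r2), or [] if none."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    d = math.hypot(dx, dy)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []                                  # circles do not intersect
    a = (r1**2 - r2**2 + d**2) / (2 * d)           # distance from p1 to chord midpoint
    h = math.sqrt(max(r1**2 - a**2, 0.0))          # half-chord length
    mx, my = p1[0] + a * dx / d, p1[1] + a * dy / d
    return [(mx + h * dy / d, my - h * dx / d),
            (mx - h * dy / d, my + h * dx / d)]

def select_by_gps(candidates, gps_fix):
    """Resolve the two-candidate ambiguity with the coarse GPS position."""
    return min(candidates,
               key=lambda c: math.hypot(c[0] - gps_fix[0], c[1] - gps_fix[1]))

cands = two_circle_candidates((0.0, 0.0), 5.0, (8.0, 0.0), 5.0)
print(select_by_gps(cands, (4.0, 2.0)))  # -> (4.0, 3.0)
```

With three or more labels the circles generically intersect at a single point, which is why the text notes that the ambiguity then disappears.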
Since the optical label itself has accurate position information, reverse positioning based on optical labels can yield more accurate user position information than conventional GPS positioning; in particular, when the user is close to the optical label (for example, when the user scans optical labels arranged on the shops or buildings on both sides of a street while walking through a commercial block), the error may be very small.
It should be noted that, although the above description is given by taking a user holding a mobile phone as an example, it is understood that the object performing the reverse positioning or navigation may not be the user, but may be a machine capable of autonomous movement, such as a drone, an unmanned automobile, a robot, and the like. The machine capable of moving autonomously can be provided with image acquisition equipment and can interact with the optical label in a manner similar to a mobile phone, so that the position information of the machine can be obtained.
Based on the optical label network and the reverse positioning function provided by the optical label network, a navigation method based on the optical label network can be realized. An optical label network-based navigation method according to an embodiment of the present invention is shown in fig. 2, and includes: step S1, determining the starting point and the destination point of the navigation object; step S2, based on the starting point and the destination point, and combining the optical label network, providing a planned traveling route for the navigation object, wherein the traveling route has one or more optical labels along the route; step S3, obtaining position information of the navigation object based on the position information of the optical label in the navigation process; and a step S4 of continuing to provide navigation for the navigation object based on the position information of the navigation object.
The individual steps in the above-described method are described in detail below.
In step S1, a start point and a destination point of the navigation object are determined.
The user may manually set a navigation starting point (e.g., by manual input or selection on a map), or set the current location obtained through a device sensor (e.g., the current location provided by a GPS module in the handset) as a default navigation starting point. In one embodiment, the navigation object may obtain its current position as the navigation starting point by scanning a nearby optical label and combining the position information of that optical label with the reverse-positioning result.
Similarly, the user may manually set the navigation destination (e.g., by manual input or selection on a map). In one embodiment, a user may select a destination optical label on a map of the optical label network as the navigation destination.
As mentioned before, the navigation object may be a user holding the image capturing device or an autonomously movable machine, e.g. a drone, on which the image capturing device is mounted.
In step S2, a planned travel route is provided for the navigation object based on the starting point and the destination point, in combination with the optical label network.
After the navigation start point and the navigation destination point are obtained, a planned travel route with one or more optical labels along the route can be provided for the navigation object based on the deployment situation of the optical labels in the optical label network. For example, one or more recommended planned travel routes may be provided for the navigation object based on the deployment of optical labels on each route after several feasible routes have been determined using the starting and destination points. Under otherwise identical conditions, it is preferable to recommend a travel route along which more optical labels are deployed, so as to continuously navigate the navigation object through the optical labels along the route during the travel process.
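The recommendation rule above (all else being equal, prefer the feasible route with more optical labels deployed along it) can be sketched directly. The data shapes below are illustrative assumptions: a list of feasible route identifiers and a mapping from each route to the optical labels deployed on it.

```python
# Illustrative sketch of the route-recommendation rule: rank feasible routes
# by the number of optical labels deployed along them. Data shapes are assumed.

def rank_routes(feasible_routes, labels_on_route):
    """Sort route ids by deployed optical-label count, most labels first.
    labels_on_route: route id -> list of optical label ids on that route."""
    return sorted(feasible_routes,
                  key=lambda r: len(labels_on_route.get(r, [])),
                  reverse=True)

routes = ["A", "B", "C"]
deployment = {"A": ["L1"], "B": ["L2", "L3", "L4"], "C": ["L5", "L6"]}
print(rank_routes(routes, deployment))  # -> ['B', 'C', 'A']
```

A real planner would weigh label coverage against route length and road conditions; the sketch isolates only the label-deployment criterion named in the text.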
In step S3, position information of the navigation object, determined based on the position information of the optical labels, is obtained during navigation.
During travel, the navigation object can collect the information issued by optical labels along the way by scanning them, identify the identification information of each optical label from that information, and then obtain the position information of the optical label from the server using the identification information. In addition, when the navigation object scans an optical label, the relative positional relationship between the navigation object and the optical label can be determined by reverse positioning. Based on this relative positional relationship and the position information of the optical label, the position information of the navigation object can be determined. The navigation system or navigation software can obtain the position information of the navigation object so that subsequent navigation operations can be performed based on it. It will be appreciated that an optical label scanned by the navigation object during travel need not be one along the originally planned travel route; for example, the user may have deviated from the planned route. Also, the navigation object need not scan every optical label along the planned travel route, but can scan selectively based on actual needs, for example, scanning a nearby optical label when arriving at an intersection.
In step S4, navigation continues to be provided for the navigation object based on its position information.
After obtaining the position information of the navigation object, the navigation system or navigation software can continue to provide navigation based on it. For example, it can be determined from the position information whether the navigation object has deviated from the originally planned travel route; if not, navigation continues along the originally planned route; if it has deviated, a new planned travel route can be provided for the navigation object based on its position information, in combination with the optical label network.
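The deviation check in step S4 can be sketched as a simple distance test against the planned route. The 20-metre threshold, the sampled route points, and the function name are illustrative assumptions; a production system would measure distance to route segments rather than sampled points.

```python
# Hedged sketch of the step-S4 deviation check: re-plan if the navigation
# object is farther than a threshold from every point of the planned route.
# Threshold and point sampling are illustrative assumptions.
import math

def has_deviated(position, route_points, threshold_m=20.0):
    """True if `position` is more than threshold_m from every route point."""
    return all(math.hypot(position[0] - p[0], position[1] - p[1]) > threshold_m
               for p in route_points)

route = [(0.0, 0.0), (50.0, 0.0), (100.0, 0.0)]
print(has_deviated((50.0, 5.0), route))   # -> False (still on route, keep navigating)
print(has_deviated((50.0, 60.0), route))  # -> True  (trigger re-planning)
```

When the check returns True, the planner from step S2 is simply re-run with the current position as the new starting point, which is exactly the fallback described above.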
In one embodiment, the navigation method further includes: indicating how the navigation object travels from one optical label to the next based on the relative positional relationships between the optical labels along the planned travel route. For example, the relative distances, relative directions, etc. between optical labels along the planned travel route may be determined and, optionally in combination with actual road conditions, used to provide travel instructions to the navigation object; these may be graphical instructions, voice instructions, etc. in the navigation software or navigation system. In one embodiment, the optical label deployment map can be displayed overlaid on the map in the navigation software or navigation system, for convenient viewing by the navigation object. In another embodiment, the optical label may be associated in the navigation software or navigation system with the store or building where it is located, so as to provide an associated prompt to the navigation object; for example, a voice prompt may be given: "Proceed forward for 50 meters to reach the next optical label at the store."
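Guidance from one optical label to the next, as described above, reduces to computing the distance and direction between consecutive label positions and phrasing an instruction. The sketch below is a 2-D illustration; the bearing convention (0° = +y) and the instruction wording are assumptions, not part of the method.

```python
# Illustrative sketch of per-leg guidance between consecutive optical labels:
# distance plus compass-style bearing, phrased as a travel instruction.
# Bearing convention and wording are assumptions.
import math

def leg_instruction(current, nxt):
    """Distance (m) and bearing (deg, 0 = +y axis) from current to next label."""
    dx, dy = nxt[0] - current[0], nxt[1] - current[1]
    dist = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy)) % 360
    return f"Proceed {dist:.0f} m at bearing {bearing:.0f}° to the next optical label"

print(leg_instruction((0.0, 0.0), (30.0, 40.0)))
# -> "Proceed 50 m at bearing 37° to the next optical label"
```

Chaining this over the labels of the planned route, and substituting the associated store or building name for the bearing, yields prompts of the kind quoted in the text.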
The navigation method of the present invention can provide higher accuracy than ordinary GPS navigation and can be used where GPS signals are weak or absent. One scenario in which the navigation method of the present invention is particularly useful is navigation in a busy commercial block or shopping mall, where GPS navigation often cannot meet the required precision; there, a walking user can achieve precise navigation through optical labels installed at shop doorways or on buildings.
In addition, conventional GPS navigation cannot provide altitude information, which is inconvenient in navigation scenarios requiring it (e.g., navigation in a large mall having several floors). The navigation method based on the optical label network of the present application can use the optical labels to provide the altitude information or floor information of the user, thereby solving the above problem.
Another scenario in which the navigation method of the invention is particularly suitable is the navigation of a machine capable of autonomous movement (e.g., a drone), which generally requires relatively high accuracy. When navigating a drone, a certain number of optical labels can be arranged along the drone's planned flight route to navigate its flight. In one embodiment, a number of optical labels may be disposed between city buildings. For example, optical labels may be placed at appropriate intervals along one or both sides of a street, preferably above the street lights and trees, so that they can convey information toward the sky above for collection by image capture devices mounted on the drone; an optical label may also be mounted directly on top of a street light. In this way, the drone can be accurately navigated through the optical label network, overcoming the poor navigation precision of GPS.
In one embodiment, the navigation method based on the optical label network of the present invention may be combined with other existing navigation methods, such as GPS navigation, inertial navigation, visual navigation, etc. For example, while the navigation object travels from one optical label to the next, GPS navigation and/or inertial navigation and/or visual navigation can be used to assist, so that the advantages of the various navigation modes can be comprehensively utilized to provide a high-quality navigation service for users.
In embodiments of the present invention, any optical label (or light source) that can be used to convey information may be used. For example, the present invention may be applied to a light source that transmits information through different stripes based on the rolling shutter effect of a CMOS sensor, to an optical label as described in patent CN105740936A, to various optical labels whose transmitted information can be recognized through a CCD photosensitive device, or to an array of optical labels (or light sources).
Reference in the specification to "various embodiments," "some embodiments," "one embodiment," or "an embodiment," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases "in some embodiments," "in one embodiment," or "in an embodiment," or the like, in various places throughout this specification are not necessarily referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Thus, a particular feature, structure, or characteristic illustrated or described in connection with one embodiment may be combined, in whole or in part, with a feature, structure, or characteristic of one or more other embodiments without limitation, as long as the combination is not illogical or inoperable.
In the present application, for clarity of explanation, some illustrative operational steps are described in a certain order, but one skilled in the art will appreciate that each of these operational steps is not essential, and some of them may be omitted or replaced by others. It is also not necessary that these operations be performed sequentially in the manner shown; rather, some of these operations may be performed in a different order, or in parallel, as desired, provided that the new implementation is not illogical or non-functional.
Although the present invention has been described by way of preferred embodiments, the present invention is not limited to the embodiments described herein, and various changes and modifications may be made without departing from the scope of the present invention.