CN110942115A - Service providing method and system based on optical label


Info

Publication number
CN110942115A
Authority
CN
China
Prior art keywords
optical label
service
terminal device
optical
light source
Legal status
Pending
Application number
CN201811113147.6A
Other languages
Chinese (zh)
Inventor
李江亮
方俊
牛旭恒
Current Assignee
Beijing Whyhow Information Technology Co Ltd
Original Assignee
Beijing Whyhow Information Technology Co Ltd
Application filed by Beijing Whyhow Information Technology Co Ltd
Priority to CN201811113147.6A
Priority to PCT/CN2019/086001 (published as WO2020062876A1)
Priority to TW108119213A (published as TW202013255A)
Publication of CN110942115A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 17/00 Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K 1/00 - G06K 15/00, e.g. automatic card files incorporating conveying and reading operations
    • G06K 17/0022 Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K 1/00 - G06K 15/00, e.g. automatic card files incorporating conveying and reading operations, arrangements or provisions for transferring data to distant stations, e.g. from a sensing device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/29 Geographical information databases

Abstract

The invention provides a service providing method and system based on optical labels. An optical label associated with a service provider is scanned by a terminal device carried by a user to enter a related service access interface and generate a service request; based on the scan of the optical label, the position information of the terminal device is determined, and the service provider provides the corresponding service according to the determined position information and the service request. Thus, whenever a service demand arises, the user can scan a relevant optical label within the field of view using the carried terminal device to request the service and receive it within a short time, without being limited by fixed addresses or delivery times.

Description

Service providing method and system based on optical label
Technical Field
The present invention relates to location services, and more particularly, to a method and system for providing services based on optical labels.
Background
With the wide popularization of the internet, various industries are trying to develop new service providing modes using internet platforms, and "Internet+" has become a research hotspot. Online shopping, mobile payment based on code scanning, and the like have become service methods generally accepted by the public. However, such services are generally provided based on a preset location: at least one of the user and the service provider must clearly know the other's predetermined location in order to perform accurate service interaction. In online shopping, for example, the user is typically required to provide a fixed delivery address when purchasing, and a delivery time of at least several hours, typically 1-3 days, passes before the user can receive the purchased goods at that address. Such a service providing method cannot satisfy service demands that users generate anytime and anywhere.
Disclosure of Invention
Therefore, an object of the present invention is to provide a new service providing method and system based on optical labels, which can quickly and promptly satisfy service demands that users generate in real time.
The purpose of the invention is realized by the following technical scheme:
in one aspect, the present invention provides a service providing method based on an optical label, including:
s1) acquiring an image of an optical label associated with a service provider through a terminal device carried by a user;
s2) acquiring information related to the service provider based on the acquired optical label image and determining the position of the terminal device relative to the optical label;
s3) generating a service request according to the acquired information related to the service provider, and transmitting the service request and the position information of the terminal device to the corresponding service provider, wherein the position information of the terminal device is determined based on the position of the terminal device relative to the optical label;
s4) providing, by the service provider, the requested service based on the received service request and the position information of the terminal device.
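To make the flow of steps s1)-s4) concrete, the sketch below shows one possible shape of the message that the terminal device could assemble and send to the service provider. It is a minimal illustration, not part of the original disclosure: all field names, the payload layout, and the send_service_request helper are assumptions.

```python
# A minimal sketch of the service request of steps s1)-s4).
# All field names and the transport format are illustrative assumptions.
from dataclasses import dataclass, asdict
import json

@dataclass
class ServiceRequest:
    optical_label_id: str       # decoded from the captured optical label image (s1/s2)
    provider_id: str            # service provider associated with the optical label
    service_item: str           # what the user requests via the service access interface
    relative_position_m: tuple  # terminal position relative to the optical label, meters (s2)

def send_service_request(request: ServiceRequest) -> str:
    """Serialize the request for transmission to the service provider (s3).
    A real system would post this to the provider's endpoint."""
    return json.dumps(asdict(request))

payload = send_service_request(
    ServiceRequest("label-0042", "provider-7", "coffee, 1 cup", (3.2, -1.5, 0.0))
)
print(payload)
```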
In the above method, the optical label may be associated with one or more service providers. The method may further include:
sending the service request to a plurality of service providers that can provide the requested service;
selecting one of the service providers to provide the requested service to the user.
The method may further include: acquiring an image of an optical label near the user's current position with the terminal device, so as to determine the position of the terminal device relative to that optical label, and sending this position to the service provider as new position information; and re-determining, by the service provider, the current position of the terminal device in response to receiving the new position information sent from the terminal device.
In the above method, the location information of the terminal device may include a location of the terminal device relative to the collected optical label and/or a geographic location of the terminal device.
In the method, the geographic location of the terminal device may be determined according to the location of the terminal device relative to the optical label and a pre-calibrated geographic location of the optical label.
In the above method, the position information of the terminal device at step S3) may be the position of the terminal device relative to the captured optical label, and step S4) may further include determining the geographic position of the terminal device according to the received position of the terminal device relative to the optical label and the pre-calibrated geographic position of the optical label, so as to provide the requested service accordingly.
In yet another embodiment, there is provided an optical label-based service providing method including:
s1) acquiring an image of an optical label associated with a service provider through a terminal device carried by a user;
s2) acquiring information related to the service provider based on the acquired optical label image;
s3) generating a service request according to the acquired information related to the service provider, and sending the service request to the corresponding service provider;
s4) acquiring images of one or more optical labels around the user through the terminal device, determining the current position of the terminal device based on the currently acquired images, and sending the current position to the corresponding service provider;
s5) providing, by the service provider, the requested service based on the received service request and the current position.
In the above method, at step S4), the one or more optical labels may include an optical label associated with the service provider or other optical labels near the current location of the user.
In the above method, the current location of the terminal device may include a location of the terminal device relative to the collected optical label and/or a geographical location of the terminal device.
In the above method, the determining of the current position of the terminal device based on the currently acquired image at step S4) may include:
identifying the identification information of the currently captured optical label;
acquiring geographic position information related to the optical label from a preset optical label server based on the identified identification information;
determining the position of the terminal equipment relative to the optical label based on the currently collected optical label image;
and determining the geographic position of the terminal device, as its current position, according to the position of the terminal device relative to the optical label and the obtained geographic position information related to the optical label.
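The last step is a simple coordinate composition: the label's calibrated geographic position (fetched from the optical label server by identification information) plus the terminal's offset relative to the label. A minimal sketch follows; the flat-earth east/north approximation and all names are illustrative assumptions, not the patent's prescribed computation.

```python
# Sketch: derive the terminal's geographic position from (a) the optical label's
# calibrated geographic position and (b) the terminal's position relative to the label.
# Uses a flat-earth approximation valid over the short distances involved (an assumption).
import math

EARTH_RADIUS_M = 6_371_000.0

def terminal_geo_position(label_lat_deg, label_lon_deg, east_m, north_m):
    """east_m/north_m: terminal offset from the label in meters, east/north positive."""
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(label_lat_deg))))
    return label_lat_deg + dlat, label_lon_deg + dlon

# Label calibrated at (39.9042 N, 116.4074 E); terminal 3.2 m east, 1.5 m south of it.
lat, lon = terminal_geo_position(39.9042, 116.4074, east_m=3.2, north_m=-1.5)
print(f"terminal at {lat:.6f}, {lon:.6f}")
```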
In yet another aspect, an optical label-based service providing system is provided, including an optical label client running on a terminal device carried by a user, an optical label associated with a service provider, and a server, wherein:
the optical label client is configured to:
acquiring an image of an optical label associated with a service provider;
acquiring information related to a service provider based on the acquired optical label image and determining the position of the terminal device relative to the optical label;
generating a service request according to the acquired information related to the service provider, and sending the service request and the position of the terminal equipment relative to the optical label to the corresponding service provider;
the server is configured to:
determining the current position of the terminal equipment according to the received position of the terminal equipment relative to the optical label;
providing the requested service based on the received service request and the current location.
In the above system, the optical label client may be further configured to:
acquiring an image of an optical label near the user's current position to determine the position of the terminal device relative to that optical label, and sending this position as a new position to the service provider; and
the server may be further configured to:
the current location of the terminal device is re-determined in response to receiving the new location sent from the terminal device.
In the above system, the server may be further configured to:
sending the request to a plurality of service providers that can provide the requested service;
selecting one of the service providers to provide the requested service to the user.
In yet another embodiment, there is also provided an optical label-based service providing system, including an optical label client running on a terminal device carried by a user, an optical label associated with a service provider, and a server, wherein:
the optical label client is configured to:
acquiring an image of an optical label associated with a service provider;
acquiring information related to a service provider based on the acquired optical label image;
generating a service request according to the acquired information related to the service provider, and sending the service request to the corresponding service provider; and
acquiring images of one or more optical labels around a user, determining the current position of the terminal equipment based on the currently acquired images and sending the current position to a corresponding service provider; and
the server is configured to:
providing the requested service based on the received service request and the current location.
In the above system, the determining, by the optical label client, the current location of the terminal device based on the currently captured image includes:
identifying the identification information of the currently captured optical label;
acquiring geographic position information related to the optical label from a preset optical label server based on the identified identification information;
determining the position of the terminal equipment relative to the optical label based on the currently collected optical label image;
and determining the geographic position of the terminal device, as its current position, according to the position of the terminal device relative to the optical label and the obtained geographic position information related to the optical label.
Compared with the prior art, the invention has the advantages that:
the method provides a high-efficiency and rapid service interaction mode, simplifies the interaction process of shopping consumption and service acquisition of the user, can provide the service for the user in time in a short time, meets the instant service requirement, and enables the user to enjoy the real-time service at any time and any place.
Drawings
Embodiments of the invention are further described below with reference to the accompanying drawings, in which:
FIG. 1 is a schematic diagram of a CMOS imaging device;
FIG. 2 is a schematic diagram of the direction in which a CMOS imaging device acquires an image;
FIG. 3 shows a light source according to one embodiment of the present invention;
FIG. 4 shows a light source according to another embodiment of the present invention;
FIG. 5 is an imaging timing diagram of a CMOS imaging device;
FIG. 6 is another imaging timing diagram of a CMOS imaging device;
FIG. 7 shows imaging on a CMOS imaging device at different stages when the light source is driven in a certain driving mode;
FIG. 8 shows an imaging timing diagram of a CMOS imaging device according to an embodiment of the invention;
FIG. 9 shows an imaging timing diagram of a CMOS imaging device according to an embodiment of the invention;
FIG. 10 shows an imaging timing diagram of a CMOS imaging device according to an embodiment of the invention;
FIG. 11 shows an imaging timing diagram of a CMOS imaging device for implementing stripes different from those of FIG. 10, according to one embodiment of the invention;
FIG. 12 shows an optical label according to one embodiment of the present invention;
FIG. 13 shows an optical label according to another embodiment of the invention;
FIG. 14 shows an imaging timing diagram of a CMOS imaging device according to an embodiment of the invention;
FIG. 15 shows an actual imaging diagram obtained by controlling three light sources in a manner similar to FIG. 14;
FIG. 16 shows an actual image of an optical label using different stripe widths to convey information;
FIG. 17 is a schematic view of an optical label including a positioning mark according to one embodiment of the present invention;
FIG. 18 shows an optical label including a positioning mark, as seen by the naked eye, according to one embodiment of the present invention;
FIG. 19 is a schematic diagram of an optical label network according to an embodiment of the invention;
FIG. 20 is a schematic diagram of the basic principle of the triangulation method;
FIG. 21 is a schematic diagram of the imaging process of an imaging device when capturing an optical label;
FIG. 22 is a simplified relationship diagram between the object coordinate system and the image coordinate system;
FIG. 23 is a flow chart illustrating an optical label-based retail method according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail by embodiments with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The optical label employed in embodiments of the present invention may be an optical communication apparatus capable of conveying different information by emitting different light, such as the optical communication apparatuses described in Chinese patent publications CN104168060A and CN105740936A and patent applications CN201711374915.9, CN201711374042.1, CN201711375274.9, and the like. In one embodiment, an optical label may include at least one light source and a controller, the controller controlling the light emitted by the light source so as to convey different information. For example, the controller may cause the light source to emit different light by changing an attribute of the light it emits. When the light source is operating, it can be imaged using an imaging device or a device equipped with one (such as a mobile phone, a tablet computer, smart glasses, and the like) to obtain the attribute change information of the light source.
In this application, a property of light refers to any property of light that can be recognized by an optical imaging device. It may be a property perceptible to the human eye, such as the intensity, color, or wavelength of light, or a property imperceptible to the human eye, such as the intensity, color, or wavelength of electromagnetic waves outside the visible range, or any combination of the above. Thus, a change in the property of the light may be a change in a single property or in a combination of two or more properties. When the intensity of light is selected as the property, the change can be achieved simply by switching the light source on and off; in this context, emitting no light is treated as one state of the light property. In some embodiments below, for simplicity, the property of the light is changed by turning the light source on or off, but those skilled in the art will appreciate that other ways of changing the property of the light are possible.
Various forms of light sources may be used in the optical label, as long as a certain property thereof perceivable by an optical imaging device can be varied at different frequencies. Various common optical components may be included in the light source, such as a light guide plate, a diffuser plate, a diffuser, and the like. For example, the light source may be one LED lamp, an array of LED lamps, a display screen or a part thereof; even an irradiated area of light (for example, an area of light cast on a wall) may serve as the light source. The light source may take various shapes, such as a circle, a square, a rectangle, a bar, an L-shape, and so on. In a preferred embodiment, the light source may be a two-dimensional array of LED lamps, one dimension of which is longer than the other, preferably in a ratio of about 6-12:1. For example, the LED lamp array may consist of a plurality of LED lamps arranged in a line. When lit, the LED lamp array may appear as a roughly rectangular light source whose operation is controlled by the controller. In another embodiment, the light source need not be a planar light source but may be implemented as a solid light source, for example a bar-shaped cylindrical light source, a cubic light source, a spherical light source, or the like. The light source may be placed on a square or suspended at a position in an indoor location (e.g., a restaurant, a conference room, etc.), so that nearby users in all directions can photograph the light source with a mobile phone to obtain the information it conveys.
When the optical tag is in operation, the controller may change the operation mode of the light source in the optical tag at a certain frequency (e.g., 30 times/second), thereby enabling the optical tag to continuously transmit information to the outside. The controller may control the properties of the light emitted by each light source in order to convey information. For example, a "0" or a "1" of binary digital information may be represented by controlling the turning on and off of each light source, so that a plurality of light sources in the optical label may be used to represent a sequence of binary digital information. As will be appreciated by those skilled in the art, each light source may be used to represent not only a binary number, but also ternary or higher data. For example, each light source may be enabled to represent ternary or higher data by setting the intensity of light emitted by the light source to be selected from three or more levels, or by setting the color of light emitted by the light source to be selected from three or more colors, or even by employing a combination of intensity and color. The controller may control the light source to change the properties of the light it emits at a frequency such that the optical label may represent different data information at different times. Thus, when the optical label of the present invention is continuously photographed using an optical imaging device (e.g., at a rate of 30 frames/second), each frame image thereof can be used to represent a set of information sequences.
To identify the information conveyed by the optical label, various optical imaging devices (e.g., CCD devices, CMOS devices, etc.) may be used to scan the optical label and acquire one or more frames of images of it, so as to identify the information conveyed at the time each frame is captured. The optical imaging device may be integrated in or attached to a terminal device such as a mobile phone, a tablet computer, or smart glasses. For example, when a user spots the optical label with the naked eye within visual range, the optical label can be imaged by the optical imaging device in the mobile terminal carried by the user; that is, the optical label is scanned and the information it carries is captured and interpreted. When the controller of the optical label changes the attribute of the light emitted by the light source at a certain frequency, the image acquisition frequency of the mobile terminal may be set to 2 or more times the attribute change frequency of the light source. The identification and decoding process is completed by performing a decoding operation on the acquired image frames. In one embodiment, to avoid duplication or omission of image frames, sequence numbers, check bits, timestamps, and the like may be included in the information conveyed by the optical label. If desired, a start frame or an end frame, or both, may be provided among the plurality of image frames to indicate the start or end of a complete cycle, and the start or end frame may be set to display a particular data combination, such as all 0s, all 1s, or any particular combination different from the information that may actually be displayed.
Taking a CMOS imaging device as an example, when continuous multi-frame images of the light source are captured by the CMOS imaging device, the controller can control the switching time interval between the operating modes of the light source so that it equals the imaging duration of one complete frame of the CMOS imaging device, thereby achieving frame synchronization between the light source and the imaging device. Assuming that each light source conveys 1 bit of information per frame, each light source can deliver 30 bits of information per second at a shooting speed of 30 frames/second, giving a coding space of up to 2^30. The information may include, for example, a start frame marker (frame header), an ID of the optical label, a password, a verification code, website address information, a timestamp, or various combinations thereof. A data packet structure can be formed by arranging these kinds of information in a defined order according to a structuring method. Each time a complete packet structure is received, a complete set of data (one packet) is considered to have been obtained, which can then be read and checked. Table 1 gives an example data packet structure according to one embodiment of the invention:
TABLE 1
Frame header | Attribute field (optional) | Data field | Check bit | Frame end
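Following the packet structure of Table 1, a receiver can accumulate one decoded bit per frame and validate a complete packet before accepting it. The sketch below fixes concrete, illustrative choices that the text leaves open: an 8-bit header and footer pattern and a single even-parity check bit are assumptions, not part of the disclosure.

```python
# Sketch of packet validation per Table 1: header / optional attribute field /
# data field / check bit / frame end. The field sizes, marker patterns, and
# even-parity check are illustrative assumptions.
HEADER = [1, 1, 1, 1, 0, 0, 0, 0]   # assumed start-frame marker
FOOTER = [0, 0, 0, 0, 1, 1, 1, 1]   # assumed end-frame marker

def parse_packet(bits, data_len=16):
    """Return the data field if the packet is well formed, else None."""
    expected = len(HEADER) + data_len + 1 + len(FOOTER)
    if len(bits) != expected:
        return None
    if bits[:len(HEADER)] != HEADER or bits[-len(FOOTER):] != FOOTER:
        return None
    data = bits[len(HEADER):len(HEADER) + data_len]
    check_bit = bits[len(HEADER) + data_len]
    if sum(data) % 2 != check_bit:   # assumed even-parity check bit
        return None
    return data

bits = HEADER + [1, 0] * 8 + [sum([1, 0] * 8) % 2] + FOOTER
print(parse_packet(bits))  # -> the 16-bit data field
```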
CMOS imaging devices, which are currently widely used in electronic devices, generally use a rolling shutter imaging mode, i.e., individual pixels in a frame of an image are not exposed simultaneously (e.g., pixels are exposed in a line-by-line manner). The present invention advantageously utilizes this non-simultaneous exposure characteristic of the rolling shutter imaging manner, so that when a light source is driven by different driving modes, various different stripe patterns or no stripe patterns can be presented on an image of the light source obtained when the light source is photographed by a rolling shutter imaging device. By analyzing and identifying the pattern in the light source image, the identification of the information transmitted by the optical communication device can be realized.
Fig. 1 shows an example CMOS imaging device, which includes an array of image sensors (also referred to as image sensing cells) and some other elements. Each image sensor in the array corresponds to one pixel. Each column of image sensors is connected to a column amplifier, whose output signal is sent to an A/D converter (ADC) for analog-to-digital conversion and then output via an interface circuit. For any image sensor in the array, it is cleared when its exposure starts, and its signal value is read out after its exposure time ends. CMOS imaging devices typically employ a rolling shutter imaging approach. In a CMOS imaging device, the readout of data is serial, so clear/expose/readout can only be performed sequentially, row by row, in a pipeline-like manner, and the rows are combined into one frame image after all rows of the image sensor array have been processed. The entire CMOS image sensor array is thus effectively exposed row by row (in some cases several rows may be exposed at a time), which results in a small time delay between rows. Due to this small time delay, when the light source flashes at a certain frequency (e.g., by being turned on and off), stripes appear on the image captured by the CMOS imaging device.
When the light source is operating, it may be imaged using a CMOS imaging device or a device equipped with one (e.g., a mobile phone, a tablet computer, smart glasses, etc.), that is, imaged by rolling shutter. Hereinafter, a mobile phone is used as an example of a device with a CMOS imaging device, as shown in fig. 2. The row scan direction of the mobile phone is shown as vertical in fig. 2, but those skilled in the art will appreciate that the row scan direction may also be horizontal, depending on the underlying hardware configuration.
Fig. 3 shows a light source according to an embodiment of the invention. When the light source shown in fig. 3 is imaged using the CMOS imaging device, it is preferable to make the long side of the light source shown in fig. 3 perpendicular or substantially perpendicular to the row direction of the CMOS imaging device (for example, the row scanning direction of the mobile phone shown in fig. 2) so as to image as many stripes as possible under the same other conditions. However, sometimes a user does not know the line scanning direction of his mobile phone, and in order to ensure that the mobile phone can recognize in various postures and can reach the maximum recognition distance in both the portrait screen and the landscape screen, the light source may be a combination of a plurality of rectangles, for example, an L-shaped light source as shown in fig. 4.
Fig. 5 shows an imaging timing diagram of a CMOS imaging device, where each row corresponds to one row of sensors of the device. Imaging each row of a CMOS image sensor array mainly involves two phases: exposure time and readout time. The exposure times of different rows may overlap, but the readout times do not.
Note that only a small number of rows are schematically shown in fig. 5; an actual CMOS imaging device typically has thousands of rows of sensors, depending on its resolution. For example, 1080p resolution has 1920 × 1080 pixels, where 1080 indicates 1080 scan rows and 1920 indicates 1920 pixels per row. At 1080p resolution, the readout time per row is approximately 8.7 microseconds (i.e., 8.7 × 10^-6 seconds).
If the exposure time is too long, so that the exposure times of adjacent rows overlap substantially, the stripes may exhibit obvious transition zones in the image, e.g., multiple rows of pixels with intermediate gray levels between a purely black pixel row and a purely white pixel row. The present invention prefers that the stripes be as sharp as possible, for which the exposure time of each row of the CMOS imaging device (e.g., a mobile phone) may be set or adjusted (e.g., by an APP installed on the phone) to a relatively short value. In a preferred embodiment, the exposure time may be made approximately equal to or less than the readout time of each row. Taking 1080p resolution as an example, the readout time per row is approximately 8.7 microseconds, in which case the exposure time of the phone may be adjusted to approximately 8.7 microseconds or less. Fig. 6 shows the imaging timing diagram of the CMOS imaging device in this case. Here the exposure times of the rows hardly overlap, or overlap less, so that stripes with clearer boundaries, which are easier to recognize, are obtained at imaging. It should be noted that fig. 6 is only a preferred embodiment of the present invention; longer exposure times (e.g., equal to or less than two, three, or four times the readout time per row) or shorter ones are also possible. For example, the readout time per row may be approximately 8.7 microseconds while the exposure time per row is set to 14 microseconds. In addition, in order to present stripes, the duration of one cycle of the driving-mode signal of the light source is preferably set to about twice the exposure duration or longer.
Fig. 7 shows imaging on the CMOS imaging device at different stages when the controller turns the light source on and off at a certain frequency in a certain driving mode. Specifically, the upper part of fig. 7 shows the state of the light source at different stages (white corresponds to the light source being on, black to it being off), and the lower part shows the imaging of the light source on the CMOS imaging device at those stages, where the row direction of the CMOS imaging device is vertical and scanning proceeds from left to right. Since the image is acquired by the CMOS imaging device row by row, when a high-frequency flicker signal is captured, the portion of the resulting frame corresponding to the imaging position of the light source forms stripes as shown in the lower part of fig. 7. Specifically, in period 1 the light source is on, and the leftmost scan rows exposed in this period show a bright stripe; in period 2 the light source is off, and the scan rows exposed in this period show a dark stripe; in period 3 the light source is on, and the scan rows exposed in this period show a bright stripe; in period 4 the light source is off, and the scan rows exposed in this period show a dark stripe.
The controller may adjust the width of the presented stripes by setting, via the driving mode, the frequency at which the light source flashes, or the duration of each turn-on and turn-off of the light source. Longer on or off durations generally correspond to wider stripes. For example, in the case shown in fig. 6, if the duration of each turn-on and turn-off of the light source is set approximately equal to the exposure time of each row of the CMOS imaging device (the exposure time may be set by an APP installed on the mobile phone or manually), stripes only one pixel wide can be presented at imaging. If the duration of each turn-on or turn-off of the light source is set approximately equal to about 2 times the exposure time of each row, stripes about two pixels wide can be realized, as shown in fig. 8, where the upper part of fig. 8 shows the signal waveform of the driving mode of the light source, whose high level may correspond to the light source being on and whose low level to it being off. The signal frequency of the driving mode of fig. 8 may be, for example, 16000 cycles per second (each cycle lasting 62.5 microseconds, with on-duration and off-duration each of about 31.25 microseconds). In the embodiment shown in fig. 8, the duty cycle of the driving-mode signal is set to about 50%, and the exposure time of each row is set approximately equal to the readout time of each row, but those skilled in the art will appreciate that other settings are possible as long as distinguishable stripes can be presented. For simplicity of description, fig. 8 assumes synchronization between the light source and the CMOS imaging device, such that the on and off times of the light source approximately correspond to the start or end times of the exposure duration of a row; but even if the two are not synchronized as in fig. 8, obvious stripes still appear in the image: there may be some transition stripes, but there must be a row exposed entirely while the light source is off (i.e., the darkest stripe) and a row exposed entirely while the light source is on (i.e., the brightest stripe), separated by one pixel. Such a brightness change between pixel rows (i.e., a stripe) can be readily detected (e.g., by comparing the brightness or gray levels of pixels in the imaging area of the light source). Further, even if no row is exposed entirely while the light source is off and no row is exposed entirely while the light source is on, as long as there is a row whose light-source-on portion t1 during its exposure time is less than a certain duration or occupies a small proportion of the whole exposure time (i.e., a darker stripe) and a row whose light-source-on portion t2 is greater than a certain duration or occupies a large proportion of the whole exposure time (i.e., a lighter stripe), with t2 - t1 > a bright-dark stripe difference threshold (e.g., 10 microseconds) or t2/t1 > a bright-dark stripe ratio threshold (e.g., 2), a change in brightness between these pixel rows can still be detected.
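In the image domain, this criterion amounts to comparing the darkest and brightest pixel rows within the light source's imaging area. A minimal sketch follows; the row-mean statistic and the gray-level thresholds are illustrative assumptions standing in for the brightness-domain analogue of the t2 - t1 and t2/t1 tests above.

```python
# Sketch: decide whether the light source's imaging area shows stripes by comparing
# the darkest and brightest pixel rows; thresholds are illustrative assumptions.
import numpy as np

def has_stripes(region, diff_threshold=40.0, ratio_threshold=2.0):
    """region: 2-D array of gray levels for the light source's imaging area,
    with rows along the sensor's row-scan direction."""
    row_means = region.mean(axis=1)  # average brightness of each pixel row
    darkest, brightest = row_means.min(), row_means.max()
    return (brightest - darkest > diff_threshold) or (
        darkest > 0 and brightest / darkest > ratio_threshold
    )

rng = np.random.default_rng(0)
striped = np.tile([[40.0], [200.0]], (8, 32)) + rng.normal(0, 5, (16, 32))
print(has_stripes(striped))  # True: alternating bright/dark rows
```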
The bright-dark stripe difference threshold and ratio threshold are related to the light intensity of the optical label, the properties of the photosensitive device, the shooting distance, and the like. Those skilled in the art will appreciate that other thresholds are possible, as long as a fringe pattern resolvable by a computer is presented.
The stripe pattern recognition method according to an embodiment of the present invention is as follows: obtain an image of the optical label, and segment the imaging area of the light source by projection; collect striped and non-striped pictures under different configurations (e.g., different distances, different light source flicker frequencies, etc.); normalize all collected pictures to a specific size, for example 64 × 16 pixels; extract the features of each pixel (such as brightness, color, etc.) as input features to construct a machine learning classifier; and perform classification to judge whether an image is a striped image or a non-striped image, as sketched below. For stripe recognition, those skilled in the art may also use any other method known in the art, which will not be described in detail.
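A minimal sketch of that pipeline follows, normalizing each segmented region to 64 × 16 pixels and using per-pixel brightness as the features. The nearest-neighbor resize, the synthetic training stand-ins, and the choice of logistic regression are assumptions; the text only calls for "a machine learning classifier".

```python
# Sketch of the stripe/non-stripe classifier: normalize to 64x16, use per-pixel
# brightness as features, train a simple classifier (an assumed choice).
import numpy as np
from sklearn.linear_model import LogisticRegression

H, W = 64, 16

def to_features(image):
    """image: 2-D gray-level array of a segmented light source region."""
    ys = np.linspace(0, image.shape[0] - 1, H).astype(int)  # crude resize by
    xs = np.linspace(0, image.shape[1] - 1, W).astype(int)  # nearest-neighbor sampling
    return image[np.ix_(ys, xs)].ravel() / 255.0

# Synthetic training stand-ins: striped regions vs. uniform regions.
rng = np.random.default_rng(1)
striped = [np.tile([[30.0], [220.0]], (40, 20)) + rng.normal(0, 8, (80, 20)) for _ in range(20)]
uniform = [np.full((80, 20), 120.0) + rng.normal(0, 8, (80, 20)) for _ in range(20)]
X = np.stack([to_features(im) for im in striped + uniform])
y = np.array([1] * 20 + [0] * 20)

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict(X[:1]))  # -> [1], classified as striped
```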
For a bar-shaped light source 5 cm long, when photographed at 1080p resolution by a common mobile phone on the market at a distance of 10 meters (i.e., at 200 times the length of the light source), the light source occupies about 6 pixels in its length direction. If each stripe is 2 pixels wide, at least one distinct stripe appears within that 6-pixel range and can easily be recognized. If a higher resolution is set, or optical zoom is used, the stripes can also be identified at a greater distance, for example at 300 or 400 times the length of the light source.
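The 6-pixel figure can be sanity-checked with a pinhole-camera estimate: the angular size of the light source divided by the angle subtended by one pixel row. The field of view assumed below is a typical phone value, not a number from the text, so the result is order-of-magnitude only.

```python
# Sanity check of the ~6 pixel estimate for a 5 cm light source at 10 m, 1080p.
# The ~65 degree field of view across the 1080-row direction is an assumed
# typical phone value; the result is approximate, not exact.
import math

source_len_m, distance_m = 0.05, 10.0
rows, fov_deg = 1080, 65.0

angle_per_row = math.radians(fov_deg) / rows   # ~1.05 mrad per pixel row
source_angle = source_len_m / distance_m       # ~5 mrad (small-angle approximation)
print(round(source_angle / angle_per_row, 1))  # ~4.8 pixels, same order as the ~6 quoted
```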
The controller may also drive the light source by a different driving mode, for example switching the light source on and off at another frequency. For the case shown in fig. 6, the light source may be configured to turn on and off at least once within the exposure time of each row of the CMOS imaging device, for example 64000 times per second or more. Fig. 9 shows the case where the light source is turned on and off exactly once within the exposure time of each row; the upper part of fig. 9 shows the signal waveform of the driving mode of the light source, whose high level may correspond to the light source being on and whose low level to it being off. Since the light source is turned on and off once in the same manner within the exposure time of every row, the light energy received during each row's exposure is approximately equal, so the pixel rows of the final image of the light source show no obvious brightness difference and no stripes exist. Those skilled in the art will appreciate that higher on-off frequencies are also possible. In addition, for simplicity of description, fig. 9 assumes synchronization between the light source and the CMOS imaging device, so that the turn-on time of the light source approximately corresponds to the start of the exposure duration of a certain row; but even if the two are not synchronized as in fig. 9, there is no significant brightness difference between the pixel rows of the final image of the light source, and thus no stripes.
In another embodiment, when stripes are not desired, direct current may be supplied to the light source so that it emits light of substantially constant intensity, whereby no stripes appear on a frame image of the light source captured by the CMOS image sensor. In this case, approximately the same luminous flux can also be maintained across the different driving modes, to avoid flicker that might be perceived by the human eye when switching between them. It will also be understood that when the light source of the present invention operates continuously in any one driving mode, the human eye perceives no flicker.
While fig. 8 above describes an embodiment in which stripes are presented by varying the intensity of the light emitted by the light source (e.g., by turning it on and off), in another embodiment stripes may also be presented by causing the light source to emit light of different wavelengths or colors, as shown in fig. 10. In the embodiment shown in fig. 10, the light source includes a red lamp emitting red light and a blue lamp emitting blue light. The upper part of fig. 10 shows the signals of the light source driving mode, comprising a red-light driving signal and a blue-light driving signal, where a high level corresponds to the corresponding lamp being on and a low level to it being off. The red and blue driving signals are phase-shifted by 180°, i.e., their levels are opposite. These two signals thus make the light source emit red light and blue light alternately, so that red and blue stripes are presented when the light source is imaged by the CMOS imaging device.
In one embodiment, stripes of different widths may be realized based on different signal frequencies of the light source driving mode. For example, in a first driving mode the light source may operate as shown in fig. 8, realizing a first type of stripe about two pixels wide; in a second driving mode, the durations of the high level and the low level in each cycle of the signal of fig. 8 may each be doubled, e.g., the LED lamp flicker frequency may be set to 8000 times per second (each cycle lasting 125 microseconds, with on-time and off-time each about 62.5 microseconds), thereby realizing a second type of stripe about four pixels wide, as shown in fig. 11; a sketch of this relation follows.
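The rule used by these two driving modes can be written as a small calculation: stripe width in pixel rows is roughly the light source's on (or off) duration divided by the per-row time of the sensor. The 14-microsecond per-row time below is one of the example exposure settings mentioned earlier and is an assumption here; the 2x relationship between the two driving modes holds regardless of its exact value.

```python
# Sketch: stripe width in pixel rows ~= on-duration per flicker cycle / per-row time.
# The 14-microsecond per-row time is an assumed example value from the text above.
def stripe_width_px(flicker_hz, row_time_s=14e-6, duty=0.5):
    on_duration_s = duty / flicker_hz  # on-time within each flicker cycle
    return on_duration_s / row_time_s

print(round(stripe_width_px(16000), 1))  # first driving mode: ~2 pixel rows
print(round(stripe_width_px(8000), 1))   # second driving mode: ~4 pixel rows
```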
In another embodiment, stripes of different colors may be realized. For example, the light source may include a red lamp capable of emitting red light and a blue lamp capable of emitting blue light; in the first driving mode, the blue lamp may be turned off while the red lamp operates as shown in fig. 8, realizing red-black stripes; in the second driving mode, the red lamp may be turned off while the blue lamp operates as shown in fig. 8, realizing blue-black stripes. In the above embodiment, the red-black stripes and the blue-black stripes are realized with the same signal frequency in the first and second driving modes, but it is understood that different signal frequencies may also be used.
In addition, those skilled in the art will understand that more than two types of stripes may be realized; for example, in the embodiment above with a red lamp and a blue lamp, a third driving mode may be provided in which the red and blue lamps are controlled as shown in fig. 10 to realize red-blue stripes. Optionally, a stripe-free pattern may also be realized.
The controller may continuously drive the light source with the corresponding driving mode according to the information to be conveyed over time (for example, the driving mode of the light source is set at a frequency of 30 times/second, that is, every 1/30 second the driving mode is set according to the information to be conveyed), so that the light source continuously conveys information to the outside. To identify the information conveyed by the light source, a CMOS imaging device may be used to scan it and acquire one or more frames of images, so that the information conveyed at the time each frame was captured is identified from the different patterns (e.g., no stripes or various stripe patterns) presented by the light source in each frame.
In the above, for convenience of explanation, the driving mode with the corresponding signal frequency is described by taking a square wave as an example, but those skilled in the art can understand that other waveforms of signals, such as sine waves, triangular waves, etc., can also be used in the driving mode.
In practical application environments, the light emitted by the light source may be affected by ambient lighting conditions, interference, noise, and the like, which may affect the identification of the conveyed information. Therefore, in order to improve identification accuracy, the invention uses a pair of light sources in the optical label that reference each other and cooperate to convey information together. This is very advantageous, because the light sources in the optical label are located at approximately the same place and are subject to the same ambient lighting conditions, interference, and noise; thus, by comparing the imaging of a pair of light sources rather than analyzing the imaging of a single light source in isolation, the accuracy and stability of identification can be improved, which is particularly suitable for long-distance identification in complex environments. For example, when first information is to be conveyed, the controller may set the driving modes of the two light sources to be the same, so that they present the same pattern (e.g., the same stripes) when photographed by a rolling shutter imaging device; when information other than the first information is to be conveyed, the controller may set the driving modes of the two light sources to be different, so that they present different patterns (e.g., different stripes). In this context, the different patterns may be stripes of different widths, stripes of the same width but different positions (due to different phases of the driving modes of the light sources, described in detail later), or stripes differing in at least one of width, position, color, and brightness.
Fig. 12 shows an optical label 100 (also referred to as an optical communication device) comprising two light sources, a first light source 101 and a second light source 102, respectively, according to an embodiment of the invention. The optical label 100 further comprises a controller for driving the first light source 101 and the second light source 102 by the driving mode. The controller may be integrated with the light source in one housing or may be remote from the light source as long as it can control the driving mode of the light source. For simplicity, the controller in optical label 100 is not shown in FIG. 12.
In one embodiment, the controller may drive a light source using a first driving mode and also using a second driving mode, where the first and second driving modes may have the same or different frequencies. If the first light source 101 and the second light source 102 are driven in the same driving mode at a given time, this can be used to convey first information, for example binary data 0; if they are driven in different driving modes at a given time, this can be used to convey second information different from the first, for example binary data 1, as sketched below. In one embodiment, for simplicity, one of the first light source 101 and the second light source 102 may always be driven using the same driving mode.
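The same/different convention for a pair of light sources maps directly to one bit per imaging frame. A minimal sketch, where driving modes are abstract identifiers standing in for the actual driving waveforms, and the fixed mode for light source 101 follows the simplification just mentioned:

```python
# Sketch of the pairwise encoding: same driving mode for the two light sources
# conveys 0, different modes convey 1. Mode identifiers are abstract stand-ins.
def encode_bit(bit, fixed_mode="mode_1"):
    """Return (mode for light source 101, mode for light source 102) for one frame.
    Light source 101 always uses fixed_mode, per the simplification in the text."""
    other = "mode_2" if fixed_mode == "mode_1" else "mode_1"
    return (fixed_mode, fixed_mode if bit == 0 else other)

def decode_bit(mode_101, mode_102):
    return 0 if mode_101 == mode_102 else 1

frames = [encode_bit(b) for b in (0, 1, 1, 0)]
print([decode_bit(*f) for f in frames])  # -> [0, 1, 1, 0]
```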
In one embodiment, when the first light source 101 and the second light source 102 are driven in different driving modes, different information may be further conveyed according to the specific driving modes of the two. For example, when the first light source 101 is driven in the first driving mode and the second light source 102 is driven in the second driving mode, the second information may be externally delivered, and when the first light source 101 is driven in the second driving mode and the second light source 102 is driven in the first driving mode, the third information may be externally delivered.
In one embodiment, the controller may drive the first light source 101 and the second light source 102 through more than two driving modes to increase the encoding density. For example, the controller may drive the first light source 101 and the second light source 102 in a first driving mode, a second driving mode, and a third driving mode. In this case, when the first light source 101 and the second light source 102 are driven in different driving modes, more different information can be delivered according to the specific driving modes of both. For example, information transferred when the first light source 101 is driven in the first driving mode and the second light source 102 is driven in the second driving mode may be different from information transferred when the first light source 101 is driven in the first driving mode and the second light source 102 is driven in the third driving mode.
To increase the encoding density, three or more light sources may be present in the optical label. Fig. 13 shows an optical label 200 comprising three light sources (a first light source 201, a second light source 202 and a third light source 203, respectively) according to an embodiment of the invention. In this embodiment, the controller may drive the light sources by the first driving mode and the second driving mode, and may determine two pairs of light sources accordingly, for example: a first light source 201 and a second light source 202; and a second light source 202 and a third light source 203. For either of the two pairs of light sources, different information may be conveyed depending on whether the pair of light sources are driven in the same driving mode or not. In one embodiment, for simplicity, the same driving mode may always be used to drive the second light source 202 that is common to both pairs of light sources.
In one embodiment, the controller may control the turning on and off of the light sources in a first driving mode having a first frequency and a first phase, and may also control them in a second driving mode having the same first frequency and a second phase different from the first. The first frequency may preferably be a frequency between 15 Hz and 32 kHz, for example 15 Hz, 30 Hz, 50 Hz, 60 Hz, 80 Hz, 100 Hz, 200 Hz, 500 Hz, 1 kHz, 2 kHz, 4 kHz, 6 kHz, 8 kHz, 12 kHz, 16 kHz, 32 kHz, etc. Preferably, the phase difference between the first phase and the second phase is 180° (i.e., the two are in anti-phase).
Fig. 14 shows an imaging timing diagram for the CMOS imager device for the optical label shown in fig. 13. The signals of the driving modes of the three light sources are shown in the upper part of fig. 14, and in this embodiment they may (but need not) have the same amplitude, wherein a high level may for example correspond to the turning on of the light sources and a low level may correspond to the turning off of the light sources, but it will be understood by those skilled in the art that the high level and the low level may also correspond to the brightness of the light sources, i.e. the brightness variation of the light sources is controlled by the amplitude variation of the signals instead of turning on and off the light sources.
In fig. 14, the first light source and the second light source are now used to communicate first information, so the controller drives the first light source and the second light source by the same driving mode (e.g., both the first driving mode or the second driving mode); the second and third light sources are now used to convey the second information, so the controller drives the second and third light sources in two drive modes of the same frequency but 180 ° out of phase (e.g. one in the first drive mode and the other in the second drive mode). In this way, when the CMOS imaging device is used to image the optical label, the stripes with the same width are displayed on the images of the first light source, the second light source and the third light source, but the positions or phases of the stripes on the images of the first light source and the second light source are the same (i.e., the row where the bright stripe of the first light source is located is the same as the row where the bright stripe of the second light source is located, and the row where the dark stripe of the first light source is located is the same as the row where the dark stripe of the second light source is located), and the positions or phases of the stripes on the images of the second light source and the third light source are opposite (i.e., the row where the bright stripe of the second light source is located is the same as the row where the dark stripe of the third light source is located, and the row where the dark stripe of the second light source is located is the same as the row where the bright stripe of the third light source is located).
Fig. 15 shows an actual imaging diagram realized by controlling three light sources in a similar manner to fig. 14. The fringe pattern at the top of fig. 15 is the imaging of the first light source; the middle stripe pattern is an image of the second light source; the bottom stripe pattern is the image of the third light source. The line scanning direction of the CMOS imaging device is here the vertical direction. As shown in fig. 15, the stripe widths of the stripe patterns of the three light sources are the same, but the positions or phases of the stripes on the images of the first light source and the second light source are identical, and the positions or phases of the stripes on the images of the second light source and the third light source are reversed (i.e., in the row scanning direction, the bright stripes and the dark stripes of the second light source correspond to the dark stripes and the bright stripes, respectively, of the third light source).
After the actual imaging diagram shown in fig. 15 is obtained by the CMOS imaging device, it can be identified and decoded. In one embodiment, the strip-shaped imaging region corresponding to each light source may be cut out of the actual image, and each region projected in the vertical direction (i.e., the row scan direction of the CMOS imaging device), yielding three projection vectors: feature_vector[1], feature_vector[2], feature_vector[3]. The correlation coefficients between feature_vector[1] and feature_vector[2], and between feature_vector[2] and feature_vector[3], are then computed, giving the first-second light source correlation coefficient correlation_coefficient[1,2] and the second-third light source correlation coefficient correlation_coefficient[2,3]. Calculated from the actual imaging diagram shown in fig. 15:
correlation_coefficient[1,2] = 0.912746;
correlation_coefficient[2,3] = -0.96256;
From the correlation coefficients it can be determined that the first and second light sources are strongly positively correlated, indicating that they use the same driving mode with the same phase and therefore convey first information, for example binary data 0. It can likewise be determined that the second and third light sources are negatively correlated, indicating that they use two different driving modes with opposite phases and therefore convey second information, for example binary data 1. Thus, decoding the entire actual image yields, for example, the binary data sequence "01", as sketched below. Those skilled in the art will appreciate that other image analysis methods known in the art may also be used to decode the actual image, as long as they can identify the similarities and differences of the stripe patterns.
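The projection-and-correlation decoding just described is a few lines of array arithmetic. A sketch, assuming the three light source regions have already been segmented from the frame; numpy's corrcoef plays the role of the correlation coefficient computation, and the 0.5 decision margin is an assumption.

```python
# Sketch of the decoding in this embodiment: project each light source's imaging
# region along the row-scan direction, then compare adjacent projection vectors by
# correlation coefficient. Strong positive correlation -> same driving mode (bit 0);
# strong negative correlation -> opposite-phase driving modes (bit 1).
import numpy as np

def decode_pairs(regions, threshold=0.5):
    """regions: list of 2-D gray-level arrays, one per light source, rows along
    the sensor's row-scan direction. threshold is an assumed decision margin."""
    vectors = [r.mean(axis=1) for r in regions]  # projection vector per pixel row
    bits = []
    for v1, v2 in zip(vectors, vectors[1:]):
        c = np.corrcoef(v1, v2)[0, 1]
        bits.append(0 if c > threshold else 1 if c < -threshold else None)
    return bits

# Stand-in regions: sources 1 and 2 in phase, source 3 in anti-phase.
stripe = np.tile([[40.0], [200.0]], (8, 10))        # 16 rows x 10 columns
print(decode_pairs([stripe, stripe, stripe[::-1]])) # -> [0, 1], i.e. "01"
```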
Fig. 15 shows a case where the imaging region of each light source accommodates several bright and dark stripes, but those skilled in the art will understand that, when the light sources are driven by two driving modes of the same frequency but 180° out of phase, the imaging region of each light source need not accommodate a plurality of bright or dark stripes, or even one complete bright or dark stripe, because whether the two light source images differ in brightness can be judged from only part of a stripe. This means that the CMOS imaging device can be located farther from the optical label (since a large light source image accommodating multiple bright or dark stripes is not needed), or that the signal frequency of the driving mode can be set lower (a lower frequency corresponds to a wider stripe; since the light source image need not accommodate multiple stripes, or even one complete stripe, a wider stripe, i.e., a driving mode with a lower signal frequency, can be used, for example as low as 15 Hz). In tests, a recognition distance of 400 times the length of the light source was achieved; that is, for a 5 cm long light source arranged on a street, people within 20 m of the light source can recognize the information it conveys with a mobile phone. If zoom techniques or the like are further adopted, an even longer recognition distance can be realized.
While the above description has been made in connection with the optical label 200 of fig. 13, which has three light sources, it will be apparent to those skilled in the art that any number of light sources, two or more, may be used.
In the above embodiment, the phase difference between the first driving mode and the second driving mode is 180° by way of example, but it will be understood that the phase difference is not limited to 180° and may be set to other values, for example 90° or 270°. For example, in one embodiment, the phase of the first driving mode is set 90° ahead of that of the second driving mode, so that the first information is conveyed when the two light sources use the same driving mode, the second information is conveyed when the first light source is driven in the first driving mode and the second light source in the second driving mode, and the third information is conveyed when the second light source is driven in the first driving mode and the first light source in the second driving mode. In another embodiment, the controller may provide more driving modes, each with a different phase. For example, in one embodiment, the phase of the first driving mode is set 90° ahead of that of the second driving mode and 180° ahead of that of the third driving mode, so that the first information is conveyed when the two light sources use the same driving mode, the second information when the first light source is driven in the first driving mode and the second light source in the second driving mode, the third information when the second light source is driven in the first driving mode and the first light source in the second driving mode, and the fourth information when the first light source is driven in the first driving mode and the second light source in the third driving mode (or vice versa).
In one embodiment, the driving modes provided by the controller may have different frequencies, so that stripe patterns with stripes of different widths, or patterns without stripes, appear when the light source is photographed with a CMOS imaging device. For example, the controller may provide several driving modes of different frequencies for the light sources, so that stripe patterns with widths of, for example, 2 pixels, 4 pixels or 8 pixels are presented when the light sources are photographed with the CMOS imaging device, and the information conveyed by the light sources can be recognized by comparing these patterns. For example, if the stripe widths of two light sources are the same, they convey the first information; if the stripe width of one light source is about 2 times that of the other, they convey the second information; if the stripe width of one light source is about 4 times that of the other, they convey the third information; and so on.
Fig. 16 shows an actual image of an optical label that uses different stripe widths to convey information; the line scanning direction of the CMOS imaging device is again vertical. In the image, the top stripe pattern is the image of the first light source, the middle stripe pattern is the image of the second light source, and the bottom stripe pattern is the image of the third light source; the stripe widths of the first and second light sources are the same and are twice the stripe width of the third light source. If the first and second light sources are taken as one pair of mutually referenced, cooperating light sources and the second and third light sources as another such pair, it can be determined that the first and second light sources use the same driving mode, and thus that they convey the first information, e.g., binary data 0; and it can be determined that the second and third light sources use two driving modes of different frequencies (here, the frequency of the driving mode of the third light source is twice that of the second light source, so its stripes are half as wide), and thus that they convey the second information, e.g., binary data 1. Decoding the entire actual image thus yields, for example, the binary data sequence "01".
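Again purely for illustration, stripe-width (i.e., frequency) comparison might be sketched as follows, continuing the Python conventions of the previous sketch; the FFT-based period estimate and the 25% tolerance are assumptions made for the example.

import numpy as np

def dominant_stripe_period(region):
    """Estimate the stripe period (in pixels) of one light source's image."""
    profile = region.sum(axis=1).astype(float)
    profile -= profile.mean()
    spectrum = np.abs(np.fft.rfft(profile))
    k = 1 + np.argmax(spectrum[1:])   # strongest non-DC frequency component
    return len(profile) / k           # stripe period in pixels

def decode_width_pair(region_a, region_b):
    """Map the stripe-period ratio of a referenced pair to a symbol:
    ratio ~1x -> first information (0), ~2x -> second (1), ~4x -> third (2)."""
    ratio = dominant_stripe_period(region_b) / dominant_stripe_period(region_a)
    if ratio < 1.0:
        ratio = 1.0 / ratio           # the order of the pair does not matter
    for symbol, nominal in enumerate((1.0, 2.0, 4.0)):
        if abs(ratio - nominal) / nominal < 0.25:
            return symbol
    raise ValueError("unexpected stripe-width ratio %.2f" % ratio)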
It will be appreciated by those skilled in the art that the various driving modes provided by the controller may use both different frequencies and different phases, so that more information can be represented by different combinations of stripe-width differences and phase differences. In one embodiment, a greater variety of stripe patterns may alternatively or additionally be achieved by also varying the color and/or intensity of the light emitted by the light sources among the driving modes provided by the controller. In fact, the stripes of different kinds of stripe patterns may differ in at least one of width, position, color, brightness and the like, as long as the stripe patterns can be distinguished from one another.
The controller may continuously drive the light sources in the optical label with the respective driving modes according to the information to be conveyed over time (e.g., setting the driving modes of the light sources 30 times per second, that is, setting the driving modes according to the information to be conveyed every 1/30 second), so that the optical label continuously transmits information outward. The optical imaging device can continuously scan the optical label and acquire one or more frames of images of it, so as to recognize the information being conveyed when each frame was shot; this information forms a corresponding information sequence.
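A minimal sketch of such a continuous transmission loop, assuming a hypothetical hardware callback set_light_source_modes() and a phase-coded pair of light sources, might look as follows; the 30 frames-per-second rate follows the example above, and everything else is illustrative.

import itertools
import time

FRAME_RATE = 30   # drive patterns are re-set 30 times per second

def transmit(bits, set_light_source_modes):
    """Feed one information bit to the optical label every 1/30 second."""
    for bit in itertools.cycle(bits):
        if bit == 0:
            set_light_source_modes(("mode_1", "mode_1"))   # same mode: in phase
        else:
            set_light_source_modes(("mode_1", "mode_2"))   # 180° apart: anti-phase
        time.sleep(1.0 / FRAME_RATE)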
In one embodiment, the optical label may additionally include one or more location indicators near the light sources used to convey information, for example a lamp of a particular shape or color, which may remain constantly lit during operation. The location indicator can help a user of the optical imaging device (e.g., a mobile phone) easily find the optical label. In addition, when the optical imaging device is set to a mode for photographing the optical label, the location indicator images distinctly and is easy to identify. Thus, one or more location indicators disposed near the information-conveying light sources also help the mobile phone quickly determine the positions of those light sources, facilitating subsequent image recognition. In one embodiment, during recognition, the location indicator may first be identified in the image, thereby finding the approximate position of the optical label. After the location indicator is identified, one or more regions in the image that encompass the imaging positions of the information-conveying light sources may be determined based on a predetermined relative positional relationship between the location indicator and those light sources. These regions can then be analyzed to determine the information conveyed by the light sources.
Fig. 17 is a schematic diagram of an optical label including location indicators according to one embodiment of the present invention, comprising three horizontally disposed information-conveying light sources 201, 202 and 203, and two vertically disposed location indicator lamps 204 and 205 on either side of them. Through the location indicator lamps and the predetermined relative positional relationship between them and the information-conveying light sources, the imaging regions of the information-conveying light sources can be conveniently determined.
Fig. 18 shows an optical label including location indicators according to one embodiment of the present invention as it appears to the naked eye. In this optical label, the three horizontally arranged information-conveying light sources are transmitting information, and the two vertically arranged location indicator lamps flank them. Viewed with the naked eye, the information-conveying light sources in the optical label look similar to ordinary illumination light sources.
In one embodiment, the optical label may include an ambient light detection circuit for detecting the intensity of the ambient light. The controller may adjust the intensity of the light emitted by a light source when it is turned on based on the detected ambient light intensity, for example increasing it when the ambient light is relatively strong (e.g., in the daytime) and decreasing it when the ambient light is relatively weak (e.g., at night).
The above scheme of the present invention does not require precise analysis of the image of a single light source, but instead compares the images of a pair of mutually referenced, cooperating light sources, and it therefore has extremely strong stability and reliability in actual information transmission. Comparing the images of a pair of light sources, rather than analyzing the image of a single light source, is highly advantageous: the light sources in the optical label are located at approximately the same place and are subject to the same ambient lighting conditions, interference, noise and so on, so the comparison improves the accuracy and stability of recognizing the information conveyed by the light sources, making the scheme particularly suitable for remote and outdoor recognition.
Furthermore, it is a further advantage that, since the above scheme of the present invention obtains the information conveyed by the light sources by comparing the images of a pair of light sources, the image of each light source need not contain a large number of stripes (or, in some cases, even one complete stripe), which further facilitates remote recognition and allows the signal frequency of the driving mode used to generate the stripes in the light source image to be reduced.
Compared with barcodes and two-dimensional codes used for close-range recognition, optical labels convey information by emitting different light, which offers the advantages of long range, loose requirements on visible-light conditions, strong directivity and locatability; moreover, the information conveyed by an optical label can change rapidly over time, providing a large information capacity. The optical label therefore has a stronger information interaction capability, offering great convenience to users and merchants. To provide corresponding services to users and merchants based on optical labels, each optical label may be assigned a unique identifier (ID), used by the manufacturer, manager, user, etc. of the optical label to uniquely identify or distinguish it. The identifier may be published by the optical label, and a user may acquire the information (e.g., the identifier) it conveys by capturing an image of the optical label with, for example, an image acquisition device or the imaging device built into a mobile phone, and thereby access the services provided on the basis of the optical label.
Information or services related to optical labels may be stored or provided on one or more servers. As shown in fig. 19, at least one server may maintain, for each optical label, its identifier (ID), location information, the services associated with it, and other information, such as whether it is stationary or mobile and other descriptive information or attributes (e.g., physical size, orientation). Such servers, together with optical labels distributed at various locations, form an optical label network. Each optical label in the network may be a fixed optical label or a mobile optical label. A fixed optical label generally refers to one whose position remains substantially fixed, for example an optical label installed on a store door head or a building. A mobile optical label generally refers to one whose position may change at any time, for example an optical label mounted on a mobile carrier such as a car, or one worn on a person.
In such an optical label network, the position information of each optical label may comprise an absolute position and/or a relative position. The absolute position is the actual position of the optical label in the physical world, which may be indicated by GPS information, for example. The relative position of an optical label is its position relative to another optical label. In one example, the relative position can be represented by the spatial displacement of the optical label relative to another optical label, that is, its position in a coordinate system whose origin is that other optical label (hereinafter also called the reference optical label); for example, the relative position can be represented as (x, y, z: refID), where refID is the identifier of the reference optical label at the coordinate origin and x, y, z are the displacements in the three directions relative to that origin. Preferably, each optical label may have one or more relative positions. The absolute position of an optical label can then be obtained by recursively traversing relative positions. For a given optical label, if the absolute position of one of its reference optical labels has been determined, its own absolute position can be obtained from that reference label's absolute position and the corresponding relative position. If no reference optical label of the given label has a determined absolute position, the relative positions of each reference optical label are traversed in turn, taking each reference optical label as a new starting point; once a reference label whose absolute position is known is reached, absolute positions can be propagated back along the chain of relative positions until the absolute position of the original optical label is obtained.
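The recursive resolution of absolute positions from relative positions described above can be illustrated with the following Python sketch; the table layout, the common Cartesian world frame and the field names are assumptions made for the example.

def absolute_position(labels, label_id, seen=None):
    """Resolve a label's absolute position by following relative positions.

    `labels` maps each label ID to {"abs": (x, y, z) or None,
    "rel": [(dx, dy, dz, ref_id), ...]}, all in one Cartesian frame.
    """
    seen = seen or set()
    if label_id in seen:                  # guard against reference cycles
        return None
    seen.add(label_id)
    entry = labels[label_id]
    if entry["abs"] is not None:
        return entry["abs"]
    for dx, dy, dz, ref in entry["rel"]:  # traverse each reference label
        base = absolute_position(labels, ref, seen)
        if base is not None:
            return (base[0] + dx, base[1] + dy, base[2] + dz)
    return None                           # no chain reaches a known position

# Example: label "C" is resolved through "B", whose position is known
# relative to "A", whose absolute position has been calibrated.
labels = {
    "A": {"abs": (0.0, 0.0, 0.0), "rel": []},
    "B": {"abs": None, "rel": [(10.0, 0.0, 0.0, "A")]},
    "C": {"abs": None, "rel": [(0.0, 5.0, 0.0, "B")]},
}
print(absolute_position(labels, "C"))   # -> (10.0, 5.0, 0.0)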
As mentioned above, the optical label may transmit its identification information (e.g., ID information) during operation. The terminal device can obtain the ID information by scanning the optical label and then query the server with it to obtain the geographical position information of the optical label. The geographical position of the terminal device scanning the optical label can then be calculated from the geographical position information of the optical label and the relative position between the optical label and the terminal device; that is, the terminal device scanning the optical label is precisely positioned (also referred to as reverse positioning). Various feasible reverse positioning methods can be used to determine the relative positional relationship between the optical label and the terminal device scanning it. For example, the relative distance between the terminal device and the optical label may be obtained from the captured image of the optical label, and the relative position then obtained by combining this distance with the orientation of the terminal device, so that the geographical position of the terminal device can be calculated from this relative positional relationship and the geographical position information of the optical label. Many terminal devices currently on the market are equipped with a binocular camera or a depth camera; when such an imaging device captures an image of the optical label, the relative distance between the terminal device and the optical label can be obtained from the captured image. As another example, when a user captures an image of an optical label with an ordinary camera built into the terminal device, the physical size of the optical label can be obtained from the server according to the recognized ID information, and the relative distance between the terminal device and the optical label can then be obtained from the size of the optical label in the captured image, the focal length parameter at the time of capture, the physical size of the optical label and the like, using the lens object-image formula and the object-image relationship. As yet another example, the relative distance between the terminal device and the optical label can be determined from the imaging size of the optical label or by any application with a ranging function on the mobile phone, and the relative positional relationship between the terminal device and any optical label can be determined by triangulation using two or more optical labels. The relative positional relationship can also be determined by finding the relative distance between the terminal device and the optical label and analyzing the perspective distortion of the optical label's image on the terminal device.
For example, in one embodiment, at least two optical labels may be used for positioning. The following steps may be performed for each optical label:
Step one: the ID information of the optical label is collected using an imaging device.
Step two: the physical size information and the geographical position information of the optical label are obtained by querying with the ID information.
Step three: a picture of the optical label is taken with the default focal length of the imaging device to obtain an optical label image. Because the default focal length may not be appropriate, the captured optical label image may be blurred.
Step four: the focal length of the imaging device is adjusted and optimized to obtain a sharp image of the optical label. For example, one may first try increasing the focal length from the default: if the optical label image becomes sharper, continue increasing it; if it becomes more blurred, adjust in the opposite direction, i.e., decrease the focal length; and vice versa. To judge the sharpness of the optical label image during focal adjustment, texture features can be extracted from the image: the sharper the optical label image, the simpler its texture information and the smaller its texture density. An optimal focal length parameter can therefore be determined from the texture density of the optical label image: when no smaller texture density can be obtained after several iterations, the image with the smallest texture density is taken to be sharp, and the focal length parameter corresponding to that smallest texture density is taken as the optimal focal length parameter.
Step five: a sharp image of the optical label is taken with the optimal focal length parameter, and the relative distance between the imaging device and the optical label is then calculated from the size of this image, the physical size of the optical label and the optimal focal length parameter, using the simple lens object-image formula and the object-image relationship (as illustrated in the sketch below).
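The distance calculation in step five follows directly from the pinhole object-image relation; a minimal sketch, assuming the focal length has been converted to pixel units, is given below (the numbers in the example are illustrative only).

def relative_distance(physical_height_m, image_height_px, focal_length_px):
    """Distance from imaging device to optical label via the object-image
    relation: image_size / focal_length = physical_size / distance."""
    return physical_height_m * focal_length_px / image_height_px

# Example: a 0.05 m light source imaged 12 px tall with f = 3000 px
# gives a relative distance of 0.05 * 3000 / 12 = 12.5 m.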
After obtaining the relative distance between the imaging device and each of the at least two optical labels, the specific location information of the imaging device, i.e., the specific coordinates of the imaging device in the physical world coordinate system, can be determined using triangulation. Fig. 20 is a schematic diagram of a triangulation method, in which two optical labels (optical label 1 and optical label 2) are used for triangulation.
In addition, triangulation with two optical labels typically yields two candidate positions, in which case one of them must be selected. In one embodiment, one of the candidate positions may be selected using the positioning information (e.g., GPS information) of the imaging device (e.g., a mobile phone) itself, for example the candidate closer to the GPS-reported position. In another embodiment, the orientation information of each optical label may further be considered; the orientation actually defines the area from which the optical label can be observed, so one of the candidate positions can be selected on that basis. The orientation information of an optical label can also be stored on the server and queried via the optical label's ID information. The above embodiment uses two optical labels as an example, but those skilled in the art will understand that the triangulation-based method can be applied to three or more optical labels. In fact, using three or more optical labels allows more precise positioning and usually does not produce multiple candidate points.
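For illustration, the plan-view geometry of this triangulation (two circles around the two optical labels, intersected, with the device's coarse GPS fix used to pick one of the two candidates) might be sketched as follows; the two-dimensional simplification is an assumption of the example.

import math

def triangulate(p1, d1, p2, d2):
    """Return the two candidate device positions from label positions
    p1, p2 and the measured relative distances d1, d2 (2-D plan view)."""
    (x1, y1), (x2, y2) = p1, p2
    dx, dy = x2 - x1, y2 - y1
    d = math.hypot(dx, dy)
    if d == 0 or d > d1 + d2 or d < abs(d1 - d2):
        raise ValueError("circles do not intersect; measurements inconsistent")
    a = (d1**2 - d2**2 + d**2) / (2 * d)   # distance from p1 to the chord
    h = math.sqrt(max(d1**2 - a**2, 0.0))  # half-length of the chord
    mx, my = x1 + a * dx / d, y1 + a * dy / d
    return ((mx + h * dy / d, my - h * dx / d),
            (mx - h * dy / d, my + h * dx / d))

def pick_candidate(candidates, gps_fix):
    """Disambiguate with the device's own coarse GPS position."""
    return min(candidates,
               key=lambda c: math.hypot(c[0] - gps_fix[0], c[1] - gps_fix[1]))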
In yet another embodiment, the following reverse positioning method may be employed, which does not require at least two optical labels but can use a single optical label. The method of this embodiment comprises the following steps:
Step one: the ID information of the optical label is collected using an imaging device.
Step two: the geographical position information of the optical label and information about a number of points on the optical label are obtained by querying with the ID information. The latter information is, for example, the positions of the points on the optical label and their coordinate information.
Step three: a picture of the optical label is taken with the default focal length of the imaging device to obtain an optical label image.
Step four: the focal length is adjusted and optimized as described above. For example, the optimal focal length parameter may be determined according to the texture density of the optical label image: when no smaller texture density can be obtained after several iterations, the image with the smallest texture density is taken to be sharp, and the corresponding focal length parameter is taken as the optimal focal length parameter.
Step five: a sharp image of the optical label is taken based on the optimal focal length parameter, and reverse positioning is achieved as described below.
Referring to fig. 21, which is a schematic diagram of the imaging process of an optical label on an imaging device: an object coordinate system (X, Y, Z) is established with the centroid of the optical label as its origin, and an image coordinate system (x, y, z), also called the camera coordinate system, is established with the position F_c of the imaging device as its origin. In addition, a two-dimensional coordinate system (u, v), called the image plane coordinate system, is established in the image plane, taking the upper-left corner of the optical label image acquired by the imaging device as its origin; the intersection of the image plane with the optical axis (i.e., the z axis) is the principal point, whose coordinates in the image plane coordinate system are (c_x, c_y). The coordinates of an arbitrary point P on the optical label are (X, Y, Z) in the object coordinate system; its image point q has coordinates (x, y, z) in the image coordinate system and (u, v) in the image plane coordinate system. During imaging there is not only a displacement but also an angular rotation of the image coordinate system with respect to the object coordinate system, so the relationship between the object coordinate system (X, Y, Z) and the image coordinate system (x, y, z) can be expressed as

$$\begin{pmatrix} x \\ y \\ z \end{pmatrix} = R \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} + t \qquad (1)$$

Defining the variables x' = x/z and y' = y/z, the coordinates in the image plane coordinate system are

$$u = f_x x' + c_x, \qquad v = f_y y' + c_y \qquad (2)$$

where f_x and f_y are the focal lengths of the imaging device in the x-axis and y-axis directions, and (c_x, c_y) are the coordinates of the principal point in the image plane coordinate system; f_x, f_y, c_x and c_y are all internal parameters of the imaging device and can be determined in advance. The rotation matrix R and the displacement vector t represent, respectively, the attitude information of the object coordinate system relative to the image coordinate system (i.e., the attitude of the imaging device relative to the optical label, that is, the deviation of the imaging device's central axis from the optical label, also referred to as the orientation of the imaging device relative to the optical label; for example, R = 0 when the imaging device directly faces the optical label) and the displacement information (i.e., the displacement between the imaging device and the optical label). In three-dimensional space, a rotation can be decomposed into two-dimensional rotations about the respective coordinate axes. If the system is rotated in turn by angles ψ, φ and θ about the x, y and z axes, the total rotation matrix R is the product of the three matrices R_x(ψ), R_y(φ) and R_z(θ), i.e.

$$R = R_x(\psi)\, R_y(\varphi)\, R_z(\theta)$$

where

$$R_x(\psi) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\psi & \sin\psi \\ 0 & -\sin\psi & \cos\psi \end{pmatrix}, \quad R_y(\varphi) = \begin{pmatrix} \cos\varphi & 0 & -\sin\varphi \\ 0 & 1 & 0 \\ \sin\varphi & 0 & \cos\varphi \end{pmatrix}, \quad R_z(\theta) = \begin{pmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

For simplicity, and because it is well known in the art, the calculation is not expanded here; the rotation matrix is simply written in the form

$$R = \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{pmatrix}$$

and the displacement vector t in the form

$$t = \begin{pmatrix} t_1 \\ t_2 \\ t_3 \end{pmatrix}$$

The following relationship is then obtained:

$$s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} r_{11} & r_{12} & r_{13} & t_1 \\ r_{21} & r_{22} & r_{23} & t_2 \\ r_{31} & r_{32} & r_{33} & t_3 \end{pmatrix} \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix} \qquad (3)$$

where s is an object-image conversion factor, equal to the ratio of the size of the image plane to the resolution of the imaging device, and is also known.
The image points, e.g., A', B', C' and D', in the optical label image are determined based on the information about the corresponding points on the optical label (e.g., at least four points A, B, C and D) obtained in step two. These four points A, B, C and D may be, for example, points on the left and right sides of the optical label, or four separate point light sources located at its four corners. The coordinates of the four points, (X_A, Y_A, Z_A), (X_B, Y_B, Z_B), (X_C, Y_C, Z_C) and (X_D, Y_D, Z_D), are also obtained in step two above. By measuring the coordinates (u_{A'}, v_{A'}), (u_{B'}, v_{B'}), (u_{C'}, v_{C'}) and (u_{D'}, v_{D'}) of the corresponding image points A', B', C' and D' in the image plane coordinate system and substituting them into relation (3), the rotation matrix R and the displacement vector t can be solved, giving the relationship between the object coordinate system (X, Y, Z) and the image coordinate system (x, y, z). Based on this relationship, the attitude information and displacement information of the imaging device relative to the optical label can be obtained, and the imaging device can thus be positioned. Fig. 22 shows a simplified representation of the relationship between the object coordinate system and the image coordinate system. Then, using the geographical position information of the optical label obtained in step two, the actual position and attitude of the imaging device can be calculated from the rotation matrix R and the displacement vector t: the position of the imaging device is determined by the displacement vector t, and its attitude relative to the optical label by the rotation matrix R.
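Relation (3) with four or more point correspondences is the classical perspective-n-point (PnP) problem; as a purely illustrative sketch, it could be solved with OpenCV's standard solver as below. The point coordinates and intrinsic parameters are placeholder values, not data from the patent.

import numpy as np
import cv2

# Coordinates of points A, B, C, D in the object coordinate system (metres).
object_points = np.array([[-0.20,  0.05, 0.0],
                          [ 0.20,  0.05, 0.0],
                          [ 0.20, -0.05, 0.0],
                          [-0.20, -0.05, 0.0]], dtype=np.float64)

# Measured image-plane coordinates of A', B', C', D' (pixels).
image_points = np.array([[612.0, 340.0],
                         [885.0, 352.0],
                         [880.0, 431.0],
                         [608.0, 420.0]], dtype=np.float64)

# Intrinsic matrix built from the pre-calibrated f_x, f_y, c_x, c_y.
fx = fy = 3000.0
cx, cy = 960.0, 540.0
K = np.array([[fx, 0.0, cx], [0.0, fy, cy], [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
assert ok
R, _ = cv2.Rodrigues(rvec)   # rotation matrix R: attitude of the device
# tvec is the displacement vector t; combined with the optical label's
# calibrated geographic position, they position and orient the device.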
Fig. 23 shows a flowchart of a service providing method based on an optical label according to an embodiment of the present invention. As shown in fig. 23, the method mainly includes: image-capturing an optical label associated with a service provider by a terminal device carried by a user (S1); acquiring information related to the service provider according to the captured optical label image and determining the position information of the terminal device (S2); generating, by the terminal device, a service request based on the acquired information, and transmitting the service request and the position information of the terminal device to the corresponding service provider (S3); and providing, by the service provider, the requested service to the user based on the received service request and position information (S4).
For a better understanding of the present invention, the retail industry is taken as an example. Suppose the service provider is a retail store: the user scans the optical label associated with the retail store with the imaging device (e.g., a camera) of the terminal device carried with them, thereby entering the store's purchase page, selecting the goods to be purchased, and placing the order, paying and so on; at the same time, the position of the terminal device relative to the retail store is determined based on the scan of the optical label, and the determined position information is transmitted to the retail store along with the order information for stocking and delivery. For example, when a tourist out walking sees a view somewhere and suddenly wants pen and paper to sketch it, the tourist can scan an available optical label in the surroundings with a mobile phone, enter the purchase page of the corresponding retail store, select pen and paper, and perhaps other goods such as water or beverages, and then rest and enjoy the view after placing the order; a store clerk will deliver the ordered goods in a short time. As another example, a parent out walking with a child who suddenly finds there are no diapers left can scan an optical label in the surroundings with a mobile phone, find the corresponding retail store, order diapers through its purchase page, and then stroll nearby with the child for a short while; a clerk will deliver the ordered diapers shortly.
Compared with the traditional retail store operating mode, with the service providing method of this embodiment of the invention a retail store need not provide a goods display area for customers to browse, nor hire a large number of staff to manage goods on shelves and perform operations such as restocking, shelving, tallying and settlement; the store can instead serve as a goods warehouse, using automatic picking or simply leaving a passage for a clerk to pick up goods. The clerks of such a retail store mainly work outside the store, their task being chiefly to pick up goods and deliver them over short distances. Site costs, labor costs and time costs can therefore be significantly reduced. Meanwhile, whenever a user needs to purchase goods, the user can scan a relevant optical label within the field of view with a portable terminal device (such as a mobile phone) to enter the purchase page of the corresponding retail store for purchase and payment, and can then receive the goods delivered by the merchant after waiting a short while, or after continuing to move about within a certain range. It should be understood that the retail industry is used only as an illustrative, not a limiting, example of a service provider; besides the goods delivery described above, any service that can be provided to the user's current location in a short time can be a service provided by the service provider of this embodiment, such as city tour guiding, designated driving, car hailing or printing.
Compared with the traditional online shopping mode, the present invention enables shopping anytime and anywhere based on optical labels and provides a mobile service: the user is not required to know a service access address in advance or to enter much information, such as a preset delivery address, and is no longer constrained by location, being able to receive the service anytime and anywhere. Existing schemes are either complex and require much information to be provided, or are fixed in place and cannot provide mobile service.
With continued reference to fig. 23 and in more detail, at step S1, when the user wishes to obtain certain services, an optical label associated with a service provider in the surroundings may be image-captured by the terminal device carried by the user. In one embodiment, each service provider may be assigned a particular optical label associated with it. For example, an optical label associated with a retail store is typically installed in an area that facilitates image capture by customers, for example outside the retail store, such as on the door head. The optical label can be integrated with the retail store's signboard, so that when the signboard is image-captured with a mobile phone, the information conveyed by the integrated optical label is acquired; the optical label may, of course, also be placed beside the signboard. As another example, the optical label associated with the retail store may be placed on a billboard that customers can easily find; such a billboard need not be mounted on the exterior of the retail store itself, but may be mounted anywhere within a certain area around it that customers can easily find, such as the exterior of a tall building that the retail store is in or near. In this way, when a user finds a service in the field of view that the user wants to access, the service can be accessed by scanning the associated optical label with the terminal device. In yet another embodiment, each optical label may be associated with one or more service providers. For example, the optical label may be associated with a unified service interface through which one or more types of services provided by one or more service providers can be accessed. In this embodiment, such optical labels may be placed at easily discovered or otherwise conspicuous locations in the environment, and the various service providers within a certain surrounding range may register the services they offer with the unified service interface. Thus, a user who wants to access one or more nearby services may only need to scan such an optical label; even a service that is not within the user's field of view can be discovered by scanning the optical label that provides the unified service interface.
In step S2, information related to the service provider is acquired based on the captured optical label image, and the position information of the terminal device is determined. In some embodiments, the optical label conveys its identification information. As mentioned above, the optical label may publish its identification information; the optical label can be image-captured with an imaging device built into or integrated with the user's terminal device, and its identification information obtained by analyzing the captured optical label image. The terminal device then queries, for example, the optical label network server of fig. 19 with the identification information to obtain information on one or more service providers associated with the optical label. In some embodiments, besides the identification information, the optical label may encode and publish more data content, such as the access interface of a service provider associated with it (e.g., the web address of a retail store's purchase page). In this way, the web address of the purchase page of the retail store associated with the optical label can be obtained directly by analyzing the captured optical label image. In embodiments where the optical label is associated with one or more service providers, the information conveyed by the optical label may include, for example, the web address of the unified service interface, through which the user can discover and select the one or more services to be accessed.
In addition, as introduced above, the position information of the terminal device may be determined based on the captured optical label image. Here, the position information may include the relative positional relationship between the terminal device and the optical label and/or the geographical position of the terminal device. For example, a terminal device equipped with a binocular camera or a depth camera captures an image of the optical label, the relative distance between the terminal device and the optical label is obtained from the captured image, and the relative positional relationship is then obtained in combination with the orientation of the terminal device at the time of capture. As another example, when a user captures an image of the optical label with an ordinary camera built into the terminal device, the physical size of the optical label can be obtained from the server according to the recognized ID information; the relative distance between the terminal device and the optical label can then be obtained from the size of the optical label in the captured image, the focal length parameter at the time of capture and the physical size of the optical label, using the lens object-image formula and the object-image relationship, and the relative positional relationship can further be obtained in combination with the orientation of the terminal device at the time of capture. In the optical label network mentioned above, the geographical position of each optical label may be calibrated in advance and stored on the optical label network server; once the geographical position of the optical label is obtained, the geographical position of the terminal device can be derived from it and from the relative positional relationship between the terminal device and the optical label. Although an approximate geographical position of the terminal device can be estimated by GPS positioning, GPS accuracy is on the order of tens of meters, is easily affected by weather conditions or electromagnetic interference, and cannot meet the requirement of precise positioning within a small range. Unlike the approximate position obtained by GPS, the position of the terminal device obtained through the optical labels in its surroundings is more accurate and of higher precision, and includes three-dimensional coordinate information (including height information) as well as the attitude information of the photographing device.
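Once the rotation and displacement relative to the optical label are known, converting to a geographical position is a single frame transformation; a minimal sketch, assuming a local east-north-up world frame in metres and illustrative numbers, is shown below.

import numpy as np

label_pos = np.array([120.0, 45.0, 6.0])   # calibrated label position (world frame)
label_rot = np.eye(3)                      # calibrated label orientation (world frame)

# Displacement of the terminal device relative to the label, from reverse positioning.
t_device = np.array([-3.2, 14.8, -4.5])

device_pos = label_pos + label_rot @ t_device
print(device_pos)   # -> [116.8  59.8   1.5]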
In step S3, a service request is generated based on the acquired information and sent, together with the position information of the terminal device, to the corresponding service provider. For example, when the information conveyed by the optical label includes only identification information, a predetermined server serving the optical label may be further queried with the acquired identification information to obtain information or services related to the optical label, such as the web address of the purchase page of the retail store associated with it and the pre-calibrated geographical position of the optical label. The customer may then enter the purchase page, select the relevant goods and pay, thereby generating the corresponding goods order. If the information conveyed by the optical label already includes the web address of the purchase page of the retail store associated with it, the customer may access that address directly to generate the goods order. The purchase page associated with the retail store can be provided and maintained by a retail management server located at the retail store, which is further configured to receive the generated goods orders and deliver the goods accordingly. The purchase page may also be provided and maintained by another service platform or network platform in the network, from which the retail store's retail management server receives the order information in order to deliver the goods.
In some embodiments where the optical label is associated with one or more service providers, different service providers may provide the same service, and when a user selects a service through a unified service interface provided by the intermediate service platform, the intermediate service platform may select one of the service providers that may provide the service to respond to the user based on one or more of the following: distance between the service provider and the user, number of service requests currently pending by the service provider, service latency projected by the service provider, service cost, and the like. In further examples, the intermediate service platform may also broadcast a service request from a user to multiple service providers that may provide the requested service, select the fastest responding service provider, the service provider with the best cost of service, the service provider with the shortest service latency, or the service provider closest to the user, etc. to provide the corresponding service to the user. Wherein the payment process between the user and the service provider can be realized by utilizing various payment modes or payment platforms in the prior art.
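A toy sketch of such a selection policy is shown below; the weights and record fields are invented for the example and would in practice be tuned by the intermediate service platform.

import math

def choose_provider(providers, user_pos):
    """Pick the provider with the best weighted score over the criteria
    listed above: distance, backlog, expected wait and cost."""
    def score(p):
        dist = math.hypot(p["pos"][0] - user_pos[0], p["pos"][1] - user_pos[1])
        return (1.0 * dist
                + 2.0 * p["pending_requests"]
                + 0.5 * p["expected_wait_min"]
                + 0.1 * p["service_cost"])
    return min(providers, key=score)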
In step S4, the service provider provides the requested service to the user according to the received service request and the position information of the terminal device. When the received position information is the position of the terminal device relative to the optical label, the service provider may determine the geographical position of the terminal device from the geographical position of its associated optical label and the received relative position; the service provider may also determine it from its own geographical position, the relative position between itself and its associated optical label, and the received position of the terminal device relative to the optical label. The geographical position of the terminal device may serve as an estimate of the user's current location, so that the requested service can be provided to that location. Still taking goods retail as an example, a retail store may be provided in the form of an inventory warehouse that automatically stores and retrieves goods, employing a dedicated retail management server to process service requests and provide the corresponding services to users. For example, one or more automatic delivery channels are provided at the gate of the retail store; a customer can enter the purchase page by scanning an optical label in the field of view and generate an order, and upon receiving the goods order, the store's retail management server automatically extracts the corresponding goods from the warehouse and transfers them to one of the automatic delivery channels. After placing the order, the customer walks to the gate of the retail store and can take the purchased goods out of the corresponding delivery channel using the order number, a verification code or the like. In yet another example, for a better shopping experience, the retail management server may determine the approximate location of the ordering customer from the received position of the customer's terminal device relative to the optical label and instruct delivery personnel to bring the ordered goods to the customer. The delivery range of the goods is related to the farthest recognition distance of the optical label, which in turn depends to some extent on the optical label's size; the size of the optical label associated with the retail store can therefore be set according to actual needs so that, as far as possible, the delivery time within the delivery range does not exceed 5 or 10 minutes.
In another embodiment, to facilitate delivery by the delivery personnel, the surroundings of the service provider can be 3D-modeled in advance, and easily recognized landmarks or other markers can be calibrated in the established 3D environment model as environmental features. As mentioned above, the service provider may determine the geographical position of the terminal device from the geographical position of its associated optical label and the received position of the terminal device relative to the optical label. After determining the geographical position of the terminal device, the service provider may extract from the pre-established environment model the environmental features associated with that position and provide them to the delivery personnel, so that they can more easily identify the customer's location.
The user may continue to move within a certain range after sending the service request. In yet another embodiment, therefore, before the user receives the service, the current position of the terminal device may be determined more accurately by having the terminal device image-capture an optical label near the user, and the re-determined current position may be used to enable the corresponding service provider to deliver the service more accurately. The current position of the terminal device may include its position relative to the captured optical label and/or its geographical position, and may be determined using the methods introduced above. For example, when the user moves to a certain place after issuing the service request and wishes to receive the service at the new location, the relative positional relationship between the terminal device and one or more available nearby optical labels can be determined by image-capturing them with the portable terminal device. As described above in connection with the optical label network of fig. 19, the optical label server in the network may be queried with the identification information of the recognized optical labels to obtain their geographical position information, and the geographical position of the terminal device can then be determined from the obtained geographical position information of the one or more optical labels and the relative positional relationship between the terminal device and them, so that the service provider can deliver the service more accurately according to the terminal device's current position.
In yet another embodiment, when the user continues to move after sending the service request, the service provider or the intermediate service platform can identify and track the requesting user in real time by a visual tracking method and provide the user's current location to the corresponding service provider in time for more accurate service. For example, the retail management server may identify and track users in real time with camera devices installed outside the retail store. When a user image-captures the optical label associated with the retail store with a portable terminal device, the position of the terminal device relative to the captured optical label can be determined by the reverse positioning described above, and the approximate position of the terminal device can then be estimated from the pre-calibrated geographical position of the optical label; the retail store's cameras can thus search for and identify the user's terminal device within that local area. When scanning an optical label with a terminal device, a user typically makes a characteristic hand-raising photographing gesture, so this gesture can be detected and recognized by image recognition within the local area around the terminal device's estimated position, thereby determining which customer scanned the optical label at that position to place the order at that moment. The identified person is then tracked with a target tracking method, and the customer's current position is provided in time to the terminal device carried by the delivery personnel of the relevant goods. Thus, even if the ordering user moves within a small range, the current actual position can be obtained accurately, and when delivering, the delivery personnel can accurately determine the ordering customer from the camera's identification and tracking results. The retail management server can provide the tracking results to the terminal device carried by the delivery personnel, so that they know the customer's position in time.
In another embodiment, information such as a delivery countdown and the movement track of the delivery personnel can be displayed on the terminal device carried by the user, so that the user knows the delivery status of the ordered service in time. Still taking goods retail as an example, when delivery personnel take the goods out for delivery, they can use a terminal device they carry to establish a connection with the retail management server, send the order information corresponding to the goods for registration on the server, and then continuously feed back their geographical position to the server in real time. The retail management server may estimate the required delivery time from the geographical positions of the delivery personnel and the customer. Where the customer placed the order directly through a purchase page provided by the retail management server, the server may feed back the real-time position of the delivery personnel and the estimated delivery time to the customer's terminal device through the previously established connection, for display there as the delivery personnel's movement track, a delivery countdown and the like. Where the customer placed the order through a purchase page provided by another service platform or online shopping platform in the network, the retail store's retail management server can feed back the real-time geographical position of the delivery personnel and the estimated delivery time to that platform, which in turn feeds them back to the customer's terminal device for display as the movement track, delivery countdown and the like.
In yet another embodiment, an optical label-based service providing system is also provided, comprising an optical label client running on a terminal device carried by a user, an optical label associated with a service provider, and a server. Wherein the optical label client can perform image acquisition on an optical label associated with a service provider; acquiring information related to a service provider based on the acquired optical label image and determining position information of the terminal equipment; and generating a service request according to the acquired information related to the service provider, and sending the service request and the position information of the terminal equipment to a server associated with the corresponding service provider. A server associated with the service provider may provide the requested service based on the received service request and the location information as introduced above. In another embodiment, the optical label client may further perform image acquisition on an optical label near the current location of the user, and determine the current location of the terminal device based on the currently acquired optical label image; and sending the re-determined current position of the terminal equipment to a server associated with the corresponding service provider. Accordingly, the server may be further configured to more accurately provide the requested service according to the re-determined current location of the terminal device.
The embodiments of the invention provide an efficient and rapid mode of service interaction: they simplify the process by which users shop, consume, and obtain services, allow the requested service to be provided within a short time, satisfy instant service demands, and let users enjoy real-time services anytime and anywhere.
In addition, the applicant has filed a number of other patent applications relating to optical labels, the contents of which are incorporated herein by reference in their entirety: CN2017113749159, CN2017113747312, CN2017113721384, CN2017113740421, CN2017113752749, CN2018104352309, CN201810435207X, CN2018104351838, PCT/CN2017/099642, PCT/CN2017/099645. While the present invention has been described with reference to preferred embodiments, it is not limited to the embodiments described herein, and various changes and modifications may be made without departing from its scope.

Claims (19)

1. An optical label-based service providing method, comprising:
S1) acquiring an image of an optical label associated with a service provider through a terminal device carried by a user;
S2) acquiring information related to the service provider based on the acquired optical label image and determining the position of the terminal device relative to the optical label;
S3) generating a service request according to the acquired information related to the service provider, and sending the service request and the location information of the terminal device to the corresponding service provider, wherein the location information of the terminal device is determined based on the position of the terminal device relative to the optical label;
S4) providing, by the service provider, the requested service based on the received service request and the location information of the terminal device.
2. The method of claim 1, wherein the optical label is associated with one or more service providers.
3. The method of claim 2, further comprising:
sending the service request to a plurality of service providers that can provide the requested service; and
selecting one of the service providers to provide the requested service to the user.
4. The method of claim 1, further comprising, prior to S4):
acquiring, by using the terminal device, an image of an optical label near the current position of the user, to determine the position of the terminal device relative to that optical label, and sending the position as new position information to the service provider;
re-determining, by the service provider, the current location of the terminal device in response to receiving the new position information sent from the terminal device.
5. The method according to any of claims 1-4, wherein the location information of the terminal device comprises the position of the terminal device relative to the collected optical label and/or the geographic location of the terminal device.
6. The method of claim 5, wherein the geographic location of the terminal device is determined based on the position of the terminal device relative to the optical label and a pre-calibrated geographic location of the optical label.
7. The method of claim 1, wherein the location information of the terminal device at step S3) is the position of the terminal device relative to the collected optical label, and wherein step S4) further comprises determining the geographic location of the terminal device based on the received position of the terminal device relative to the optical label and a pre-calibrated geographic location of the optical label, so as to provide the requested service thereto.
8. The method of claim 1, further comprising:
identifying, by the service provider using a camera device, the user who issued the service request based on the position information of the terminal device, and tracking the identified user in real time using a target tracking method;
adjusting the provision of the service requested by the user according to the tracking results.
9. An optical label-based service providing method, comprising:
S1) acquiring an image of an optical label associated with a service provider through a terminal device carried by a user;
S2) acquiring information related to the service provider based on the acquired optical label image;
S3) generating a service request according to the acquired information related to the service provider, and sending the service request to the corresponding service provider;
S4) acquiring images of one or more optical labels around the user through the terminal device, determining the current position of the terminal device based on the currently acquired images, and sending the current position to the corresponding service provider;
S5) providing, by the service provider, the requested service based on the received service request and the current position.
10. The method of claim 9, wherein the optical label is associated with one or more service providers.
11. The method of claim 10, further comprising:
sending the service request to a plurality of service providers that can provide the requested service; and
selecting one of the service providers to provide the requested service to the user.
12. The method according to any of claims 9-11, wherein at step S4) the one or more optical labels comprise the optical label associated with the service provider and/or other optical labels near the current location of the user.
13. The method according to any of claims 9-11, wherein the current location of the terminal device comprises the position of the terminal device relative to the acquired optical label and/or the geographic location of the terminal device.
14. The method according to any of claims 9-11, wherein determining the current position of the terminal device based on the currently acquired image at step S4) comprises:
identifying the identification information of the currently collected optical label;
acquiring geographic position information related to the optical label from a preset optical label server based on the identified identification information;
determining the position of the terminal device relative to the optical label based on the currently collected optical label image; and
determining the geographic position of the terminal device, as the current position of the terminal device, according to the position of the terminal device relative to the optical label and the acquired geographic position information related to the optical label.
15. An optical label-based service providing system comprising an optical label client running on a terminal device carried by a user, an optical label associated with a service provider, and a server, wherein:
the optical label client is configured to:
acquire an image of the optical label associated with the service provider;
acquire information related to the service provider based on the acquired optical label image and determine the position of the terminal device relative to the optical label; and
generate a service request according to the acquired information related to the service provider, and send the service request and the position of the terminal device relative to the optical label to the corresponding service provider; and
the server is configured to:
determine the current position of the terminal device according to the received position of the terminal device relative to the optical label; and
provide the requested service based on the received service request and the current position.
16. The system of claim 15, wherein the optical label client is further configured to:
acquire an image of an optical label near the current position of the user, to determine the position of the terminal device relative to that optical label, and send the position as a new position to the service provider; and
the server is further configured to:
re-determine the current position of the terminal device in response to receiving the new position sent from the terminal device.
17. The system of claim 15 or 16, wherein the server is further configured to:
send the service request to a plurality of service providers that can provide the requested service; and
select one of the service providers to provide the requested service to the user.
18. An optical label-based service providing system comprising an optical label client running on a terminal device carried by a user, an optical label associated with a service provider, and a server, wherein:
the optical label client is configured to:
acquire an image of the optical label associated with the service provider;
acquire information related to the service provider based on the acquired optical label image;
generate a service request according to the acquired information related to the service provider, and send the service request to the corresponding service provider; and
acquire images of one or more optical labels around the user, determine the current position of the terminal device based on the currently acquired images, and send the current position to the corresponding service provider; and
the server is configured to:
provide the requested service based on the received service request and the current position.
19. The system of claim 18, wherein determining, by the optical label client, the current position of the terminal device based on the currently captured image comprises:
identifying the identification information of the currently collected optical label;
acquiring geographic position information related to the optical label from a preset optical label server based on the identified identification information;
determining the position of the terminal device relative to the optical label based on the currently collected optical label image; and
determining the geographic position of the terminal device, as the current position of the terminal device, according to the position of the terminal device relative to the optical label and the acquired geographic position information related to the optical label.
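As a sketch of the position-determination pipeline recited in claims 14 and 19: the identified label ID is used to look up the label's pre-calibrated geographic position, and the terminal's position relative to the label is then applied as a local east/north offset. The lookup table and the flat-earth offset conversion below are illustrative assumptions; over the short distances involved, the approximation is adequate.

    import math

    EARTH_RADIUS_M = 6_371_000.0

    # Stand-in for the optical label server query of the second step (assumed data).
    LABEL_GEO = {"label-42": (39.9042, 116.4074)}  # label ID -> calibrated (lat, lon)

    def terminal_geo_position(label_id, east_m, north_m):
        """Shift the label's calibrated (lat, lon) by the terminal's local offset."""
        lat, lon = LABEL_GEO[label_id]
        dlat = math.degrees(north_m / EARTH_RADIUS_M)
        dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(lat))))
        return lat + dlat, lon + dlon

The result is the terminal's geographic position, which serves as the current position of the terminal device in the methods and systems claimed above.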
CN201811113147.6A 2018-09-25 2018-09-25 Service providing method and system based on optical label Pending CN110942115A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201811113147.6A CN110942115A (en) 2018-09-25 2018-09-25 Service providing method and system based on optical label
PCT/CN2019/086001 WO2020062876A1 (en) 2018-09-25 2019-05-08 Service provision method and system based on optical label
TW108119213A TW202013255A (en) 2018-09-25 2019-06-03 Service provision method and system based on optical label

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811113147.6A CN110942115A (en) 2018-09-25 2018-09-25 Service providing method and system based on optical label

Publications (1)

Publication Number Publication Date
CN110942115A (en) 2020-03-31

Family

ID=69905507

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811113147.6A Pending CN110942115A (en) 2018-09-25 2018-09-25 Service providing method and system based on optical label

Country Status (3)

Country Link
CN (1) CN110942115A (en)
TW (1) TW202013255A (en)
WO (1) WO2020062876A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI741588B (en) * 2020-05-05 2021-10-01 光時代科技有限公司 Optical communication device recognition method, electric device, and computer readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6056199A (en) * 1995-09-25 2000-05-02 Intermec Ip Corporation Method and apparatus for storing and reading data
GB2380883A (en) * 2001-06-20 2003-04-16 Roke Manor Research Location and identification of participants in a sporting event by means of optically readable tags
CN105718840A (en) * 2016-01-27 2016-06-29 西安小光子网络科技有限公司 Optical label based information interaction system and method
CN106372701A (en) * 2016-08-30 2017-02-01 西安小光子网络科技有限公司 Optical label coding and identification method
CN107734449A (en) * 2017-11-09 2018-02-23 陕西外号信息技术有限公司 A kind of outdoor assisted location method, system and equipment based on optical label
CN107818375A (en) * 2017-11-09 2018-03-20 陕西外号信息技术有限公司 A kind of service reservation method and system with diversion function based on optical label

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102419854A (en) * 2011-09-23 2012-04-18 纽海信息技术(上海)有限公司 Wireless shopping system and method
CN107153954A (en) * 2017-05-15 2017-09-12 田智龙 A kind of smart shopper system
CN108074168A (en) * 2017-12-06 2018-05-25 武汉中石汽车服务有限公司 Shopping management method, device, system and the storage device of convenience store of gas station

Also Published As

Publication number Publication date
WO2020062876A1 (en) 2020-04-02
TW202013255A (en) 2020-04-01

Similar Documents

Publication Publication Date Title
US10484092B2 (en) Modulating a light source in a light based positioning system with applied DC bias
US9918013B2 (en) Method and apparatus for switching between cameras in a mobile device to receive a light signal
JP4032776B2 (en) Mixed reality display apparatus and method, storage medium, and computer program
US20160210100A1 (en) Differentiated content delivery system and method therefor
CA2892923C (en) Self-identifying one-way authentication method using optical signals
US20180293593A1 (en) Camera based location commissioning of electronic shelf labels
CN109328359A (en) Multi-camera system for inventory tracking
US20190101377A1 (en) Light fixture commissioning using depth sensing device
US11338920B2 (en) Method for guiding autonomously movable machine by means of optical communication device
CN109936712B (en) Positioning method and system based on optical label
TWI702805B (en) System and method for guiding a machine capable of autonomous movement
CN105187122A (en) Information providing system and the method thereof
CN110943778B (en) Optical communication device and method for transmitting and receiving information
TWI713887B (en) Optical communication device and system and corresponding information transmission and reception method
CN110942115A (en) Service providing method and system based on optical label
CN114296556A (en) Interactive display method, device and system based on human body posture
CN106663213A (en) Detection of coded light
Zhang et al. Capturing images with sparse informational pixels using projected 3D tags

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200331