WO2019214642A1 - Système et procédé de guidage d'une machine autonome (System and method for guiding an autonomous machine) - Google Patents
- Publication number: WO2019214642A1
- Application: PCT/CN2019/085998 (CN2019085998W)
- Authority: WIPO (PCT)
- Prior art keywords: communication device, optical communication, light source, image, movable machine
Classifications
- G05D1/02: Control of position or course in two dimensions
- G05D1/0221: Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory involving a learning process
- G05D1/0223: Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory involving speed control of the vehicle
- G05D1/0251: Control of position or course in two dimensions specially adapted to land vehicles, using optical position detecting means, using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
- G05D1/0259: Control of position or course in two dimensions specially adapted to land vehicles, using magnetic or electromagnetic means
- G05D1/0278: Control of position or course in two dimensions specially adapted to land vehicles, using signals provided by a source external to the vehicle, using satellite positioning signals, e.g. GPS
- G05D1/0285: Control of position or course in two dimensions specially adapted to land vehicles, using signals provided by a source external to the vehicle, using signals transmitted via a public communication network, e.g. GSM network
- G06K7/10: Methods or arrangements for sensing record carriers by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/14: Methods or arrangements for sensing record carriers by electromagnetic radiation, using light without selection of wavelength, e.g. sensing reflected white light
- G06K7/1417: Methods for optical code recognition, specifically adapted for 2D bar codes
Definitions
- The present invention relates to guidance for machines capable of autonomous movement, and more particularly to a system and method for guiding an autonomously movable machine through an optical communication device.
- U.S. Patent No. 95,621,216 B1 describes a drone cargo delivery system that navigates the drone based on GPS and an altimeter, and allows the navigation to be remotely assisted through the drone's camera.
- However, such a system cannot achieve precise navigation of the drone.
- Amazon has disclosed a scheme in which the drone is first guided to the destination by GPS, after which the unmanned machine looks in its field of view for a unique "mark" placed by the customer.
- This approach requires the buyer to have a courtyard suitable for receiving the goods and to place a unique "mark" in it. Moreover, since the "mark" itself cannot be used to distinguish between different buyers, if multiple "marks" placed by multiple buyers are near the destination, the drone cannot determine at which "mark" to place the package. The scheme is therefore not applicable to people living in urban apartments.
- Traditional two-dimensional codes can be used to identify different users, but their recognition distance is very limited.
- When scanning with a camera, the camera must typically be placed at a relatively short distance, usually within about 15 times the width of the two-dimensional code.
- For example, a drone equipped with a camera would need to approach to within about 3 meters of a 20-cm-wide two-dimensional code to recognize it. Long-distance recognition therefore either cannot be realized, or requires a very large custom two-dimensional code, which increases cost and is in many cases impossible due to various other restrictions (such as limited space).
- In addition, the camera needs to photograph the two-dimensional code substantially head-on; if the deviation angle is too large, recognition fails.
- A CMOS imaging device is a widely used imaging device. As shown in FIG. 1, it includes an array of image-sensitive cells (also referred to as image sensors) and some other components.
- The image sensor array can be a photodiode array, with each image sensor corresponding to one pixel.
- Each column of image sensors corresponds to a column amplifier, whose output signal is sent to an A/D converter (ADC) for analog-to-digital conversion and then output through an interface circuit.
- CMOS imaging devices typically employ rolling shutter imaging.
- In a CMOS imaging device, data readout is serial, so clearing/exposure/readout can only be performed row by row in a pipelined manner, and the rows are synthesized into one frame of image after all rows of the image sensor array have been processed. The entire CMOS image sensor array is thus effectively exposed progressively (in some cases several rows at a time), which introduces small delays between rows. Because of these small delays, when a light source flashes at a certain frequency, undesired stripes appear on images taken by the CMOS imaging device, which affects the shooting result.
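To make the stripe mechanism concrete, the following is a minimal simulation sketch (not from the patent; all timing values are illustrative) of how a rolling shutter maps a blinking source onto alternating bright and dark rows:

```python
# Row i starts its exposure at i * row_readout_time; its brightness is the
# fraction of its exposure window during which the blinking source was on.

def row_brightness(row, blink_hz, exposure_us, readout_us, samples=100):
    """Average on-fraction of a square-wave source over one row's exposure."""
    period_us = 1e6 / blink_hz
    start = row * readout_us                      # this row's exposure start
    on = sum(
        ((start + exposure_us * k / samples) % period_us) < period_us / 2
        for k in range(samples)
    )
    return on / samples                           # 1.0 = bright, 0.0 = dark

# With the 8.7 us/row readout and 14 us exposure quoted later in the text,
# a 16 kHz blink (62.5 us period) yields alternating bright/dark rows:
column = [row_brightness(r, 16000, 14, 8.7) for r in range(20)]
print([round(b, 2) for b in column])
```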
- One aspect of the invention relates to a system for guiding a machine capable of autonomous movement, comprising:
- an optical communication device comprising a light source configured to be operable in at least two modes, the at least two modes comprising a first mode and a second mode,
- wherein the first mode and the second mode are used to convey different information;
- in the second mode, a light source control signal having a second frequency different from the first frequency controls an attribute of the light emitted by the light source to change continuously at the second frequency, so that
- the image of the light source obtained when it is photographed by a rolling shutter camera exhibits no stripes, or exhibits stripes different from the stripes in the first mode.
- the second frequency is greater than the first frequency.
- In the first mode, an attribute of the light emitted by the light source changes continuously at the first frequency, and the image of the light source obtained when it is photographed by the rolling shutter camera presents stripes; in the second mode, stripes different from the stripes in the first mode are presented.
- Another aspect of the invention relates to a method of guiding a machine capable of autonomous movement using the above system, comprising: collecting, by a rolling shutter camera mounted on the autonomously movable machine, information transmitted by a surrounding optical communication device and identifying the transmitted information; determining whether the optical communication device is the target optical communication device; and, if it is, determining the relative positional relationship between the autonomously movable machine or a portion thereof and the optical communication device, and controlling the autonomously movable machine or a portion thereof to travel to the optical communication device.
- The above method may further comprise: if the optical communication device is not the target optical communication device,
- directing the autonomously movable machine or a portion thereof to the target optical communication device based at least in part on the relative positional relationship between the target optical communication device and the autonomously movable machine or portion thereof.
- Determining the relative positional relationship between the autonomously movable machine or a portion thereof and the optical communication device comprises: determining that relative positional relationship by relative positioning.
- Collecting, by the rolling shutter camera mounted on the autonomously movable machine, information transmitted by a surrounding optical communication device and identifying the transmitted information comprises: obtaining, by the rolling shutter camera, continuous multi-frame images of the optical communication device; for each frame image, determining whether the portion of the image corresponding to the position of the light source has stripes, and if so which type of stripes; and determining the information represented by each frame image.
- The above method may further comprise: first controlling the autonomously movable machine to travel to the vicinity of the target optical communication device.
- First controlling the autonomously movable machine to travel to the vicinity of the target optical communication device comprises: guiding the autonomously movable machine to the vicinity of the target optical communication device at least partially by a satellite navigation system; and/or directing the autonomously movable machine to the vicinity of the target optical communication device at least partially by using the relative positional relationship between another optical communication device and the target optical communication device.
- At least partially utilizing the relative positional relationship between another optical communication device and the target optical communication device to direct the autonomously movable machine to the vicinity of the target optical communication device comprises:
- the autonomously movable machine identifying other optical communication devices while traveling, and obtaining the relative positional relationship between those devices and the target optical communication device; and
- directing the autonomously movable machine to the vicinity of the target optical communication device based at least in part on the resulting relative positional relationship between the target optical communication device and the autonomously movable machine.
- Determining whether the optical communication device is the target optical communication device based on the transmitted information comprises: determining whether the transmitted information includes predetermined information, explicitly or implicitly.
- the predetermined information is a predetermined identifier or a verification code.
- Determining whether the optical communication device is the target optical communication device based on the transmitted information comprises: determining, by the autonomously movable machine itself, whether the optical communication device is the target optical communication device; or transmitting the information to a server, which determines whether the optical communication device is the target optical communication device based on the transmitted information and sends the determination result back to the autonomously movable machine.
- Another aspect of the invention relates to a machine capable of autonomous movement, comprising a rolling shutter camera, a processor, and a memory, wherein the memory stores a computer program that, when executed by the processor, can be used to implement the above method.
- Another aspect of the invention relates to a storage medium in which is stored a computer program that, when executed, can be used to implement the method described above.
- FIG. 1 is a schematic view of a CMOS imaging device
- FIG. 2 is a view of an image acquired by a CMOS imaging device;
- Figure 3 is a light source in accordance with one embodiment of the present invention.
- FIG. 4 is a light source in accordance with another embodiment of the present invention.
- FIG. 5 is an imaging timing chart of a CMOS imaging device
- FIG. 6 is another imaging timing diagram of a CMOS imaging device
- Figure 7 shows an image of the CMOS imaging device at different stages when the light source is operating in the first mode
- FIG. 8 illustrates an imaging timing diagram of a CMOS imaging device when the light source operates in the first mode, in accordance with an embodiment of the present invention
- FIG. 9 illustrates an imaging timing diagram of a CMOS imaging device when the light source operates in the second mode, in accordance with an embodiment of the present invention
- FIG. 10 illustrates an imaging timing diagram of a CMOS imaging device when a light source operates in a first mode in accordance with another embodiment of the present invention
- FIG. 11 shows an imaging timing diagram of a CMOS imaging device for implementing a stripe different from that of FIG. 8 in accordance with another embodiment of the present invention
- Figures 12-13 show two striped images of a light source obtained at different settings
- Figure 14 shows a streak-free image of the obtained light source
- Figure 15 is an image view of an optical tag employing three separate light sources, in accordance with one embodiment of the present invention.
- FIG. 16 is an image view of an optical tag including positioning marks, in accordance with one embodiment of the present invention;
- Figure 17 illustrates an optical tag including a reference light source and two data sources in accordance with one embodiment of the present invention
- FIG. 18 shows an imaging timing chart of the CMOS imaging device for the optical tag shown in FIG. 17;
- Figure 19 illustrates a method of UAV guidance by optical tags in accordance with one embodiment of the present invention.
- In the guidance system, the optical communication device includes a light source and a controller capable of controlling the light source, by a light source control signal, to operate in two or more modes, the two or more modes including the first mode and the second mode. In the first mode, the light source control signal has a first frequency such that an attribute of the light emitted by the light source changes continuously at the first frequency to convey first information; in the second mode, the attribute of the light emitted by the light source changes continuously at a second frequency, or does not change, to convey second information different from the first information.
- The attribute of light in this application refers to any property that the CMOS imaging device can recognize: for example, a property perceptible to the human eye, such as the intensity, color, or wavelength of the light; a property imperceptible to the human eye, such as a change in the intensity, color, or wavelength of electromagnetic radiation outside the visible range; or any combination of the above properties.
- A change in the properties of light can be a change in a single property, or a change in a combination of two or more properties.
- When the intensity of the light is selected as the attribute, the change can be achieved simply by turning the light source on and off.
- Hereinafter, turning the light source on and off is used to change the properties of the light, but those skilled in the art will appreciate that other ways of changing the properties of the light are also possible.
- the attribute of the light varying at the first frequency in the first mode may be the same as or different from the attribute of the light changing at the second frequency in the second mode.
- the properties of the light that change in the first mode and the second mode are the same.
- When the light source operates in the first mode or the second mode, it can be imaged using a rolling shutter imaging device, such as a CMOS imaging device or a device equipped with one (e.g., a cell phone, a tablet, or smart glasses), i.e., imaged by rolling shutter.
- Hereinafter, a mobile phone serving as the CMOS imaging device is described as an example, as shown in FIG. 2.
- The line scanning direction of the mobile phone is shown as vertical in FIG. 2, but those skilled in the art will understand that the line scanning direction can also be horizontal, depending on the underlying hardware configuration.
- The light source can take various forms, as long as one of its properties perceivable by the CMOS imaging device can be varied at different frequencies.
- The light source may be an LED light, an array of LED lights, a display screen or part of one; even an illuminated area of light (for example, an illuminated area on a wall) may serve as a light source.
- the shape of the light source may be various shapes such as a circle, a square, a rectangle, a strip, an L shape, a cross shape, a spherical shape, or the like.
- The light source can include various common optical devices, such as a light guide plate, a soft light plate, a diffuser, and the like.
- In one embodiment, the light source may be a two-dimensional array of LED lamps, one dimension of which is longer than the other, preferably with a ratio of about 6:1 to 12:1.
- the LED light array can be composed of a plurality of LED lamps arranged in a row.
- the LED light array can be rendered as a substantially rectangular light source when illuminated, and the operation of the light source is controlled by a controller.
- Figure 3 illustrates a light source in accordance with one embodiment of the present invention.
- The light source shown in FIG. 3 can also be formed as a combination of several rectangles, for example, the L-shaped light source shown in FIG. 4.
- the light source may not be limited to a planar light source, but may be implemented as a stereoscopic light source, for example, a strip-shaped cylindrical light source, a cubic light source, or the like.
- The light source can be placed, for example, on a square, or suspended at a substantially central location of an indoor venue (e.g., a restaurant or a conference room), so that nearby users in every direction can capture the light source through a mobile phone and thereby obtain the information it conveys.
- FIG. 5 shows an imaging timing diagram of a CMOS imaging device, in which each row corresponds to one row of sensors of the CMOS imaging device.
- For each row, two stages are mainly involved: exposure time and readout time. The exposure times of different rows may overlap, but their readout times do not.
- The exposure time of a CMOS imaging device can be set or adjusted (for example, by an APP installed on the mobile phone) to select a relatively short exposure time.
- In one embodiment, the exposure time can be made approximately equal to, or less than, the readout time of each row. Taking 1080p resolution as an example, the readout time of each row is approximately 8.7 microseconds.
- FIG. 6 shows an imaging timing chart of the CMOS imaging device in this case.
- In this case, the exposure times of the rows barely overlap, so stripes with relatively clear boundaries can be obtained at imaging time, which are more easily recognized.
- FIG. 6 is only a preferred embodiment of the present invention; a longer exposure time (for example, two, three, or four times the readout time of each row) or a shorter exposure time is also feasible.
- In one experiment, the readout time per row was approximately 8.7 microseconds, and the exposure time per row was set to 14 microseconds.
- The length of one cycle of the light source may be set to about twice the exposure time or more, and preferably to about four times the exposure time or more.
- FIG. 7 shows the image of the light source on a CMOS imaging device at different stages when the light source is operated in the first mode by the controller, in which a property of the emitted light is changed at a certain frequency; in this example, by turning the light source on and off.
- The upper part of FIG. 7 shows the state of the light source at different stages, and the lower part shows the image of the light source on the CMOS imaging device at those stages, where the row direction of the CMOS imaging device is vertical and scanning proceeds from left to right. Since the image captured by the CMOS imaging device is progressively scanned, when the high-frequency flicker signal is captured, the portion of the obtained image corresponding to the imaging position of the light source forms stripes, as shown in the lower part of FIG. 7.
- In time period 1 the light source is on, and the scan lines exposed during that period (the leftmost portion) show bright stripes; in time period 2 the light source is off, and the scan lines exposed during that period show dark stripes; in time period 3 the light source is on, and the exposed scan lines show bright stripes; in time period 4 the light source is off, and the exposed scan lines show dark stripes.
- The flicker frequency of the light source can be set by the light source control signal, thereby adjusting the width of the stripes: a longer on or off duration generally corresponds to a wider stripe.
- If the duration of each on or off period of the light source is set substantially equal to the exposure time of each row of the CMOS imaging device (this exposure time can be set by an APP installed on the mobile phone, or manually), stripes only one pixel wide can be presented when imaging. For long-distance identification of optical tags, the narrower the stripes, the better.
- However, stripes only one pixel wide may be less stable or less recognizable due to light interference, imperfect synchronization, and so on; therefore, to improve the stability of recognition, stripes two pixels wide are preferably realized.
- Stripes approximately two pixels wide can be realized by setting the duration of each on or off period of the light source approximately equal to twice the exposure time of each row of the CMOS imaging device, as shown in FIG. 8.
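As a quick sanity check of the relationship just described (a sketch using per-row figures quoted elsewhere in this text; the exact numbers are illustrative):

```python
# Stripe width in pixels is roughly the on (or off) duration divided by the
# per-row readout/exposure time.
row_exposure_us = 8.7
on_duration_us = 2 * row_exposure_us        # ~17.4 us on, ~17.4 us off
stripe_width_px = on_duration_us / row_exposure_us
blink_hz = 1e6 / (2 * on_duration_us)       # full period = on + off
print(stripe_width_px, round(blink_hz))     # -> 2.0 pixels at ~28736 Hz
```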
- In FIG. 8, the signal in the upper portion is the light source control signal, whose high level corresponds to turning the light source on and whose low level corresponds to turning it off.
- In this embodiment, the duty cycle of the light source control signal is set to about 50%, and the exposure duration of each row is set substantially equal to the readout time of each row, but those skilled in the art will understand that other settings are also possible, as long as distinguishable stripes are produced.
- Synchronization between the light source and the CMOS imaging device is assumed in FIG. 8, such that the turn-on and turn-off times of the light source substantially correspond to the start or end of the exposure time of some row of the CMOS imaging device; but those skilled in the art will understand that even if the two are not synchronized as shown in FIG. 8, significant stripes still appear on the CMOS imaging device.
- The brightest stripes are the rows exposed entirely while the light source is on, and the darkest stripes are the rows exposed entirely while it is off; adjacent bright and dark stripes are separated by a one-pixel transition row.
- The light and dark variations (i.e., stripes) of such pixel rows can easily be detected (e.g., by comparing the brightness or grayscale of some of the pixels in the imaged area of the source).
- The light/dark stripe difference threshold and ratio threshold are related to the optical tag's illumination intensity, the properties of the photosensitive device, the shooting distance, and the like; other thresholds are also possible, as long as computer-resolvable stripes are present. When stripes are identified, the information conveyed by the light source at that time, such as binary data 0 or data 1, can be determined.
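A minimal detection sketch along these lines (assumed, not from the patent; the threshold values are placeholders), comparing row brightness inside the light source's imaged region:

```python
import numpy as np

def has_stripes(region: np.ndarray, diff_threshold: float = 40.0,
                ratio_threshold: float = 0.25) -> bool:
    """Detect alternating bright/dark rows in the source's imaged region."""
    row_means = region.mean(axis=1)              # brightness per pixel row
    bright, dark = row_means.max(), row_means.min()
    if bright - dark < diff_threshold:           # no clear light/dark contrast
        return False
    mid = (bright + dark) / 2
    dark_ratio = float(np.mean(row_means < mid)) # fraction of dark rows
    return ratio_threshold < dark_ratio < 1 - ratio_threshold
```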
- In another embodiment, the stripe recognition method is as follows: obtain an image of the optical tag, and segment the imaging area of the light source by projection; collect striped and non-striped images under different configurations (for example, different distances, different light source flicker frequencies); normalize all collected images to a specific size, such as 64×16 pixels; extract each pixel as an input feature and build a machine learning classifier; and perform two-class discrimination to decide whether an image is striped or non-striped (a sketch of this pipeline is given below).
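A sketch of that two-class pipeline (assumed details: scikit-learn with a logistic-regression model; the text does not name a specific classifier):

```python
import numpy as np
from PIL import Image
from sklearn.linear_model import LogisticRegression

IMG_H, IMG_W = 64, 16          # normalization size from the text (64x16)

def to_features(img: np.ndarray) -> np.ndarray:
    """Normalize a grayscale crop of the light source and flatten to features."""
    resized = np.asarray(Image.fromarray(img).resize((IMG_W, IMG_H)))
    return (resized.astype(np.float32) / 255.0).ravel()

def train(striped: list, plain: list) -> LogisticRegression:
    """striped/plain: lists of grayscale crops collected as described above."""
    X = np.stack([to_features(i) for i in striped + plain])
    y = np.array([1] * len(striped) + [0] * len(plain))
    return LogisticRegression(max_iter=1000).fit(X, y)
```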
- For a strip light source 5 cm long, using a mobile phone currently on the market with the resolution set to 1080p and shooting from 10 meters away (that is, a distance 200 times the length of the light source),
- the strip light source occupies about 6 pixels in its length direction; if each stripe is 2 pixels wide, at least one distinct, easily identified stripe will appear within those 6 pixels. If a higher resolution is set, or optical zoom is used, stripes can be recognized at a greater distance, for example when the distance is 300 or 400 times the length of the light source.
- the controller can also operate the light source in the second mode.
- In the second mode, the light source control signal can have a frequency different from that of the first mode to change the properties of the light emitted by the light source, for example by turning the light source on and off.
- the controller can increase the turn-on and turn-off frequencies of the light source in the second mode compared to the first mode.
- For example, the frequency of the first mode may be greater than or equal to 8,000 times per second, and the frequency of the second mode may be greater than the frequency of the first mode.
- In the second mode, the light source can be configured to turn on and off at least once within the exposure time of each row of the CMOS imaging device.
- FIG. 9 shows a case where the light source is turned on and off exactly once during the exposure time of each row; the signal in the upper portion of FIG. 9 is the light source control signal, whose high level corresponds to turning the light source on and whose low level corresponds to turning it off.
- Since the light source is turned on and off in the same way during the exposure time of each row, the exposure energy obtained by each row is roughly equal, so there is no significant brightness difference between the pixel rows of the final image of the light source, and no stripes appear.
- Synchronization between the light source and the CMOS imaging device is assumed in FIG. 9, such that the turn-on time of the light source substantially corresponds to the start of the exposure time of some row; but those skilled in the art will understand that even if the two are not synchronized as in FIG. 9, there is no significant brightness difference between the pixel rows of the final image of the light source, so no stripes exist. When no stripes are recognized, the information conveyed by the light source at that time, such as binary data 1 or data 0, can be determined. To the human eye, the light source of the present invention does not perceptibly flicker when operating in either the first mode or the second mode described above.
- The duty cycles of the first mode and the second mode may be set substantially equal, thereby realizing substantially the same luminous flux in the different modes.
- Alternatively, in the second mode, direct current may be supplied to the light source so that it emits light whose properties do not substantially change; no stripes then appear on a frame image of the light source taken by the CMOS image sensor.
- Keeping the luminous flux substantially the same in the different modes also avoids flicker that might otherwise be perceived by the human eye when switching between the first mode and the second mode.
- FIG. 8 above describes an embodiment in which stripes are produced by varying the intensity of the light emitted by the source (e.g., by turning the light source on and off).
- Stripes may also be presented by changing the wavelength or color of the light emitted by the light source.
- the light source includes a red light that emits red light and a blue light that emits blue light.
- the two signals in the upper portion of FIG. 10 are a red light control signal and a blue light control signal, respectively, wherein a high level corresponds to the turn-on of the corresponding light source and a low level corresponds to the turn-off of the corresponding light source.
- The red light control signal and the blue light control signal are phase-shifted by 180°, that is, their levels are opposite.
- Together, the red and blue control signals cause the light source to alternately emit red light and blue light, so that when the light source is imaged by the CMOS imaging device, red and blue stripes are presented.
- By determining whether stripes exist on the portion of a frame image, taken by the CMOS imaging device, that corresponds to the light source, the information transmitted by that frame, such as binary data 1 or data 0, can be determined. Further, by taking continuous multi-frame images of the light source with the CMOS imaging device, a sequence of information composed of binary 1s and 0s can be determined, realizing information transmission from the light source to the CMOS imaging device (for example, a mobile phone).
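Decoding then reduces to a per-frame stripe test; a minimal sketch (assumed, building on the `has_stripes` detector sketched earlier):

```python
def decode_bits(frames, source_region):
    """frames: iterable of grayscale images; source_region: (y0, y1, x0, x1)
    crop of the light source. Stripes -> 1, no stripes -> 0 (by convention)."""
    y0, y1, x0, x1 = source_region
    return [int(has_stripes(f[y0:y1, x0:x1])) for f in frames]
```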
- Control may be performed by the controller such that the switching interval between the operating modes of the light source equals the time taken to image one complete frame of the CMOS imaging device,
- thereby realizing frame synchronization between the light source and the imaging device, i.e., 1 bit of information is transmitted per frame.
- The information can include, for example, a start frame mark (frame header), an optical tag ID, a password, a verification code, URL information, address information, timestamps, or various combinations thereof.
- The order of the above kinds of information can be set according to a structuring method to form a packet structure. Each time a complete packet structure is received, a complete set of data (a packet) is considered obtained, and it can be read and verified.
- the following table shows the packet structure in accordance with one embodiment of the present invention:
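The packet table itself is not reproduced in this text. As a purely illustrative sketch, the field names below come from the list above (frame header, optical tag ID, verification code, etc.), while the byte layout, header value, and checksum rule are all assumptions, not the patent's:

```python
from dataclasses import dataclass

@dataclass
class OpticalTagPacket:
    frame_header: int      # start-of-packet marker
    tag_id: int            # optical tag ID
    payload: bytes         # e.g., URL/address information, timestamp
    check: int             # verification code over the preceding fields

def parse(bits):
    """Assemble per-frame bits into bytes and validate (illustrative layout)."""
    data = bytes(
        int("".join(map(str, bits[i:i + 8])), 2)
        for i in range(0, len(bits) - len(bits) % 8, 8)
    )
    if len(data) < 4 or data[0] != 0xAA:          # assumed header value
        return None
    if sum(data[:-1]) % 256 != data[-1]:          # assumed checksum rule
        return None
    return OpticalTagPacket(data[0], data[1], data[2:-1], data[-1])
```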
- In the above embodiments, the information transmitted by a frame image is determined by judging whether stripes exist at the imaging position of the light source in that frame.
- In other embodiments, different information conveyed by a frame image may be determined by identifying different stripes at the imaging position of the light source in that frame.
- In one such embodiment, in the first mode, the property of the light emitted by the light source is controlled by a light source control signal having a first frequency so as to change continuously at the first frequency, so that a first stripe appears on the image of the light source obtained when it is photographed by the CMOS image sensor; in the second mode,
- a second stripe different from the first stripe appears on the image of the light source.
- the difference in stripes may be based, for example, on different widths, colors, brightnesses, etc., or any combination thereof, as long as the difference can be identified.
- stripes of different widths may be implemented based on different light source control signal frequencies.
- For example, in the first mode, the light source may operate as shown in FIG. 8 to achieve first stripes approximately two pixels wide; in the second mode, the durations of the high level and the low level in each period of the light source control signal of FIG. 8 can each be doubled, as shown in FIG. 11, to achieve second stripes approximately four pixels wide.
- stripes of different colors may be implemented.
- the light source may be set to include a red light that emits red light and a blue light that emits blue light.
- In the first mode, the blue light may be turned off while the red lamp operates as shown in FIG. 8, realizing red and black stripes; in the second mode, the red lamp may be turned off while the blue lamp operates as shown in FIG. 8, realizing blue and black stripes.
- Here, the red-and-black stripes and the blue-and-black stripes are realized using light source control signals of the same frequency in the first mode and the second mode, but it is understood that the first mode and the second mode may also use light source control signals of different frequencies.
- A third mode can further be set, in which the red and blue lights are controlled in the manner shown in FIG. 10 to achieve red-blue stripes, conveying a third type of information.
- Yet another type of information, i.e., the fourth type, can be transmitted through a fourth mode in which no stripes are presented.
- Any of the above four modes can be selected for information transmission, and further modes can be combined, as long as different modes generate different stripe patterns.
- FIG. 12 shows the stripes obtained by imaging, at 1080p resolution, an LED flashing 16,000 times per second (each period lasts 62.5 microseconds, with on and off durations of approximately 31.25 microseconds each);
- stripes approximately 2-3 pixels wide are presented.
- FIG. 13 shows the stripes on the image obtained by experiment when the flicker frequency of the LED lamp of FIG. 12 is adjusted to 8,000 times per second (each period lasts 125 microseconds, with on and off durations of about 62.5 microseconds each), other conditions unchanged.
- FIG. 14 shows the image obtained by experiment when the flicker frequency of the LED lamp of FIG. 12 is adjusted to 64,000 times per second (each period lasts 15.6 microseconds, with on and off durations of about 7.8 microseconds each), other conditions unchanged;
- no stripes appear on it, because each row's exposure time of 14 microseconds essentially covers one on period and one off period of the LED lamp.
- In the above description, a square wave is taken as the example of a light source control signal having the corresponding frequency, but those skilled in the art will understand that the light source control signal can also use other waveforms, such as a sine wave or a triangular wave.
- FIG. 15 is an image view of an optical tag employing three independent light sources, in which the imaging positions of two light sources have stripes and the imaging position of one light source has none, in accordance with one embodiment of the present invention.
- Such a frame of image can be used to convey information, such as binary data 110.
- The optical tag may further include one or more positioning marks located adjacent to the information delivery light sources; a positioning mark may be, for example, a lamp of a particular shape or color that remains lit during operation.
- The positioning marks can help a user of a CMOS imaging device, such as a cell phone, easily discover the optical tag.
- When the CMOS imaging device is set to a mode for photographing the optical tag, the imaging of the positioning marks is conspicuous and easy to recognize.
- The one or more positioning marks disposed adjacent to the information transfer light sources can also help the handset quickly determine the locations of those light sources, making it easier to identify whether the imaged region corresponding to each information transfer light source has stripes.
- The positioning marks may first be identified in the image, so that the approximate location of the optical tag is found. One or more regions in the image may then be determined based on the relative positional relationship between the positioning marks and the information delivery light sources, each region encompassing the imaging position of an information delivery light source. These regions can then be examined to determine whether stripes exist, and which stripes are present (a sketch of this region lookup follows).
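A sketch of that region lookup under an assumed geometry (positioning lamps on either side of a row of evenly spaced data sources, as in FIG. 16; all offsets are illustrative):

```python
import numpy as np

def data_source_regions(left_mark, right_mark, n_sources=3, half_h=10):
    """left_mark/right_mark: (x, y) pixel centers of the two positioning lamps.
    Returns (y0, y1, x0, x1) crops around the expected data source centers."""
    p0, p1 = np.asarray(left_mark, float), np.asarray(right_mark, float)
    regions = []
    for i in range(1, n_sources + 1):
        c = p0 + (p1 - p0) * i / (n_sources + 1)   # evenly spaced centers
        regions.append((int(c[1]) - half_h, int(c[1]) + half_h,
                        int(c[0]) - half_h, int(c[0]) + half_h))
    return regions
```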
- FIG. 16 is an image diagram of an optical tag including positioning marks, in accordance with an embodiment of the present invention; it includes three horizontally disposed information transfer light sources and two vertically disposed positioning identification lamps located on either side of the information transfer light sources.
- an ambient light detection circuit can be included in the optical tag that can be used to detect the intensity of ambient light.
- The controller can adjust the intensity of the light emitted by the light source when it is turned on based on the detected intensity of the ambient light: when the ambient light is relatively strong (for example, during the day), the emitted intensity is made relatively large; when the ambient light is relatively weak (for example, at night), the emitted intensity is made relatively small.
- an ambient light detection circuit can be included in the optical tag that can be used to detect the frequency of ambient light.
- The controller can adjust the frequency of the light emitted by the light source based on the detected frequency of the ambient light; for example, when the ambient light contains a source flashing at the same frequency, the light emitted by the light source is switched to another unoccupied frequency.
- In some environments, the accuracy of the recognition may be affected. Therefore, to improve recognition accuracy, in one embodiment of the present invention, in addition to the above-described light sources used to transmit information (hereinafter referred to as "data sources" for clarity), the optical tag may also include at least one reference light source.
- the reference source itself is not used to convey information, but rather to aid in identifying the information conveyed by the data source.
- The reference source can be physically similar to the data sources, but operates in a predetermined working mode, which can be one of the various working modes of the data sources. In this way, the decoding of a data source can be converted into a calculation that matches (e.g., correlates) its image with the image of the reference source, thereby improving decoding accuracy.
- FIG. 17 shows an optical tag including one reference light source and two data light sources, where the three light sources are arranged side by side: the first serves as the reference light source, and the other two serve as the first data source and the second data source, respectively.
- the number of reference light sources in the optical label may be one or more, and is not limited to one; likewise, the number of data light sources may be one or more, and is not limited to two.
- Since the reference source is used to provide auxiliary recognition, its shape and size need not be the same as the data sources'; for example, in one embodiment, the length of the reference source can be half that of a data source.
- Each of the first data source and the second data source shown in FIG. 17 is configured to operate in three modes so as to display, respectively and for example, a stripe-free image, an image with a stripe width of 2 pixels, and an image with a stripe width of 4 pixels.
- The reference light source may be configured to always operate in one of the three modes to display one of the three images described above, or to operate alternately in different modes so as to alternately display any two, or all three, of the above images in different frames. This provides a baseline or reference for image recognition of the data sources.
- the reference light source alternately displays an image with a stripe width of 2 pixels and an image with a stripe width of 4 pixels in different frames.
- When identifying the type of the image of a data source in a frame, it may be compared with the images of the reference light source in the current frame and an adjacent frame (for example, the previous or the subsequent frame; between them, these reference images necessarily include an image with a stripe width of 2 pixels and an image with a stripe width of 4 pixels) to determine its type. Alternatively, successive multi-frame images of the reference light source may be collected over a period of time, grouped into odd-numbered frames and even-numbered frames, and the features of each group averaged (for example, computing each group's average stripe width) to determine which group corresponds to the 2-pixel-wide stripes and which to the 4-pixel-wide stripes, thereby obtaining reference images of both types.
- Since the reference light source is located at substantially the same position as the data light sources and is subject to the same ambient lighting conditions, interference, noise, and so on, it can provide one or more reference images (or reference image features) for image recognition in real time, thereby improving the accuracy and stability of identifying the information conveyed by the data sources.
- The data pattern can thus be accurately identified by comparing the imaging of a data source with the imaging of the reference source.
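One simple way to make such a comparison (an assumed method; the patent does not prescribe one) is to estimate stripe width from each source's row-brightness profile and match a data source's width against the reference's:

```python
import numpy as np

def stripe_width(region: np.ndarray) -> float:
    """Average run length of bright/dark pixel rows in the imaged region."""
    row_means = region.mean(axis=1)
    binary = row_means > row_means.mean()        # bright vs dark rows
    runs, count = [], 1
    for prev, cur in zip(binary, binary[1:]):
        if prev == cur:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return float(np.mean(runs))                  # ~2 or ~4 px per the text
```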
- For example, the reference light source can be controlled to operate in a predetermined working mode in which, for example, stripes 4 pixels wide appear on its image.
- If a data source is controlled to operate simultaneously in the same working mode and in phase with the reference source, the stripes appearing on its image are similar to those on the reference source's image (e.g., also 4 pixels wide) with no phase difference; if the data source operates simultaneously in the same working mode but out of phase with the reference source (for example, inverted, i.e., 180° out of phase), the stripes appearing on its image are similar to those of the reference source (e.g., also 4 pixels wide) but with a phase difference.
- FIG. 18 shows an imaging timing chart of the CMOS imaging device for the optical tag shown in FIG. 17.
- The respective control signals of the reference source, the first data source, and the second data source are shown in the upper portion of FIG. 18, where a high level corresponds to turning the corresponding light source on and a low level corresponds to turning it off.
- The three control signals have the same frequency; the first data source's control signal is in phase with the reference source's, and the second data source's control signal is 180° out of phase with the reference source's.
- As a result, the reference light source, the first data source, and the second data source each exhibit stripes approximately four pixels wide, but the stripe phase of the first data source's image is aligned with that of the reference source (e.g., the rows of the reference source's bright stripes coincide with the rows of the first data source's bright stripes, and the rows of its dark stripes coincide with the rows of the first data source's dark stripes), while the stripe phase of the second data source's image is inverted relative to the reference source (e.g., the rows of the reference source's bright stripes coincide with the rows of the second data source's dark stripes, and the rows of its dark stripes coincide with the rows of the second data source's bright stripes).
- each data source can deliver one of two types of data, such as 0 or 1, in one frame of image.
- By providing a reference source operating in the second mode, and further applying phase control when a data source operates in the second mode, the second mode itself can be used to deliver more than one type of data. Taking the scheme shown in FIG. 18 as an example, the second mode combined with phase control can itself deliver one of two kinds of data, so that each data source can deliver one of three kinds of data in one frame of image.
- In other words, without a reference light source, each data light source can deliver one of two kinds of data per frame, so an optical tag containing three data sources can transmit one of 2³ = 8 data combinations per frame image; with a reference light source, each data source can deliver one of three kinds of data per frame, so an optical tag containing two data sources can transmit one of 3² = 9 data combinations per frame image.
- In the correlation calculation, positive values represent positive correlation and negative values represent negative correlation. If the data source and the reference source have the same frequency and phase, then in the ideal case their images are exactly the same, so the correlation result is +1, indicating complete positive correlation. If they have the same frequency but opposite phase, then in the ideal case their images have the same stripe width but exactly opposite positions of bright and dark stripes, so the correlation result is -1, indicating complete negative correlation. It is understood that in the actual imaging process, due to interference, errors, and the like, completely positively or negatively correlated images are difficult to obtain. If the data source and the reference source operate in different working modes, displaying stripes of different widths, or one of them displays no stripes, their images are typically only weakly correlated.
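A sketch of this correlation check (assumed normalization: Pearson correlation of the two sources' row-brightness profiles, which yields +1/-1 in the ideal cases above):

```python
import numpy as np

def phase_correlation(ref_region: np.ndarray, data_region: np.ndarray) -> float:
    """+1: same frequency and phase; -1: same frequency, opposite phase."""
    a = ref_region.mean(axis=1)                  # row-brightness profiles
    b = data_region.mean(axis=1)
    a = (a - a.mean()) / (a.std() + 1e-9)        # zero-mean, unit-variance
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float(np.mean(a * b))
```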
- Tables 1 and 2 below show the correlation results when the data source and the reference source are at the same frequency and in phase, and at the same frequency and in opposite phase, respectively. For each case, five frames were taken, and the reference light source image in each frame was correlated with the data source image in the same frame.
Table 1 (same frequency, in phase):

| Frame | Correlation (reference image × data image) |
| --- | --- |
| 1 | 0.7710 |
| 2 | 0.7862 |
| 3 | 0.7632 |
| 4 | 0.7883 |
| 5 | 0.7967 |
Table 2 (same frequency, opposite phase):

| Frame | Correlation (reference image × data image) |
| --- | --- |
| 1 | -0.7849 |
| 2 | -0.7786 |
| 3 | -0.7509 |
| 4 | -0.7896 |
| 5 | -0.7647 |
- The correlation results in Table 1 indicate a significant positive correlation, and those in Table 2 indicate a significant negative correlation.
- Compared with the recognition distance of a traditional two-dimensional code (about 15 times its width), the recognition distance of at least 200 times the light source length achieved by the optical tag of the present invention is an obvious advantage.
- This long-distance recognition capability is especially suitable for outdoor use. Taking a recognition distance of 200 times as an example, for a light source 50 cm long set on a street, a person within 100 meters of the light source can interact with it through a mobile phone.
- In addition, the solution of the present invention does not require the CMOS imaging device to be located at a fixed distance from the optical tag, does not require time synchronization between the CMOS imaging device and the optical tag, and does not require accurate detection of the boundaries and widths of individual stripes; it therefore has extremely strong stability and reliability in actual information transmission.
- The solution of the present invention also does not require the CMOS imaging device to be substantially aligned with the optical tag for identification, especially for optical tags with strip-shaped or spherical light sources.
- For a strip-shaped or columnar optical tag standing on a square, a CMOS imaging device anywhere within 360° around it can identify it; if the strip or columnar optical tag is placed on a wall, it can be identified by a CMOS imaging device within approximately 180° around it.
- A spherical optical tag placed on a square can be identified by a CMOS imaging device at any location in the surrounding three-dimensional space.
- One embodiment of the present invention is directed to a system for guiding an autonomously movable machine through an optical tag, comprising a machine capable of autonomous movement and an optical tag as described in any of the above embodiments.
- The autonomously movable machine is equipped with a CMOS camera and is capable of collecting and identifying information transmitted by the optical tag.
- Taking drone delivery as an example, buyers can use their apartment as the shipping address and fill in the shipping address information on the online shopping platform, including, for example, geographic location information, residential community information, building number, and floor.
- Buyers can place an optical tag at the apartment (for example, on the balcony or exterior wall) as the target optical tag for the drone delivering the goods.
- The optical tag may be configured to deliver predetermined information by continuously working in different modes; the predetermined information may be, for example, the ID information of the optical tag itself or the buyer's ID on the online shopping platform.
- the online shopping platform can transmit the predetermined information to the drone.
- the method for guiding the drone through the optical tag of the present invention can be as shown in FIG. 19, which includes the following steps:
- Step 101: Control the drone to travel to the vicinity of the target optical tag.
- After the drone picks up the goods to be sent to the buyer, it can first fly to the buyer's shipping address (i.e., the buyer's apartment).
- The delivery address may preferably be the geographic location information of the target optical tag itself (e.g., precise latitude and longitude, height information, etc.), and may also include other information, such as the orientation information of the target optical tag.
- Step 101 can be implemented in various possible existing ways in the art.
- the drone can fly to the vicinity of the shipping address (ie, near the buyer's light tag) by means of GPS navigation or the like.
- Existing GPS navigation can reach an accuracy of several tens of meters, while the optical tag of the present invention can achieve a recognition distance of at least 200 times the light-source length. Taking a recognition distance of 200 times as an example, for a light source 20 cm long, the drone can achieve identification as long as it flies to within 40 meters of the source.
- the drone can also be guided to the vicinity of the target optical tag by using the relative positional relationship between the other optical tags and the target optical tag.
- The relative positional relationships between the individual optical tags can, for example, be stored in advance and obtained by the drone.
- When the drone is flying, it can identify other optical tags along its flight path and obtain the relative positional relationship between those optical tags and the target optical tag. The drone can then use relative positioning (also called reverse positioning) to determine its own position relative to an identified optical tag and, combined with the stored tag-to-target relationship, determine the relative positional relationship between the target optical tag and the drone, as sketched below. Based on this relative positional relationship, the drone can be guided to the vicinity of the target optical tag. Those skilled in the art will appreciate that a combination of the various means described above can also be used to direct the drone to the vicinity of the target optical tag.
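- A minimal sketch of this position composition, under our own assumptions: the flat data layout, all names, and the neglect of rotations between tag coordinate frames are illustrative, not from the patent.

```python
import numpy as np

# Pre-stored relative positions between optical tags, obtainable by the drone:
# TAG_OFFSETS[(a, b)] = position of tag b expressed in tag a's coordinate frame.
TAG_OFFSETS = {("tag_A", "target_tag"): np.array([35.0, -12.0, 4.0])}

def target_relative_to_drone(observed_tag: str,
                             tag_pos_in_drone_frame: np.ndarray) -> np.ndarray:
    """Compose drone->tag (from reverse positioning) with tag->target (pre-stored)."""
    return tag_pos_in_drone_frame + TAG_OFFSETS[(observed_tag, "target_tag")]

# Example: the drone sees tag_A 8 m ahead, 2 m to the right, 1.5 m up.
goal = target_relative_to_drone("tag_A", np.array([8.0, 2.0, 1.5]))
```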
- The drone can use its imaging device to image the optical tag and obtain the relative distance to the optical tag based on the acquired image (the larger the imaging, the closer the distance; the smaller the imaging, the farther the distance).
- The current orientation information of the drone can be obtained from its built-in sensors, and the relative direction between the drone and the optical tag can be derived from this orientation information (preferably, the position of the optical tag within the image can further be combined to determine the relative direction more accurately), so that the relative positional relationship between the drone and the optical tag can be obtained from the relative distance and the relative direction.
- The orientation information of the optical tag may be stored in a server. After the user identifies the identification information of the optical tag, the orientation information may be obtained from the server using that identification information. Then, based on the orientation information of the optical tag and the perspective distortion of the optical tag's imaging on the user's mobile phone, the relative direction between the user and the optical tag can be calculated.
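- The "larger imaging = closer" relationship above follows the pinhole-camera model. A minimal illustrative sketch (the function and parameter names are ours, and the tag is assumed roughly perpendicular to the optical axis):

```python
def estimate_distance_m(source_length_m: float,
                        imaged_length_px: float,
                        focal_length_px: float) -> float:
    """Pinhole model: distance ~ f * L / l."""
    return focal_length_px * source_length_m / imaged_length_px

# A 0.5 m light source imaged over 25 px with f = 2500 px is about 50 m away:
assert abs(estimate_distance_m(0.5, 25.0, 2500.0) - 50.0) < 1e-9
```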
- Step 101 is not a necessary step of the present invention and may be omitted in some cases, for example, if the target optical tag is already within the drone's field of view.
- Step 102: Collect information transmitted by surrounding optical tags through the CMOS camera installed on the drone, and identify the transmitted information.
- After flying to the vicinity of the buyer's optical tag, the drone can look for optical tags in its field of view, collect the information transmitted by any discovered optical tag through the CMOS camera installed on it, and identify the transmitted information. For example, the drone can obtain a continuous multi-frame image of a certain optical tag through its CMOS camera and determine, for each frame, whether the portion of the image corresponding to the light source position has stripes and, if so, which type, thereby determining the information represented by each frame. In one embodiment, if the drone finds an optical tag within its field of view but the distance is too great to recognize the information it transmits, the drone can move closer to the optical tag until the transmitted information can be identified.
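- A minimal sketch of the per-frame stripe test described above (the threshold and the bit convention are illustrative assumptions): rolling-shutter stripes run along scan lines, so a strongly varying row-mean profile over the light-source region indicates the first (striped) mode.

```python
import numpy as np

def frame_has_stripes(light_source_region: np.ndarray, threshold: float = 10.0) -> bool:
    """True if the region shows rolling-shutter stripes (row means vary strongly)."""
    row_means = light_source_region.astype(float).mean(axis=1)
    return float(row_means.std()) > threshold

def decode_frames(regions: list[np.ndarray]) -> list[int]:
    """Map each frame's light-source region to one bit: striped -> 1, uniform -> 0."""
    return [int(frame_has_stripes(r)) for r in regions]
```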
- Step 103: Determine whether the optical tag is the target optical tag based on the transmitted information.
- The drone can determine whether an optical tag is the target optical tag based on the information it transmits. For example, the drone can determine whether the predetermined information is explicitly or implicitly included in the transmitted information; if it is, the optical tag can be determined to be the target optical tag, and otherwise not. In one embodiment, the drone itself may determine whether the optical tag is the target optical tag. In another embodiment, the drone can send the information conveyed by the optical tag to a server capable of communicating with the drone, and the server determines whether the optical tag is the target optical tag based on that information and sends the judgment result back to the drone.
- the information transmitted by the optical tag can be encrypted information.
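- A minimal sketch of the Step 103 check: the substring test models "explicitly included", and the pass-through `decrypt` placeholder stands in for whatever cipher an encrypted variant would use; none of these names come from the patent.

```python
def is_target_tag(decoded_payload: bytes, expected_info: bytes,
                  decrypt=lambda payload: payload) -> bool:
    """True if the predetermined information appears in the (decrypted) payload."""
    return expected_info in decrypt(decoded_payload)

# Example: is_target_tag(b"TAG:4711;BUYER:alice", b"BUYER:alice") -> True
```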
- Step 104: If the optical tag is the target optical tag, control the drone to travel to it.
- The drone can fly accurately to the optical tag, for example, under the visual guidance of the optical tag.
- The drone can be stopped at a certain distance from the optical tag using existing ranging techniques, for example, tens of centimeters away, to avoid colliding with the optical tag.
- The drone can perform relative positioning and adjust its flight path based on the perspective distortion of the optical tag image it captures, so that it eventually stops in a certain direction relative to the optical tag, for example, directly in front of it.
- A shelf for receiving goods may be disposed directly in front of the optical tag, and the drone can easily deliver the goods onto the shelf.
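- Putting Steps 101-104 together, the overall flow can be sketched as below, re-using the `is_target_tag` helper sketched above; every method on `drone` is a placeholder for a capability described in the text, not an API defined by the patent.

```python
def deliver(drone, shipping_address, expected_info):
    drone.fly_to(shipping_address)                        # Step 101: GPS to the vicinity
    while True:
        for tag in drone.visible_tags():                  # Step 102: collect and identify
            payload = drone.decode_tag(tag)               #   rolling-shutter decoding
            if is_target_tag(payload, expected_info):     # Step 103: target check
                drone.approach(tag, stop_distance_m=0.3)  # Step 104: visual homing
                drone.release_goods()
                return
        drone.reposition()                                # keep searching nearby tags
```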
- In one embodiment, the drone can identify other optical tags in its vicinity in a manner similar to the process described above, which will not be repeated here.
- the drone can determine the relative positional relationship between the optical tag and the target optical tag.
- The drone can determine its relative positional relationship with an identified optical tag by relative positioning, identify the optical tag based on the information it transmits (e.g., obtain the identification information of the optical tag), and obtain the relative positional relationship between that optical tag and the target optical tag (the relative positional relationships between optical tags can, for example, be pre-stored and obtained by the drone), thereby determining the relative positional relationship between the drone and the target optical tag.
- the drone can fly to the vicinity of the target light tag using the relative positional relationship and optionally other navigation information (eg, GPS information).
- The drone delivery scheme of the present invention guided by the optical tag is not limited to optical tags disposed at the buyer's apartment (for example, on the balcony or exterior wall of the apartment); it is obviously also suitable for optical tags arranged in more open areas, for example, optical tags placed in a courtyard.
- If the buyer does not have an optical tag of his own, or wishes the goods to be delivered to a location where another optical tag is located (for example, a public optical tag in a square or park, or an optical tag at a friend's home), the buyer can provide the online shopping platform with the relevant information of the optical tag at the delivery address (i.e., the target optical tag), e.g., its ID information, geographic location information, and so on.
- The online shopping platform can inform the drone of the corresponding information, and after flying to the vicinity of the target optical tag, the drone can recognize the information transmitted by nearby optical tags (for example, the ID information they transmit) and finally determine the target optical tag.
- The drone delivery scheme guided by the optical tag of the present invention can be applied not only to optical tags having a fixed position, but also to non-fixed optical tags (for example, an optical tag that can be carried by a person).
- The optical tag may be configured to transmit predetermined information, which may be, for example, the ID information of the optical tag itself, the buyer's ID information on the online shopping platform, or a verification code the buyer receives from the platform after making a purchase, and so on, as long as the predetermined information is known to the online shopping platform and can be used to identify the buyer or the purchased goods.
- After the drone arrives near the buyer's location, it can identify the information transmitted by nearby optical tags and finally determine the target optical tag (i.e., the optical tag carried by the buyer) to complete the delivery of the goods.
- The online shopping platform can inform the buyer of the drone's estimated arrival time so that the buyer can move freely in the meantime, as long as the buyer returns to the vicinity of the previous location by the expected arrival time.
- Alternatively, the buyer may not return to the previous location but instead send a new location to the online shopping platform, which can notify the drone of the new location so that the drone can fly to its vicinity.
- The buyer may also set the goods delivery address to an address where the buyer expects to be at a certain time and instruct the online shopping platform to ship the goods to that address at that time.
- The drone delivery application for online shopping is taken as an example above, but it can be understood that drone guidance through optical tags is not limited to this application; it can be used in any application requiring precise positioning of the drone, such as automatic charging of drones, automatic mooring of drones, drone route navigation, and so on.
- the optical tag-based guidance of the present invention is not only applicable to the drone, but can also be applied to other types of autonomously movable machines, such as driverless cars, robots, and the like.
- a CMOS camera can be mounted on a driverless car or robot and can interact with optical tags in a similar manner to drones.
- In some cases, a portion of the autonomously movable machine is movable while another portion is fixed.
- For example, the autonomously movable machine may be a machine having a fixed position on a production line or in a warehouse; the body of the machine is fixed in most cases but has one or more movable mechanical arms.
- a CMOS camera can be mounted on a fixed portion of the machine for determining the position of the optical tag so that the movable portion of the machine (eg, a robotic arm) can be directed to the position of the optical tag.
- the CMOS camera can also be mounted on a movable portion of the machine, for example, on each robotic arm.
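- As an illustrative sketch of the fixed-camera variant (the calibration transform, frame conventions, and `move_to` arm command are our assumptions, not part of the patent): the camera locates the optical tag in its own frame, and the position is transformed into the arm-base frame before commanding the arm.

```python
import numpy as np

# Fixed, calibrated pose of the camera in the arm-base frame (homogeneous 4x4).
T_BASE_FROM_CAMERA = np.eye(4)
T_BASE_FROM_CAMERA[:3, 3] = [0.20, 0.00, 0.50]  # example mount offset in meters

def direct_arm_to_tag(arm, tag_pos_in_camera: np.ndarray) -> None:
    """Transform the tag position into the arm-base frame and command the arm."""
    p = np.append(tag_pos_in_camera, 1.0)       # homogeneous coordinates
    arm.move_to((T_BASE_FROM_CAMERA @ p)[:3])   # placeholder arm API
```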
- Appearances of the phrases "in the various embodiments", "in some embodiments", "in one embodiment", or "in an embodiment" are not necessarily referring to the same embodiment.
- the particular features, structures, or properties may be combined in any suitable manner in one or more embodiments.
- The particular features, structures, or properties shown or described in connection with one embodiment may be combined, in whole or in part, with the features, structures, or properties of one or more other embodiments without limitation, as long as the combination is not illogical or inoperative.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Electromagnetism (AREA)
- Toxicology (AREA)
- General Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Optical Communication System (AREA)
Abstract
A system for guiding an autonomous machine is provided. The system comprises: an autonomously movable machine on which a rolling-shutter camera is mounted; and an optical communication device comprising a light source, the light source being configured to operate in at least two modes, the modes including a first mode and a second mode. In the first mode, a light source control signal having a first frequency controls an attribute of the light emitted by the light source so that it changes continuously at the first frequency, such that stripes appear in an image of the light source acquired when the rolling-shutter camera photographs the light source; in the second mode, no stripes, or stripes different from those of the first mode, appear in an image of the light source acquired when the rolling-shutter camera photographs the light source.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810435227.7A CN110471402A (zh) | 2018-05-09 | 2018-05-09 | 对能够自主移动的机器进行导引的系统和方法 |
CN201810435227.7 | 2018-05-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019214642A1 true WO2019214642A1 (fr) | 2019-11-14 |
Family
ID=68468461
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/085998 WO2019214642A1 (fr) | 2018-05-09 | 2019-05-08 | Système et procédé de guidage d'une machine autonome |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN110471402A (fr) |
TW (1) | TWI702805B (fr) |
WO (1) | WO2019214642A1 (fr) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI738271B (zh) * | 2020-03-30 | 2021-09-01 | 新能量科技股份有限公司 | 自走設備的自動導引方法 |
US11553824B2 (en) | 2020-06-25 | 2023-01-17 | Power Logic Tech, Inc. | Automatic guiding method for self-propelled apparatus |
WO2023005301A1 (fr) * | 2021-07-28 | 2023-02-02 | 广东奥普特科技股份有限公司 | Dispositif de guidage intelligent pour chariot élévateur à fourche agv, procédé de guidage intelligent pour chariot élévateur à fourche agv et système de guidage intelligent pour chariot élévateur à fourche agv |
CN113607158A (zh) * | 2021-08-05 | 2021-11-05 | 中铁工程装备集团有限公司 | 基于可见光通信的平板光源视觉识别匹配定位方法及系统 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060113386A1 (en) * | 2004-12-01 | 2006-06-01 | Psc Scanning, Inc. | Illumination pulsing method for a data reader |
CN102324013A (zh) * | 2005-03-11 | 2012-01-18 | 手持产品公司 | 具有全局电子快门控制的条形码读取装置 |
US8150255B2 (en) * | 2010-06-25 | 2012-04-03 | Apple Inc. | Flash control for electronic rolling shutter |
CN104395910A (zh) * | 2012-03-23 | 2015-03-04 | Opto电子有限公司 | 能够产生包括连续低强度水平照明成分和一个或多个脉冲式高强度水平照明成分的照明的图像读取器件 |
WO2016031359A1 (fr) * | 2014-08-29 | 2016-03-03 | ソニー株式会社 | Dispositif de commande, procédé de commande, et programme |
WO2017111201A1 (fr) * | 2015-12-24 | 2017-06-29 | 엘지전자 주식회사 | Appareil d'affichage d'image nocturne et son procédé de traitement d'image |
CN107370913A (zh) * | 2016-05-11 | 2017-11-21 | 松下知识产权经营株式会社 | 摄像装置、摄像系统以及光检测方法 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7739034B2 (en) * | 2007-04-17 | 2010-06-15 | Itt Manufacturing Enterprises, Inc. | Landmark navigation for vehicles using blinking optical beacons |
EP2910019A4 (fr) * | 2012-10-19 | 2016-08-24 | Daniel Ryan | Procédé d'authentification unidirectionnelle d'auto-identification utilisant des signaux optiques |
CN203574655U (zh) * | 2013-04-09 | 2014-04-30 | 北京半导体照明科技促进中心 | 利用可见光传输信息的装置和系统以及光源 |
CN104661000B (zh) * | 2015-03-17 | 2018-01-09 | 珠海横琴华策光通信科技有限公司 | 定位系统、基于定位空间的图像来进行定位的方法及设备 |
CN105515657B (zh) * | 2015-11-19 | 2018-01-02 | 广东顺德中山大学卡内基梅隆大学国际联合研究院 | 一种采用led灯具mimo阵列架构的可见光摄像机通信系统 |
WO2018069867A1 (fr) * | 2016-10-13 | 2018-04-19 | Six Degrees Space Ltd | Procédé et appareil de positionnement intérieur |
- 2018
  - 2018-05-09: CN application CN201810435227.7A filed (publication CN110471402A, status: active, Pending)
- 2019
  - 2019-05-08: PCT application PCT/CN2019/085998 filed (publication WO2019214642A1, status: active, Application Filing)
  - 2019-05-09: TW application TW108116064 filed (publication TWI702805B, status: active)
Also Published As
Publication number | Publication date |
---|---|
CN110471402A (zh) | 2019-11-19 |
TWI702805B (zh) | 2020-08-21 |
TW201947893A (zh) | 2019-12-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19799188; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 19799188; Country of ref document: EP; Kind code of ref document: A1 |