CN104615132A - Autonomous mobile carrier and automatic following system - Google Patents


Info

Publication number
CN104615132A
CN104615132A
Authority
CN
China
Prior art keywords
image
light source
mobile
module
sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310541245.0A
Other languages
Chinese (zh)
Other versions
CN104615132B (en)
Inventor
梁家钧
高铭璨
柯怡贤
陈信嘉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to CN201310541245.0A priority Critical patent/CN104615132B/en
Priority to CN201611018073.9A priority patent/CN106933225B/en
Priority to US14/450,377 priority patent/US9599988B2/en
Publication of CN104615132A publication Critical patent/CN104615132A/en
Application granted
Publication of CN104615132B publication Critical patent/CN104615132B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Studio Devices (AREA)

Abstract

Provided are a mobile carrier and an automatic following system using the mobile carrier. The mobile carrier can acquire the image of a guide light source by image sensing technology and track the guide light source according to its imaging characteristics, so that the mobile carrier can actively follow the guide light source. The mobile carrier can further comprise a mobile light source: when the guide light source cannot be acquired by the mobile carrier, the image of the mobile light source can be acquired by a remote image sensing device, and a control signal can be provided according to the imaging characteristics of the mobile light source to guide the autonomous mobile carrier.

Description

Autonomous mobile vehicle and automatic following system
Technical field
The present invention relates to an autonomous mobile vehicle and an automatic following system, and more particularly to an automatic following system based on image detection and tracking technology.
Background technology
Automatic following systems for mobile vehicles are currently applied on golf courses, where a mobile vehicle transports golf clubs and, using the automatic following function, can automatically accompany the player as he or she moves.
Conventional autonomous mobile vehicles and their automatic following systems mainly rely on GPS (Global Positioning System) to provide the player's instantaneous coordinate position, so that the mobile vehicle can navigate to the player's vicinity according to that position.
However, although GPS provides an instantaneous coordinate position, that position is not very accurate; moreover, GPS components are relatively expensive and consume considerable power, so GPS cannot fully meet users' needs even in outdoor automatic following systems. In addition, GPS works only outdoors and cannot provide an automatic following function indoors.
Summary of the invention
The invention provides a mobile vehicle that can automatically track and follow a guiding device.
The invention further provides a mobile vehicle that offers the user a manual operation mode to control the mobile vehicle when it cannot automatically track and follow a guiding device.
One embodiment of the invention proposes an autonomous mobile vehicle for tracking a guiding light source. The autonomous mobile vehicle has a central axis and comprises: a sensing module having a first image sensing unit and a second image sensing unit, wherein the first image sensing unit is located on a first side of the central axis and senses the guiding light source to produce at least one first image, and the second image sensing unit is located on a second side of the central axis and senses the guiding light source to produce at least one second image; and a mobile module having a drive unit, which controls the steering of the advancing direction of the mobile module and/or the speed and duration of advance according to the imaging positions of the guiding light source in the first image and the second image.
Another embodiment of the invention proposes an autonomous mobile vehicle for tracking a guiding light source, comprising: a sensing module having at least one first image sensing unit and an auxiliary image sensing unit, wherein the first image sensing unit has a non-wide-angle lens for sensing the guiding light source and producing at least one non-wide-angle image, and the auxiliary image sensing unit has a wide-angle lens for sensing the guiding light source and producing at least one wide-angle image; and a mobile module having a drive unit, which either controls the steering of the advancing direction of the mobile module according to the imaging position of the guiding light source in the wide-angle image, or controls the steering of the advancing direction and the speed and duration of advance according to the imaging position of the guiding light source in the non-wide-angle image.
Another embodiment of the invention proposes an automatic following system comprising: a mobile vehicle having at least one mobile light source and a mobile module, wherein the mobile module receives a pilot signal through a drive unit to control steering and/or the speed and duration of advance; and a guiding device having a sensing module, wherein the sensing module comprises at least one guiding image sensing unit for sensing the mobile light source and producing at least one guiding image, and produces the pilot signal according to the imaging position feature of the mobile light source in the guiding image.
Another embodiment of the invention proposes an automatic following system comprising: a guiding device, which comprises a guiding light source and a sensing module having at least one guiding image sensing unit for sensing a mobile light source and producing at least one guiding image, the guiding device producing a pilot signal according to the imaging position feature of the mobile light source in the guiding image; and a mobile vehicle, which comprises the mobile light source, a sensing module having a first image sensing unit and a second image sensing unit (the first image sensing unit being located on a first side of the vehicle's central axis to sense the guiding light source and produce at least one first image, and the second image sensing unit being located on a second side of the central axis to sense the guiding light source and produce at least one second image), and a mobile module having a drive unit that controls the steering of the advancing direction and/or the speed and duration of advance according to the pilot signal or the imaging positions of the guiding light source in the first and second images; wherein, when neither the first image nor the second image contains an imaging of the guiding light source, the drive unit receives the pilot signal to control the steering of the advancing direction and/or the speed and duration of advance.
The mobile vehicle of the present invention acquires the image of the guiding light source by image sensing technology and tracks the guiding light source according to its imaging features, so that the mobile vehicle can actively follow the guiding light source. The mobile vehicle may further be equipped with a mobile light source: when the guiding light source cannot be captured by the mobile vehicle, a remote image sensing device can acquire the image of the mobile light source and provide a control signal according to the imaging features of the mobile light source to guide the autonomous mobile vehicle.
For a further understanding of the techniques of the present invention, please refer to the following detailed description and drawings, from which the features of the invention can be concretely understood. The accompanying drawings, however, are provided for reference and illustration only and are not intended to limit the invention.
Accompanying drawing explanation
Fig. 1 is a schematic diagram of the automatic following system of one embodiment of the invention;
Fig. 2 is a schematic diagram of the automatic following system of another embodiment of the invention, adapted for remote manual control;
Fig. 2A is a schematic diagram of an image acquired by the sensing module of one embodiment of the invention under one operational scenario;
Fig. 3 is a schematic diagram of images acquired by the sensing module of one embodiment of the invention under one operational scenario;
Fig. 4A is a schematic diagram of images acquired by the sensing module of one embodiment of the invention under another operational scenario;
Fig. 4B is a schematic diagram of images acquired by the sensing module of one embodiment of the invention under another operational scenario;
Fig. 5A is a schematic diagram of an image acquired by the sensing module of one embodiment of the invention under another operational scenario; and
Fig. 5B is a schematic diagram of an image acquired by the sensing module of one embodiment of the invention under another operational scenario.
Description of reference numerals
100: target object
111-113: guiding light source
120: guiding device
121: guiding light source
122: image sensing unit
200: mobile vehicle
210: sensing module
211, 212: image sensing unit
220: mobile module
230: mobile light source
250, 260: image
251, 261: imaging
Embodiment
Fig. 1 is a schematic diagram of the automatic following system of one embodiment of the invention. The automatic following system comprises a target object 100 and a mobile vehicle 200, wherein the mobile vehicle 200 automatically tracks and follows the target object 100.
The target object 100 may be a movable object, for example a golfer in a golf-course application, or a child operating a game in a children's play system. The target object 100 carries at least one guiding light source for the mobile vehicle 200 to track and identify. Three guiding light sources 111-113 are shown in Fig. 1, but they merely illustrate several positions where guiding light sources may be mounted; in practice at least one guiding light source suffices, and the guiding light source 111 is taken as the example below.
The mobile vehicle 200 has a sensing module 210 and a mobile module 220. The sensing module 210 senses the guiding light source 111 and produces at least one sensed image, and the mobile module 220 controls the steering of advance and/or the speed and duration of advance according to the imaging features of the guiding light sources 111-113 in the images acquired by the sensing module 210.
The sensing module 210 has at least one image sensing unit, exemplified in the figures by the two image sensing units 211, 212. The image sensing units 211, 212 are preferably configured to detect a specific wavelength, for example by placing an infrared filter in front of the lens so that only infrared light is detected; in that case the guiding light sources 111-113 of the target object 100 must correspondingly be infrared light sources.
When the target object 100 is a person, the user may turn around while operating, so the same side does not always face the mobile vehicle 200; the guiding light sources 111-113 are therefore preferably ring-shaped, or at least able to emit a planar light. For example, a guiding light source 111-113 may comprise a point light source and a set of optical fibers (not shown), where the point light source is placed at the entrance face of the fibers so that the fibers light up when the point source is lit. By arranging the fibers around the waist (guiding light source 111), leg (guiding light source 112) or hand (guiding light source 113) of the user (target object 100), a ring-shaped light source is produced, so that the mobile vehicle 200 can detect the guiding light sources 111-113 effectively no matter which side of the user it faces. In another embodiment, the guiding light sources 111-113 may use a laser light source together with a semi-cylindrical lens (not shown): since the laser projects a linear beam onto the semi-cylindrical lens, a planar light is produced, and only a few guiding light sources at the waist, leg or hand are needed to produce a ring-shaped light source. In another embodiment, the guiding light source 111-113 may be an active light source together with a light guide (not shown), where the light guide is worn on the target object 100; the side of the light guide near the target object 100 is a reflective surface and the side away from the target object 100 is a translucent structure, so that the light of the active light source reflected by the reflective surface exits in the direction away from the target object 100. More specifically, there is no particular restriction as long as the guiding light sources 111-113 can form a ring-shaped light source.
To help the image sensing units 211, 212 identify them, the guiding light sources 111-113 may, in addition to a specific wavelength, have a specific flicker frequency, pattern or shape. The sensing module 210 can then identify the guiding light sources 111-113 from the imaged shape or pattern in the acquired images, or from the imaging frequency across consecutive images, and thereby determine the imaging positions of the guiding light sources 111-113.
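The flicker-frequency identification described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes the candidate region of the imaging is already known, and estimates the blink rate by counting on/off transitions of the region's mean brightness across consecutive frames.

```python
import numpy as np

def estimate_flicker_hz(frames, region, frame_rate):
    """Estimate the on/off frequency of a candidate light-source region.

    frames: iterable of 2-D grayscale arrays (consecutive images)
    region: (row_slice, col_slice) covering the candidate imaging
    frame_rate: capture rate in frames per second
    """
    # Mean brightness of the region in every frame.
    levels = np.array([f[region].mean() for f in frames])
    # Threshold halfway between the darkest and brightest observation.
    on = levels > (levels.min() + levels.max()) / 2.0
    # Each on->off or off->on transition is half a flicker period.
    toggles = np.count_nonzero(on[1:] != on[:-1])
    duration_s = len(levels) / frame_rate
    return toggles / (2.0 * duration_s)

def matches_guide_source(frames, region, frame_rate, expect_hz, tol_hz=1.0):
    """True if the region blinks at (roughly) the expected frequency."""
    return abs(estimate_flicker_hz(frames, region, frame_rate) - expect_hz) <= tol_hz
```

A region blinking at a known frequency can thus be separated from steady ambient light sources, which show no transitions at all.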
Besides moving and steering the mobile vehicle 200, the mobile module 220 may also steer the sensing module 210 itself through a steering structure, so that the image sensing units 211, 212 of the sensing module 210 can quickly capture the images of the guiding light sources 111-113. Of course, the mobile vehicle 200 may also provide other elements to steer the sensing module 210; in the present invention these are likewise regarded as part of the mobile module 220.
The mobile module 220 may have a drive unit (not shown) that receives the images acquired by the sensing module 210, or a control signal produced from the imaging features of those images, and controls the steering of advance and/or the speed and duration of advance accordingly.
Because the mobile vehicle 200 detects the guiding light sources 111-113 of the target object 100 outdoors through the image sensing units 211, 212, the units may occasionally fail temporarily due to ambient light: for example, the image sensing units 211, 212 may detect glare, or low battery power of the target object 100 may make the guiding light sources 111-113 too dim, so that the image sensing units 211, 212 cannot properly acquire the images of the guiding light sources 111-113.
Fig. 2 is a schematic diagram of the automatic following system of another embodiment of the invention, adapted for remote manual control. This embodiment mainly lets the user manually operate the mobile vehicle by remote control when the image sensing units 211, 212 cannot effectively capture the images of the guiding light sources 111-113.
The main difference between the embodiment of Fig. 2 and that of Fig. 1 is that the target object 100 has a guiding device 120. Besides a guiding light source 121, the guiding device 120 also has an image sensing unit 122; the guiding light source 121 and the image sensing unit 122 may be mounted on the same body or arranged independently.
Correspondingly, the mobile vehicle 200 is provided with a mobile light source 230, which may use the same optical features as the guiding light sources 111-113, such as a specific wavelength, flicker frequency, shape or pattern, so that the image sensing unit 122 can easily identify the mobile light source 230 in the acquired images. The mobile light source 230 preferably also emits ring-like light, so that the image sensing unit 122 of the target object 100 can detect the mobile light source 230 from any side of the mobile vehicle 200.
In this embodiment, the guiding device 120 produces a pilot signal that is supplied to the drive unit of the mobile module 220 to control the steering of advance and/or the speed and duration of advance. The pilot signal may be the image acquired by the guiding device 120, the imaging features of the mobile light source derived from the acquired image, or a steering command produced directly from those imaging features and then supplied to the drive unit of the mobile module 220 as the pilot signal.
Please refer to shown in Fig. 2 A, it shows the image 122I that the acquisition of this image sensing cell 122 comprises the light source image I230 of this action light source 230; Wherein, this image 122I such as has a central point 122C.Therefore, the graphics processing unit that this guiding device 120(is such as inner) then calculate vectorial D between the position (such as center or center of gravity) of this light source image I230 and this central point 122C using as this pilot signal.Must it should be noted that, the shape of the light source image I230 in this image 122I is only exemplary, and is not used to limit the present invention.
The embodiments of Fig. 1 and Fig. 2 can also be combined: the guiding device 120 is mounted on the target object 100 to provide the guiding light source 121 and the image sensing unit 122, while the mobile vehicle 200 has the sensing module 210, the mobile module 220 and the mobile light source 230.
Fig. 3 is a schematic diagram of the images acquired by the sensing module 210 of one embodiment of the invention under one operational scenario, where image sensing unit 211 acquires image 250 and image sensing unit 212 acquires image 260; image 250 contains the imaging 251 of the guiding light source 111 and image 260 contains the imaging 261 of the guiding light source 111. The dashed line L indicates the central axis between the image sensing units 211, 212. The image sensing units 211, 212 are usually mounted on the two sides of the mobile vehicle 200, so the arrow direction of the central axis also represents the straight-ahead direction of the mobile vehicle 200. In the following embodiments, the image sensing units 211, 212 are assumed to be mounted on the first side and the second side of the mobile vehicle 200, respectively. For ease of description, in images 250 and 260 the side near the dashed line L is called the inner side and the side away from the dashed line L the outer side.
When the mobile vehicle 200 directly faces the target object 100, the guiding light source 111 appears symmetrically in images 250 and 260; that is, the distance from imaging 251 to the side of image 250 near the dashed line L equals the distance from imaging 261 to the side of image 260 near the dashed line L.
Fig. 4A is a schematic diagram of the images acquired by the sensing module of one embodiment of the invention under another operational scenario, where image 250 contains no imaging of the guiding light source 111 while image 260 does. This scenario means the mobile vehicle 200 is not directly facing the guiding light source 111, which is instead biased toward the second side of the mobile vehicle 200; the mobile module 220 therefore drives the mobile vehicle 200 to turn a first angle toward the second side so that the mobile vehicle 200 can directly face the guiding light source 111.
In this scenario, the first angle turned by the mobile module 220 may be a fixed value: when image 250 contains no imaging of the guiding light source 111 and image 260 does, the mobile module 220 drives the mobile vehicle 200 to turn the first angle (e.g. 10 degrees) toward the second side; if after the turn image 250 still contains no imaging of the guiding light source 111, the vehicle turns the first angle toward the second side once again.
Alternatively, the first angle turned by the mobile module 220 may be variable, for example adjusted according to the imaging position of the imaging 261 of the guiding light source 111 in image 260. When the imaging 261 is closer to the outer side of image 260 relative to the dashed line L, the guiding light source 111 is farther from the straight-ahead direction of the mobile vehicle 200, so the mobile module 220 can set a larger first angle (e.g. 15 degrees); when the imaging 261 is closer to the inner side of image 260 near the dashed line L, the guiding light source 111 is closer to the straight-ahead direction, so the mobile module 220 can set a smaller first angle (e.g. 5 degrees).
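The variable first angle can be sketched as a simple interpolation. This is a hedged illustration, not the patented formula: the x position of imaging 261 is measured from the inner side (near axis L, x = 0) toward the outer side, and the angle is interpolated linearly between the 5-degree and 15-degree examples given in the text.

```python
def first_angle(imaging_x, image_width, min_deg=5.0, max_deg=15.0):
    """Variable "first angle" for the single-image case of Fig. 4A.

    imaging_x: column of imaging 261 in image 260, 0 = inner side,
               image_width - 1 = outer side (assumed convention).
    """
    outwardness = imaging_x / (image_width - 1)  # 0 = inner, 1 = outer
    return min_deg + outwardness * (max_deg - min_deg)
```

The closer imaging 261 sits to the outer side, the larger the returned turn angle, matching the behavior described above.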
Fig. 4B is a schematic diagram of the images acquired by the sensing module of one embodiment of the invention under another operational scenario, where image 250 contains the imaging 251 of the guiding light source 111 and image 260 also contains the imaging 261 of the guiding light source 111. Because the guiding light source 111 is biased toward the second side of the mobile vehicle 200, the imaging 251 on the first side is closer to the inner side of image 250 near the dashed line L, while the imaging 261 on the second side is closer to the outer side of image 260 away from the dashed line L.
In this scenario, the mobile module 220 may drive the mobile vehicle 200 to turn a second angle toward the second side, where the second angle may likewise be a fixed or variable value. Preferably the second angle is variable, and the mobile module 220 determines it according to the imaging positions of the imagings 251, 261 in images 250, 260. For example, when the imaging 261 is closer to the outer side of its image than the imaging 251 is, the mobile module 220 drives the mobile vehicle 200 to turn toward the second side.
The second angle may be determined according to the imaging position of imaging 251 or imaging 261; for example, when the mobile module 220 is to turn toward the first side, the second angle is determined by the imaging position of imaging 251 in image 250. The second angle may also be determined from the imaging positions of imagings 251 and 261 together, for example by comparing how close imaging 251 and imaging 261 each are to the outer side of their image and using the difference to decide the second angle. For example, when imaging 251 is 10 pixels from the outer side of image 250 and imaging 261 is 5 pixels from the outer side of image 260, the difference is 5 pixels toward the second side, so the mobile module 220 can set a larger second angle (e.g. 15 degrees); when imaging 251 is 10 pixels from the outer side of image 250 and imaging 261 is 8 pixels from the outer side of image 260, the difference is 2 pixels toward the second side, so the mobile module 220 can set a smaller second angle (e.g. 5 degrees).
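The two-image comparison can be sketched as follows. The per-pixel gain is an assumption of this sketch (3 degrees per pixel, which reproduces the 15-degree example for a 5-pixel difference); the patent text gives only the two example angle values and does not fix the mapping.

```python
def second_angle(dist_outer_251, dist_outer_261, deg_per_pixel=3.0):
    """Variable "second angle" for the two-image case of Fig. 4B.

    Inputs are the pixel distances of imagings 251 and 261 from the
    outer side of their respective images. The sign of the difference
    picks the turn side; its magnitude scales the angle.
    """
    diff = dist_outer_251 - dist_outer_261  # > 0: turn toward second side
    side = "second" if diff > 0 else "first"
    return side, abs(diff) * deg_per_pixel
```

With this gain, the 5-pixel and 2-pixel differences of the text yield 15-degree and 6-degree turns toward the second side, respectively.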
In other operational scenarios, neither image 250 nor image 260 may contain the imaging of the guiding light source 111. The drive unit of the mobile module 220 can then enter a seek mode: the mobile module 220 drives the mobile vehicle 200 to turn clockwise or counterclockwise, or directly drives the sensing module 210 to turn clockwise or counterclockwise, in order to search for the guiding light source 111. In seek mode, the drive unit may leave seek mode as soon as either one of the image sensing units 211, 212 detects the guiding light source 111, or only after both image sensing units 211, 212 detect it.
In one embodiment, the sensing module 210 can turn clockwise or counterclockwise relative to the mobile module 220 only in seek mode; while advancing (not in seek mode), the sensing module 210 cannot turn relative to the mobile module 220, so that the traveling direction of the mobile vehicle 200 is controlled correctly.
In another embodiment, the mobile module 220 may decide the steering angle from a single image captured by the sensing module 210; that is, the sensing module 210 comprises only a single image sensing unit 211 or 212. For example, Fig. 5A shows an image 270 captured by the image sensing unit, where the image 270 has, for example, a center line L' (pre-recorded in a storage unit). When the mobile vehicle 200 directly faces the target object 100, the imaging 271 of the guiding light source 111 in the image 270 is symmetric about the center line L', as shown in Fig. 5A. When the mobile vehicle 200 does not directly face the guiding light source 111, the imaging 271 of the guiding light source 111 in the image 270 is offset from the center line L', as shown in Fig. 5B. The drive unit of the mobile vehicle 200 then drives the mobile vehicle 200 to turn an angle so that the mobile vehicle 200 again faces the guiding light source 111. Note that although Fig. 5B shows the imaging 271 offset to the left of the center line L', the imaging 271 may also be offset to the right of the center line L' when the relative position of the mobile vehicle 200 and the guiding light source 111 differs.
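The single-sensor decision of Figs. 5A/5B can be sketched as a centroid-versus-centerline comparison. This is a minimal sketch with assumed conventions (the center line L' is taken as the middle image column, and a small dead band stands in for the symmetric case of Fig. 5A); the returned labels are illustrative names, not the patent's terms.

```python
import numpy as np

def single_image_steer(image, threshold=128, deadband_px=2):
    """Steering decision from a single image 270.

    Returns "left"/"right" when imaging 271 is offset from the center
    line L', "straight" when roughly symmetric, "seek" when not found.
    """
    cols = np.nonzero(image > threshold)[1]
    if cols.size == 0:
        return "seek"                        # imaging 271 not found
    center_col = (image.shape[1] - 1) / 2.0  # center line L'
    offset = cols.mean() - center_col
    if abs(offset) <= deadband_px:
        return "straight"
    return "right" if offset > 0 else "left"
```

The "seek" return value also connects this embodiment to the seek mode described earlier, which engages when no imaging is found at all.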
In one operational scenario, when the imaging size or image brightness of the imaging 251 or 261 of the guiding light source 111 detected in image 250 or image 260 exceeds a first threshold, the mobile vehicle 200 is judged to be near the target object 100, and its moving speed can be reduced. When the imaging size or image brightness of imaging 251 or 261 further exceeds a second threshold, the mobile vehicle 200 is judged to have arrived at the periphery of the target object 100, and the drive unit can enter a park mode. In park mode, the mobile vehicle 200 may stop immediately, or steer its advancing direction by a predetermined angle and then move a predetermined distance or time before stopping; the mobile vehicle can thus park beside the target object 100, so that the user of the target object 100 can conveniently take articles carried by the mobile vehicle 200.
In another embodiment, when the imaging size or image brightness of the imaging 251 or 261 of the guiding light source 111 detected in image 250 or image 260 exceeds a first threshold, the mobile vehicle 200 is judged to be near the target object 100 and stops. The mobile vehicle 200 then judges from the detected imaging size or image brightness whether the target object 100 has remained static (size or brightness unchanged) for longer than a preset time; if so, the mobile vehicle 200 moves toward the target object 100 again until the imaging size or image brightness of imaging 251 or 261 exceeds a second threshold, at which point it is judged to have arrived at the periphery of the target object 100 and stops, so that the user of the target object 100 can take articles carried by the mobile vehicle 200. In the present invention, the mobile vehicle 200 can be set never to move away from the target object 100 on its own.
Alternatively, the drive unit of the mobile module 220 may ignore the first threshold entirely, that is, directly compare the imaging size or image brightness of the imaging 251 or the imaging 261 with the second threshold.
In addition, in order to keep the mobile vehicle 200 always on the same side of the target object 100, for example behind it, the guiding light sources 111-113 can be designed with different characteristics in different directions; for example, the half facing the mobile vehicle 200 uses a first glow frequency and the other half uses a second glow frequency, i.e. the aforementioned optical fiber, light guide, or lens is illuminated with light of different characteristics. Accordingly, the mobile vehicle 200 has a mechanism for distinguishing the first glow frequency from the second glow frequency, and always advances in the direction that makes the majority of the acquired imaging, for example more than 80%, correspond to the first glow frequency or the second glow frequency, so as to maintain the relative spatial relationship between the target object 100 and the mobile vehicle 200.
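The majority rule above can be sketched as a function that counts which glow frequency dominates the detected spots. The function name, the frequency labels, and the return values are hypothetical; the 0.8 ratio follows the "more than 80%" example in the paragraph.

```python
# Illustrative sketch of the same-side keeping rule: the guiding light source
# emits a first glow frequency on one half and a second on the other, and the
# vehicle advances only while most detected spots belong to the half it
# should face.

def keep_same_side(spot_frequencies: list, desired: str, ratio: float = 0.8) -> str:
    """Return a steering hint from the frequency labels of the imaged spots."""
    if not spot_frequencies:
        return "search"          # no guiding light source imaged at all
    share = spot_frequencies.count(desired) / len(spot_frequencies)
    # Advance while the desired half dominates; otherwise realign around
    # the target until it does.
    return "advance" if share >= ratio else "realign"
```

In practice the frequency label of each spot would be recovered by comparing its flicker across consecutive images, consistent with the frequency-based identification mentioned in the claims.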
In summary, the mobile vehicle and the automatic following system disclosed by the present invention use an image sensing unit to detect the imaging of a guiding light source, and control the movement of the mobile vehicle according to the imaging features (position, brightness, size, etc.) so as to achieve automatic following. When the guiding light source cannot be detected, or when the user has other demands, the user can also use a guiding image sensing unit to detect the imaging of a mobile light source configured on the mobile vehicle, and control the movement of the mobile vehicle by transmitting a control signal to it according to those imaging features.
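The basic steering decision summarised above, using the two image sensing units on either side of the vehicle axis, can be sketched as follows. This is an illustrative sketch only; the function name and return labels are assumptions, and `None` stands in for "no imaging of the guiding light source in that image".

```python
# Illustrative sketch of the dual-sensor steering rule: the side whose image
# contains the guiding light source indicates the direction to turn, and
# absence from both images triggers the seek mode.

def steer(first_image_x, second_image_x):
    """Take the x-positions of the guiding light source imaging in the first
    and second images (None if absent) and return a steering hint."""
    if first_image_x is None and second_image_x is None:
        return "seek"              # no imaging in either image: seek mode
    if second_image_x is None:
        return "turn_first_side"   # imaging only in the first image
    if first_image_x is None:
        return "turn_second_side"  # imaging only in the second image
    return "forward"               # imaging in both images: keep heading
```

A fuller version would also compare the two positions for symmetry about the axis, as claim 4 describes, and scale the steering angle with how far the imaging sits toward the outer side of the image.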
The foregoing descriptions are merely embodiments of the present invention; all equivalent changes and modifications made in accordance with the claims of the present invention shall fall within the scope of the present invention.

Claims (26)

1. An autonomous mobile vehicle for tracking a guiding light source, the autonomous mobile vehicle having an axis and comprising:
a sensing module having a first image sensing unit and a second image sensing unit, wherein the first image sensing unit is located on a first side of the axis and is configured to sense the guiding light source and produce at least one first image, and the second image sensing unit is located on a second side of the axis and is configured to sense the guiding light source and produce at least one second image; and
a mobile module having a drive unit, the drive unit being configured to control the steering of the heading of the mobile module and/or the speed and duration of its advance according to the imaging positions of the guiding light source in the first image and the second image.
2. The autonomous mobile vehicle according to claim 1, wherein when the first image contains the imaging of the guiding light source and the second image does not, the drive unit steers the heading of the mobile module toward the first side.
3. The autonomous mobile vehicle according to claim 2, wherein the value of the steering angle is determined according to the imaging position of the guiding light source in the first image, the steering angle being increased the further the imaging position deviates toward the outer side of the first image relative to the axis.
4. The autonomous mobile vehicle according to claim 1, wherein when the imaging positions of the guiding light source in the first image and the second image are not symmetric, and the imaging position in the first image is closer to the outer side relative to the axis than the imaging position in the second image, the drive unit steers the heading of the mobile module toward the first side.
5. The autonomous mobile vehicle according to claim 1, wherein when the imaging size of the guiding light source in the first image or the second image is greater than a first threshold, the moving speed of the mobile module is reduced.
6. The autonomous mobile vehicle according to claim 1, wherein when the imaging size of the guiding light source in the first image or the second image is greater than a second threshold, the mobile module enters a park mode.
7. The autonomous mobile vehicle according to claim 6, wherein upon entering the park mode, the mobile module stops moving immediately.
8. The autonomous mobile vehicle according to claim 6, wherein upon entering the park mode, the mobile module turns its heading by a predetermined angle and stops moving after moving a predetermined distance or for a predetermined time.
9. The autonomous mobile vehicle according to claim 1, wherein when neither the first image nor the second image contains the imaging of the guiding light source, the mobile module enters a seek mode.
10. The autonomous mobile vehicle according to claim 9, wherein upon entering the seek mode, the mobile module drives the autonomous mobile vehicle to turn clockwise or counterclockwise, or drives the sensing module via a steering structure to turn clockwise or counterclockwise, and leaves the seek mode when the first image or the second image acquires the imaging of the guiding light source.
11. The autonomous mobile vehicle according to claim 1, wherein the sensing module identifies the imaging position of the guiding light source according to the imaged shape or pattern in the acquired images, or the imaging frequency across consecutive images.
12. An autonomous mobile vehicle for tracking a guiding light source, the autonomous mobile vehicle comprising:
a sensing module having at least one first image sensing unit and an auxiliary image sensing unit, wherein the first image sensing unit has a non-wide-angle lens configured to sense the guiding light source and produce at least one non-wide-angle image, and the auxiliary image sensing unit has a wide-angle lens configured to sense the guiding light source and produce at least one wide-angle image; and
a mobile module having a drive unit, the drive unit being configured to control the steering of the heading of the mobile module according to the imaging position of the guiding light source in the wide-angle image, or to control the steering of the heading of the mobile module and the speed and duration of its advance according to the imaging position of the guiding light source in the non-wide-angle image.
13. The autonomous mobile vehicle according to claim 12, wherein when the non-wide-angle image does not contain the imaging of the guiding light source, the drive unit controls the steering of the heading of the mobile module according to the imaging position of the guiding light source in the wide-angle image.
14. The autonomous mobile vehicle according to claim 12, wherein when the imaging size of the guiding light source in the non-wide-angle image is greater than a first threshold, the moving speed of the mobile module is reduced.
15. The autonomous mobile vehicle according to claim 12, wherein when the imaging size of the guiding light source in the non-wide-angle image is greater than a second threshold, the mobile module enters a park mode.
16. The autonomous mobile vehicle according to claim 15, wherein upon entering the park mode, the mobile module stops moving.
17. The autonomous mobile vehicle according to claim 15, wherein upon entering the park mode, the mobile module turns its heading by a predetermined angle and stops moving after moving a predetermined distance or for a predetermined time.
18. The autonomous mobile vehicle according to claim 12, wherein when neither the non-wide-angle image nor the wide-angle image contains the imaging of the guiding light source, the mobile module enters a seek mode.
19. The autonomous mobile vehicle according to claim 18, wherein upon entering the seek mode, the mobile module drives the autonomous mobile vehicle to turn clockwise or counterclockwise, or drives the sensing module via a steering structure to turn clockwise or counterclockwise, and leaves the seek mode when the wide-angle image or the non-wide-angle image acquires the imaging of the guiding light source.
20. The autonomous mobile vehicle according to claim 12, wherein the sensing module identifies the imaging position of the guiding light source according to the imaged shape or pattern in the acquired images, or the imaging frequency across consecutive images.
21. An automatic following system, comprising:
a mobile vehicle having at least one mobile light source and a mobile module, the mobile module receiving a control signal via a drive unit to control the steering of its heading and/or the speed and duration of its advance; and
a guiding device having a sensing module, the sensing module comprising at least one guiding image sensing unit configured to sense the mobile light source and produce at least one guiding image, the guiding device producing the control signal according to the imaging features of the mobile light source in the guiding image.
22. The automatic following system according to claim 21, wherein the guiding device produces the control signal according to the imaging size of the mobile light source in the guiding image, so as to control the advancing speed and duration of the mobile module.
23. The automatic following system according to claim 21, wherein the guiding device produces the control signal according to the imaging position of the mobile light source in the guiding image, so as to control the steering of the heading of the mobile module.
24. An automatic following system, comprising:
a guiding device, the guiding device comprising:
a guiding light source; and
a sensing module having at least one guiding image sensing unit, the guiding image sensing unit being configured to sense a mobile light source and produce at least one guiding image, and to produce a guiding signal according to the imaging features of the mobile light source in the guiding image; and
a mobile vehicle having an axis, the mobile vehicle comprising:
the mobile light source;
a sensing module having a first image sensing unit and a second image sensing unit, wherein the first image sensing unit is located on a first side of the axis and is configured to sense the guiding light source and produce at least one first image, and the second image sensing unit is located on a second side of the axis and is configured to sense the guiding light source and produce at least one second image; and
a mobile module having a drive unit, the drive unit being configured to control the steering of the heading of the mobile module and/or the speed and duration of its advance according to the guiding signal or the imaging positions of the guiding light source in the first image and the second image;
wherein, when neither the first image nor the second image contains the imaging of the guiding light source, the drive unit receives the guiding signal to control the steering of the heading of the mobile module and/or the speed and duration of its advance.
25. The automatic following system according to claim 24, wherein the guiding light source comprises a point light source and an optical fiber, the point light source being disposed at the incident surface of the optical fiber so as to light the optical fiber, and wherein the guiding light source is adapted to be worn on the waist or leg of a user and produces ring-shaped light.
26. The automatic following system according to claim 24, wherein the guiding light source comprises a laser light source producing planar light and a semi-cylindrical lens.
CN201310541245.0A 2013-10-28 2013-11-04 Autonomous type mobile vehicle and automatic following system Active CN104615132B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201310541245.0A CN104615132B (en) 2013-11-04 2013-11-04 Autonomous type mobile vehicle and automatic following system
CN201611018073.9A CN106933225B (en) 2013-11-04 2013-11-04 Automatic following system
US14/450,377 US9599988B2 (en) 2013-10-28 2014-08-04 Adapted mobile carrier and auto following system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310541245.0A CN104615132B (en) 2013-11-04 2013-11-04 Autonomous type mobile vehicle and automatic following system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201611018073.9A Division CN106933225B (en) 2013-11-04 2013-11-04 Automatic following system

Publications (2)

Publication Number Publication Date
CN104615132A true CN104615132A (en) 2015-05-13
CN104615132B CN104615132B (en) 2017-10-20

Family

ID=53149631

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201611018073.9A Active CN106933225B (en) 2013-11-04 2013-11-04 Automatic following system
CN201310541245.0A Active CN104615132B (en) 2013-10-28 2013-11-04 Autonomous type mobile vehicle and automatic following system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201611018073.9A Active CN106933225B (en) 2013-11-04 2013-11-04 Automatic following system

Country Status (1)

Country Link
CN (2) CN106933225B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107168343B (en) * 2017-07-14 2020-09-15 灵动科技(北京)有限公司 Control method of luggage case and luggage case

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003266350A (en) * 2002-03-18 2003-09-24 Sony Corp Robot device and internal condition expressing device
CN1798958A (en) * 2003-05-29 2006-07-05 奥林巴斯株式会社 Stereo optical module and stereo camera
CN101590323A (en) * 2009-07-08 2009-12-02 北京工业大学 A kind of one-wheel robot system and control method thereof
CN102596517A (en) * 2009-07-28 2012-07-18 悠进机器人股份公司 Control method for localization and navigation of mobile robot and mobile robot using same
CN102788591A (en) * 2012-08-07 2012-11-21 郭磊 Visual information-based robot line-walking navigation method along guide line
CN103048995A (en) * 2011-10-13 2013-04-17 中国科学院合肥物质科学研究院 Wide-angle binocular vision identifying and positioning device for service robot
CN103170980A (en) * 2013-03-11 2013-06-26 常州铭赛机器人科技有限公司 Positioning system and positioning method for household service robot
CN103176606A (en) * 2013-04-15 2013-06-26 北京唯创视界科技有限公司 Plane interaction system and method based on binocular vision recognition

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7854108B2 (en) * 2003-12-12 2010-12-21 Vision Robotics Corporation Agricultural robot system and method
JP2007152472A (en) * 2005-12-02 2007-06-21 Victor Co Of Japan Ltd Charging system, charging station and robot guidance system
CN102411368B (en) * 2011-07-22 2013-10-09 北京大学 Active vision human face tracking method and tracking system of robot
CN102436261B (en) * 2011-12-05 2014-04-30 北京航空航天大学 Butt joint positioning and navigation strategy for robot based on single camera and light-emitting diode (LED)

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Lan: "Research on Navigation Methods for Mobile Robots Based on Stereo Vision", Docin (豆丁网) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106483957A (en) * 2016-10-09 2017-03-08 乐视控股(北京)有限公司 The method and apparatus that control perambulator follows target object
CN107273850A (en) * 2017-06-15 2017-10-20 上海工程技术大学 A kind of autonomous follower method based on mobile robot
CN112367887A (en) * 2018-05-04 2021-02-12 Lg电子株式会社 Multiple robot cleaner and control method thereof
CN112367888A (en) * 2018-05-04 2021-02-12 Lg电子株式会社 Multiple robot cleaner and control method thereof
US11150668B2 (en) 2018-05-04 2021-10-19 Lg Electronics Inc. Plurality of robot cleaner and a controlling method for the same
US11148290B2 (en) 2018-05-04 2021-10-19 Lg Electronics Inc. Plurality of robot cleaner and a controlling method for the same
CN112367888B (en) * 2018-05-04 2022-04-08 Lg电子株式会社 Multiple robot cleaner and control method thereof
CN112367887B (en) * 2018-05-04 2022-11-25 Lg电子株式会社 Multiple robot cleaner and control method thereof
US11934200B2 (en) 2018-05-04 2024-03-19 Lg Electronics Inc. Plurality of robot cleaner and a controlling method for the same

Also Published As

Publication number Publication date
CN106933225B (en) 2020-05-12
CN104615132B (en) 2017-10-20
CN106933225A (en) 2017-07-07

Similar Documents

Publication Publication Date Title
CN104615132A (en) Autonomous mobile carrier and automatic following system
CN105119338B (en) Mobile robot charge control system and method
KR102398330B1 (en) Moving robot and controlling method thereof
CN103170980B (en) A kind of navigation system of household service robot and localization method
CN107167138B (en) A kind of library's intelligence Way guidance system and method
CN202771144U (en) Automatic focusing projector
CN101354441A (en) All-weather operating mobile robot positioning system
CN101916112B (en) Positioning and controlling system and method of intelligent vehicle model in indoor scene
CN106933096B (en) Self-following robot device and method for providing space positioning information for third party
CN102636152B (en) Active visual ranging system of movable platform
CN108762291A (en) A kind of method and system finding and track black winged unmanned aerial vehicle remote controller
US10638899B2 (en) Cleaner
CN205081492U (en) Mobile robot control system that charges
CN104267725A (en) Indoor navigation and positioning system for autonomous charging of sweeping robot
CN105554472B (en) The method of the video monitoring system and its positioning robot of overlay environment
US9616927B2 (en) Parking assistance system
CN108973737A (en) Electric vehicle wireless charging positioning device and positioning method thereof
CN103537099A (en) Tracking toy
CN109008806B (en) Floor sweeping robot positioning system and method based on LED intelligent lamp positioning
TWI509530B (en) Adapted mobile carrier and auto following system
CN206399422U (en) Multifunctional vision sensor and mobile robot
CN209147948U (en) Contour outline measuring set based on linear light source
CN110509297A (en) A kind of two dimensional code detection robot, detection system and detection method
CN102812326A (en) Method for controlling a measuring system and measuring system for carrying out the method
US20230286399A1 (en) Charging station, charging station system, method and apparatus for returning to station and lawnmowing robot

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant