CN203650524U - Mobile robot - Google Patents
- Publication number: CN203650524U
- Application number: CN201320231237.1U
- Authority
- CN
- China
- Prior art keywords
- light
- mobile robot
- processing unit
- mobile
- light beam
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn - After Issue
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Manipulator (AREA)
Abstract
The utility model discloses a mobile robot that includes a light-emitting unit, a processing unit, an optical component, an image-sensing unit, a control unit, and a moving unit. The light-emitting unit emits a main beam; the processing unit processes the main beam to generate multiple secondary beams, which together form a light ray. When part of the secondary beams strikes a first object, the object produces multiple environmental reflected beams, which the optical component collects into a converged beam. The image-sensing unit receives the converged beam and outputs a first detection result to the control unit. From the first detection result the control unit generates first depth information and, based on it, drives the moving unit to control the behavior pattern of the mobile robot. The mobile robot can thus measure scene depth quickly and effectively.
Description
Technical field
The utility model relates to a mobile robot, and in particular to a mobile robot that can measure scene depth quickly and efficiently.
Background technology
A mobile robot moving through an unknown environment must rely on sensing devices for measurement; if the measured information is insufficient, movement is dangerous for the robot.
Summary of the invention
The purpose of the utility model is to provide a mobile robot comprising a light-emitting unit, a processing unit, an optical component, an image-sensing unit, a control unit, and a moving unit. The light-emitting unit emits a main beam. The processing unit processes the main beam to produce multiple secondary beams. The secondary beams form a light ray, and the light ray has an irradiation range. When part of the secondary beams strikes a first object, the first object produces multiple environmental reflected beams. The optical component receives the environmental reflected beams to produce a converged beam. The image-sensing unit receives the converged beam and outputs a first detection result. The control unit generates first depth information from the first detection result and, through the moving unit, controls a behavior pattern of the mobile robot.
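The sense-then-move flow of the summary can be sketched in Python. This is only an illustration of the described pipeline; the patent defines no software interface, and every name, data format, and threshold below is a hypothetical choice:

```python
# Illustrative sketch: the image-sensing unit yields per-angle detections,
# the control unit turns them into depth information and picks a behavior.
def depth_from_detection(detection):
    """Turn (angle, distance) pairs into depth information (angle -> distance)."""
    return {angle: distance for angle, distance in detection}

def choose_behavior(depth_info, safe_distance=0.3):
    """Pick a behavior pattern from the depth information (threshold is illustrative)."""
    nearest = min(depth_info.values())
    return "move_straight" if nearest > safe_distance else "turn"

# One control cycle with made-up sensor readings (angle in degrees, distance in meters):
detection_result = [(0, 1.2), (90, 0.25), (180, 2.0), (270, 0.8)]
depth = depth_from_detection(detection_result)
behavior = choose_behavior(depth)
```

The moving unit would then execute the chosen pattern; here the nearby reading at 90° makes the sketch select a turn.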
In a preferred embodiment of the mobile robot of the utility model, the light ray is parallel to a ground surface.
In a preferred embodiment, the processing unit emits the secondary beams toward a ground surface.
In a preferred embodiment, the processing unit produces multiple detection beams from the main beam and emits the detection beams toward a ground surface, while the light ray is parallel to the ground surface.
In a preferred embodiment, when part of the detection beams strikes a second object, the second object produces multiple road-reflected beams; the optical component receives the road-reflected beams to produce a second converged beam; the image-sensing unit produces a second detection result from the second converged beam; and the control unit produces second depth information from the second detection result and, through the moving unit, adjusts the travel path of the mobile robot.
In a preferred embodiment, the light ray is a planar light or a curved-surface light.
In a preferred embodiment, the secondary beams are produced sequentially.
In a preferred embodiment, the secondary beams are produced simultaneously.
In a preferred embodiment, the processing unit reflects the main beam to produce the secondary beams.
In a preferred embodiment, the processing unit comprises a holder and a mirror; the mirror is connected to the holder and reflects the main beam, and the control unit rotates the holder.
In a preferred embodiment, the processing unit is a conical mirror.
In a preferred embodiment, the processing unit refracts the main beam.
In a preferred embodiment, the processing unit is a cylindrical lens.
In a preferred embodiment, the cylindrical lens has a surface with a first region and a second region; the first region has a first reflective coating, the second region has a second reflective coating, and the thickness of the first reflective coating equals the thickness of the second reflective coating.
In a preferred embodiment, the cylindrical lens has a surface with a first region and a second region; the first region has a first reflective coating, the second region has a second reflective coating, and the thickness of the first reflective coating differs from the thickness of the second reflective coating.
In a preferred embodiment, the cylindrical lens has a surface with a reflective coating, and the reflective coating is formed by evaporation.
In a preferred embodiment, the processing unit diffracts the main beam.
In a preferred embodiment, the processing unit is a grating lens.
In a preferred embodiment, the light-emitting unit comprises a laser transmitter.
In a preferred embodiment, the light ray is a line-like light.
In a preferred embodiment, the control unit, according to the depth information, selects one of at least one default behavior pattern as the behavior pattern of the mobile robot; or determines the behavior pattern according to the depth information and a random process; or determines the behavior pattern according to the depth information and a control instruction, the control instruction being set directly or indirectly by a user.
In a preferred embodiment, under the default behavior pattern the mobile robot moves in a straight line, follows an obstacle, chooses a moving direction by random number, rotates in place, rotates in an S-shaped (scroll-type) path, turns, accelerates, decelerates, backs up, or moves stop-and-go.
In a preferred embodiment, the user adjusts the behavior pattern of the mobile robot by beyond-line-of-sight control.
The beneficial effect of the utility model is a mobile robot that measures scene depth quickly and efficiently, avoiding the danger a mobile robot faces when its measured information is insufficient.
To make the features and advantages of the utility model more apparent, preferred embodiments are set out below and described in detail with reference to the accompanying drawings.
Brief description of the drawings
Figure 1A shows a possible embodiment of the mobile robot of the utility model.
Figure 1B shows another possible embodiment of the mobile robot of the utility model.
Fig. 2 is a front view of the mobile robot of the utility model.
Figs. 3-5 are side views of the mobile robot of the utility model.
The reference numerals are described as follows:
100, 100': mobile robot; 101: light-emitting unit; 102, 108: processing unit; 103: optical component; 104: image-sensing unit; 105: control unit; 106: moving unit; 110, 120: light ray (curved-surface light); 201, 202: border; 203: angle; 300: housing; 301: holder; 302: mirror; 303: driver; 304: roller; 401: cylindrical lens; 501: grating lens; Lm: main beam; Ls1~Ls5: secondary beams; Lr1~Lr4: environmental reflected beams; SdT1, SdT2: converged beams; IFM1, IFM2: detection results; Sc1~Sc2: control signals; Ls6~Ls10: detection beams; Lr5~Lr8: road-reflected beams.
Specific embodiments
Figure 1A shows a possible embodiment of the mobile robot of the utility model. As shown in the figure, the mobile robot 100 comprises a light-emitting unit 101, a processing unit 102, an optical component 103, an image-sensing unit 104, a control unit 105, and a moving unit 106. In this embodiment, the mobile robot 100 emits a light beam, performs omnidirectional detection (detecting the surrounding environment over 360°), builds up depth information, and then plans a behavior pattern accordingly.
The light-emitting unit 101 emits a main beam Lm. The utility model does not limit the kind of main beam: in one possible embodiment, the main beam Lm is visible or invisible light. Nor does the utility model limit the internal circuit architecture of the light-emitting unit 101; any circuit capable of emitting a light beam can serve as the light-emitting unit 101. In one possible embodiment, the light-emitting unit 101 has a laser transmitter (not shown) for emitting laser light. In other embodiments, the light-emitting unit 101 has an RF transmitter (not shown).
The line-like light can be characterized by the pattern of light spots it projects onto a plane: take a subset A of three or more spots from the set of projected spots; a linear regression computed over subset A yields a coefficient of determination greater than 0.9. Alternatively, take a subset B of four or more spots from the set; a quadratic regression computed over subset B yields a coefficient of determination greater than 0.9.
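The linear-regression criterion above can be checked numerically. A minimal sketch using only ordinary arithmetic; the spot coordinates are illustrative values, not data from the patent:

```python
# Fit a line to >= 3 projected light spots and accept the light as
# line-like when the coefficient of determination R^2 exceeds 0.9.
def r_squared_linear(points):
    """Least-squares line fit of (x, y) points; returns R^2."""
    n = len(points)
    assert n >= 3, "subset A needs at least 3 spots"
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in points)
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# Nearly collinear spots, as a line-like projection would produce:
spots = [(0.0, 0.02), (1.0, 1.01), (2.0, 1.98), (3.0, 3.01)]
is_line_like = r_squared_linear(spots) > 0.9
```

The quadratic-regression variant for subset B would be analogous, fitting a second-degree polynomial instead of a line.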
In one possible embodiment, the control unit 105 has at least one default behavior pattern. According to the depth information, the control unit 105 selects one of the default behavior patterns as the behavior pattern of the mobile robot 100. The default behavior patterns include at least: moving in a straight line, following an obstacle, choosing a moving direction by random number, rotating in place, rotating in an S-shaped path, turning, accelerating, decelerating, backing up, stop-and-go movement, and so on.
In another possible embodiment, the control unit 105 generates behavior patterns randomly. Based on the results fed back by all sensing signals and/or the weight or priority of each sensor, the control unit 105 outputs a single behavior pattern or a combination of behavior patterns. In this case the behavior pattern produced by the control unit 105 is not a preset value.
In another embodiment, the control unit 105 produces a corresponding behavior pattern according to a control instruction (not shown). In this case the control instruction can be set by the user directly or indirectly. In one possible embodiment, the user adjusts the behavior pattern of the mobile robot 100 directly by contact, for example by pressing a function button attached to the robot, or contactlessly, for example via a remote controller, program software, or beyond-line-of-sight control.
Beyond-line-of-sight control means control that remains possible even when obstacles block direct observation. For example, the instruction can be transmitted via the Internet, a telephone network, radio signals, light, sound, smell, heat, vibration, or radiation.
In certain embodiments, the behavior pattern produced by the control unit 105 changes the position, direction, angle, speed, angular speed, acceleration, angular acceleration, and so on, of the mobile robot 100. In other embodiments, it changes the relationship between the mobile robot 100 and at least one object in the surrounding environment, the relative state of elements inside the mobile robot 100, or the relative state of the mobile robot 100 and peripheral devices.
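The three selection modes above (default pattern, random generation, user instruction) could be dispatched as in the following sketch. All pattern names, the mode flag, and the distance threshold are illustrative assumptions:

```python
import random

# Hypothetical default behavior patterns, echoing the list in the text.
DEFAULT_PATTERNS = ["move_straight", "follow_obstacle", "rotate_in_place",
                    "turn", "accelerate", "decelerate", "back_up", "stop_and_go"]

def select_behavior(depth_info, mode="default", instruction=None, rng=None):
    """Dispatch among the three behavior-selection modes described above."""
    if mode == "instruction" and instruction is not None:
        return instruction                   # set directly/indirectly by the user
    if mode == "random":
        rng = rng or random.Random()
        return rng.choice(DEFAULT_PATTERNS)  # not a preset value
    # Default mode: pick a preset pattern from the depth information.
    nearest = min(depth_info.values())
    return "move_straight" if nearest > 0.5 else "follow_obstacle"

choice = select_behavior({"front": 0.2, "left": 1.0}, mode="default")
```

An external control instruction simply overrides the depth-driven choice, which matches the direct/indirect user setting described in the text.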
The utility model does not limit the internal circuit architecture of the control unit 105. In one possible embodiment, the control unit 105 has multiple microcontrollers, processors, memory, and logic circuits. In other embodiments, the control unit 105 produces another control signal Sc2 to turn the light-emitting unit 101 on or off. For example, in a first period the control unit 105 turns the light-emitting unit 101 off; the optical component 103 then receives the ambient reflected light around the mobile robot 100, the image-sensing unit 104 produces a first detection result from the reception result, and the control unit 105 stores the first detection result. In a second period the control unit 105 turns the light-emitting unit 101 on; the processing unit 102 then produces the secondary beams Ls1~Ls5. When the secondary beams Ls1~Ls5 strike an object, the object reflects them, producing the environmental reflected beams Lr1~Lr4. The optical component 103 receives the reflected beams Lr1~Lr4, and the image-sensing unit 104 produces a second detection result from the reception result. The control unit 105 compares the first and second detection results to distinguish the reflected beams Lr1~Lr4 within the second detection result and, from the distinction, learns the distance between the mobile robot 100 and the surrounding objects. In this example, the control unit 105 turns the light-emitting unit 101 on and off as appropriate and, from two images close in time, derives the distance between the mobile robot 100 and the surrounding objects.
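The on/off scheme described above — store an image with the emitter off, then subtract it from an image taken with the emitter on — isolates the pixels lit by the projected beams. A minimal sketch with frames as 2-D lists of brightness values; the data and threshold are illustrative, and no camera API is assumed:

```python
def difference_image(frame_off, frame_on, threshold=30):
    """Mark pixels much brighter with the emitter on as reflected-beam pixels."""
    return [[1 if on - off > threshold else 0
             for off, on in zip(row_off, row_on)]
            for row_off, row_on in zip(frame_off, frame_on)]

# Emitter off: ambient light only.  Emitter on: one bright stripe appears.
frame_off = [[10, 12, 11],
             [ 9, 11, 10],
             [10, 10, 12]]
frame_on  = [[10, 12, 11],
             [90, 95, 92],   # row lit by the reflected secondary beams
             [10, 10, 12]]
mask = difference_image(frame_off, frame_on)
```

The position of the masked stripe in the image is what a triangulation step would then convert into distance.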
Because the optical component 103 has omnidirectional vision, that is, a 360° detection angle, it can receive all the reflected light around the mobile robot 100. When the mobile robot 100 enters a narrow space, the control unit 105 can, according to the reception result of the optical component 103, adjust the rotation direction of the rollers so as to leave the narrow space quickly and avoid becoming stuck there.
Figure 1B shows another possible embodiment of the mobile robot of the utility model. Figure 1B is similar to Figure 1A; the difference is that the mobile robot 100' of Figure 1B further includes a processing unit 108. In this embodiment, the processing unit 108 processes the main beam Lm to produce multiple detection beams. For convenience of description, Figure 1B shows only the detection beams Ls6~Ls10. The detection beams Ls6~Ls10 form a light ray 120, which has an irradiation range covering the detection beams Ls6~Ls10.
When part of the detection beams Ls6~Ls10 strikes an object, the object produces multiple road-reflected beams Lr5~Lr8. The optical component 103 receives the road-reflected beams Lr5~Lr8 to produce a converged beam SdT2. The image-sensing unit 104 produces a detection result IFM2 from the converged beam SdT2. The control unit 105 produces a control signal Sc1 from the detection result IFM2 to adjust the rotation direction of the rollers.
In one possible implementation, when the processing unit 102 produces the light ray 110, the processing unit 108 stops producing the light ray 120; the optical component 103 then receives only the environmental reflected beams Lr1~Lr4. In another possible embodiment, when the processing unit 108 produces the light ray 120, the processing unit 102 stops producing the light ray 110; the optical component 103 then receives only the road-reflected beams Lr5~Lr8. In other embodiments, the processing unit 108 produces the light ray 120 while the processing unit 102 produces the light ray 110, and the optical component 103 receives the environmental reflected beams Lr1~Lr4 and the road-reflected beams Lr5~Lr8 simultaneously.
The utility model does not limit the irradiation directions of the light rays 110 and 120. In one possible embodiment, the light ray 110 produced by the processing unit 102 is parallel to the ground and detects the positions of objects in the surrounding environment, while the processing unit 108 emits the detection beams Ls6~Ls10 toward the ground to detect the condition of the road surface, for example whether it is rough. Through the environmental reflected beams Lr1~Lr4 received by the optical component 103, the mobile robot 100 can avoid colliding with obstacles in the surrounding environment; through the road-reflected beams Lr5~Lr8 received by the optical component 103, it can avoid falling from a height.
The utility model does not limit the shapes of the light rays 110 and 120. In one possible embodiment, each of the light rays 110 and 120 is a planar light or a curved-surface light. Nor does the utility model limit how the processing units 102 and 108 produce the light rays 110 and 120. In this embodiment, the processing units 102 and 108 produce the light rays 110 and 120 from the main beam Lm emitted by the same light-emitting unit (e.g. 101). For example, by adjusting the position of the light-emitting unit 101, the main beam Lm can be made to strike the processing unit 102 or 108 so as to produce the light ray 110 or 120. In other embodiments, the positions of the processing units 102 and 108 are adjusted so that they receive the main beam Lm emitted by the light-emitting unit 101. In another possible embodiment, the processing units 102 and 108 produce the light rays 110 and 120 from different main beams; in that example the mobile robot 100' has two light-emitting units.
Fig. 2 is a front view of the mobile robot of the utility model. As shown in the figure, the secondary beams Ls1~Ls5 emitted by the mobile robot 100 form a light ray 110. The light ray 110 has borders 201 and 202, with an angle 203 between them. The utility model does not predetermine the size of the angle 203. In one possible embodiment, the angle 203 between the borders 201 and 202 is greater than 180°; in another possible embodiment, the angle 203 approaches 360°.
Fig. 3 is a side view of the mobile robot of the utility model. As shown in the figure, the mobile robot 100 comprises a housing 300. The light-emitting unit 101 is arranged inside the housing 300 and emits a main beam Lm. The processing unit 102 is arranged outside the housing 300 to receive and process the main beam Lm. In this embodiment, the processing unit 102 reflects the main beam Lm to produce the secondary beams Ls1~Ls5, which form the light ray 110. In one possible embodiment, the light ray 110 is parallel to the ground 305.
The utility model does not limit the internal structure of the processing unit 102; any structure that can turn one beam into multiple beams can serve as the processing unit 102. As shown in the figure, the processing unit 102 comprises a holder 301 and a mirror 302. The mirror 302 has a tilt angle and is connected to the center of the holder 301. The utility model does not limit the size of the tilt angle of the mirror 302; in one possible embodiment, it is about 45°. In other embodiments, controlling the tilt angle of the mirror 302 controls the irradiation direction of the light ray 110, for example toward the ground 305 or toward the front of the mobile robot 100 (as shown in Fig. 3).
In this embodiment, the control unit 105 rotates the holder 301. The mirror 302 can therefore reflect the main beam Lm in different directions, and the reflected beams extending in the different directions are the secondary beams. Moreover, the reflected beams produced by the mirror 302 form the light ray 110. In this example, the mirror 302 produces the reflected beams of the different directions sequentially; that is, the multiple secondary beams produced by the mirror 302 are not produced simultaneously.
In other embodiments, a conical mirror (not shown) can replace the mirror 302. Because a conical mirror produces multiple reflected beams from a single main beam Lm, with different reflected beams extending in different directions, the control unit 105 does not need to rotate a holder; in one possible embodiment, the holder 301 can be omitted. In addition, a conical mirror produces the multiple reflected beams simultaneously.
Fig. 4 shows another embodiment of the processing unit of the utility model. Fig. 4 is similar to Fig. 3; the difference is that the processing unit 102 of Fig. 4 processes the main beam Lm by refraction. Because the other elements of Fig. 4 operate as in Fig. 3, they are not described again. In this embodiment, the processing unit 102 is a cylindrical lens 401. The cylindrical lens 401 refracts the main beam Lm to produce multiple divergent beams, which extend in different directions and form the light ray 110. The light ray 110 has a divergence angle of roughly 120°. In one possible embodiment, the cylindrical lens 401 produces the multiple divergent beams simultaneously.
In another possible embodiment, a reflective coating can be plated on the surface of the cylindrical lens 401 to increase its divergence angle. In addition, adjusting the positions of the light-emitting unit 101 and the cylindrical lens 401 controls the irradiation direction of the light ray 110. In this embodiment, the light ray 110 irradiates toward the front of the mobile robot 100.
The utility model does not limit how the reflective coating is formed. In one possible embodiment, a reflective coating is formed on the surface of the cylindrical lens 401 by evaporation (deposition). The coating may be formed uniformly or non-uniformly on the surface. Suppose a surface of the cylindrical lens 401 has a first region and a second region, the first region bearing a first reflective coating and the second region a second reflective coating: in one possible embodiment, the thicknesses of the two coatings are the same or different. In other embodiments, the thickness of the coating on one surface of the cylindrical lens 401 is the same as or different from that on another surface.
Fig. 5 shows another embodiment of the processing unit of the utility model. Fig. 5 is similar to Fig. 3; the difference is that the processing unit 102 of Fig. 5 processes the main beam Lm by diffraction. Because the other elements of Fig. 5 are the same as in Fig. 3, they are not shown again. In this embodiment, the processing unit 102 is a grating lens 501. The grating lens 501 has a specific pattern (not shown) that diffracts the main beam Lm into the multiple secondary beams. In this example, the grating lens 501 produces the multiple secondary beams simultaneously.
In Figs. 3-5, the mobile robot 100 has only a single processing unit 102 producing one light ray 110, but this does not limit the utility model. In another embodiment, a processing unit 108 can be added to the mobile robot 100 shown in Figs. 3-5 to produce another light ray 120. By adjusting the positions of the light-emitting unit 101 and the processing units 102 and 108, the irradiation directions of the light rays 110 and 120 can be controlled, for example toward the front of or below the mobile robot 100.
In other embodiments, the mobile robot 100 may have two light-emitting units and two processing units, with different processing units processing the main beams produced by different light-emitting units. In another embodiment, the mobile robot 100 may have two light-emitting units and one processing unit, which provides light rays in different directions according to the main beams from the different light-emitting units. In this example, the two light-emitting units may emit their main beams simultaneously or at different times.
Unless otherwise defined, all terms used here (including technical and scientific terms) have the meanings commonly understood by those skilled in the art to which the utility model belongs. In addition, unless expressly stated otherwise, terms defined in general dictionaries should be interpreted consistently with their meaning in the context of the relevant technical field, and not in an idealized or overly formal sense.
Although the utility model is disclosed above by way of preferred embodiments, they are not intended to limit it. Any person skilled in the art may, without departing from the spirit and scope of the utility model, make minor changes and refinements; the protection scope of the utility model is therefore defined by the appended claims.
Claims (23)
1. A mobile robot, characterized by comprising:
a light-emitting unit for emitting a main beam;
a processing unit that processes the main beam to produce multiple secondary beams, wherein the secondary beams form a light ray, the light ray has an irradiation range, and when part of the secondary beams strikes a first object, the first object produces multiple environmental reflected beams;
an optical component that receives the environmental reflected beams to produce a first converged beam;
an image-sensing unit that produces a first detection result from the first converged beam;
a control unit that produces depth information from the first detection result; and
a moving unit for moving the mobile robot, wherein the control unit controls a behavior pattern of the mobile robot through the moving unit according to the depth information.
2. The mobile robot as claimed in claim 1, wherein the light ray is parallel to a ground surface.
3. The mobile robot as claimed in claim 1, wherein the processing unit emits the secondary beams toward a ground surface.
4. The mobile robot as claimed in claim 1, wherein the processing unit produces multiple detection beams from the main beam and emits the detection beams toward a ground surface, and the light ray is parallel to the ground surface.
5. The mobile robot as claimed in claim 4, wherein when part of the detection beams strikes a second object, the second object produces multiple road-reflected beams; the optical component receives the road-reflected beams to produce a second converged beam; the image-sensing unit produces a second detection result from the second converged beam; and the control unit produces second depth information from the second detection result and, through the moving unit, adjusts the travel path of the mobile robot.
6. The mobile robot as claimed in claim 1, wherein the light ray is a planar light or a curved-surface light.
7. The mobile robot as claimed in claim 1, wherein the secondary beams are produced sequentially.
8. The mobile robot as claimed in claim 1, wherein the secondary beams are produced simultaneously.
9. The mobile robot as claimed in claim 1, wherein the processing unit reflects the main beam to produce the secondary beams.
10. The mobile robot as claimed in claim 9, wherein the processing unit comprises:
a holder; and
a mirror connected to the holder for reflecting the main beam, wherein the control unit rotates the holder.
11. The mobile robot as claimed in claim 1, wherein the processing unit is a conical mirror.
12. The mobile robot as claimed in claim 1, wherein the processing unit refracts the main beam.
13. The mobile robot as claimed in claim 12, wherein the processing unit is a cylindrical lens.
14. The mobile robot as claimed in claim 13, wherein the cylindrical lens has a surface with a first region and a second region, the first region has a first reflective coating, the second region has a second reflective coating, and the thickness of the first reflective coating equals the thickness of the second reflective coating.
15. The mobile robot as claimed in claim 13, wherein the cylindrical lens has a surface with a first region and a second region, the first region has a first reflective coating, the second region has a second reflective coating, and the thickness of the first reflective coating differs from the thickness of the second reflective coating.
16. The mobile robot as claimed in claim 13, wherein the cylindrical lens has a surface with a reflective coating, and the reflective coating is formed by evaporation.
17. The mobile robot as claimed in claim 1, wherein the processing unit diffracts the main beam.
18. The mobile robot as claimed in claim 17, wherein the processing unit is a grating lens.
19. The mobile robot as claimed in claim 1, wherein the light-emitting unit comprises a laser transmitter.
20. The mobile robot as claimed in claim 1, wherein the light ray is a line-like light.
21. The mobile robot as claimed in claim 1, wherein the control unit, according to the depth information, selects one of at least one default behavior pattern as the behavior pattern of the mobile robot; or determines the behavior pattern of the mobile robot according to the depth information and a random process; or determines the behavior pattern of the mobile robot according to the depth information and a control instruction, the control instruction being set directly or indirectly by a user.
22. The mobile robot as claimed in claim 21, wherein under the default behavior pattern the mobile robot moves in a straight line, follows an obstacle, chooses a moving direction by random number, rotates in place, rotates in an S-shaped path, turns, accelerates, decelerates, backs up, or moves stop-and-go.
23. The mobile robot as claimed in claim 22, wherein the user adjusts the behavior pattern of the mobile robot by beyond-line-of-sight control.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201320231237.1U CN203650524U (en) | 2013-04-26 | 2013-04-26 | Mobile robot |
Publications (1)
Publication Number | Publication Date |
---|---|
CN203650524U true CN203650524U (en) | 2014-06-18 |
Family
ID=50916960
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201320231237.1U Withdrawn - After Issue CN203650524U (en) | 2013-04-26 | 2013-04-26 | Mobile robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN203650524U (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104117987A (en) * | 2013-04-26 | 2014-10-29 | 恩斯迈电子(深圳)有限公司 | Mobile robot |
CN104117987B (en) * | 2013-04-26 | 2017-05-10 | 恩斯迈电子(深圳)有限公司 | Mobile robot |
Legal Events
Date | Code | Title | Description
---|---|---|---
| C14 | Grant of patent or utility model | |
| GR01 | Patent grant | |
| AV01 | Patent right actively abandoned | Granted publication date: 20140618; effective date of abandoning: 20170510 |