CN107076557A - Mobile robot recognition and positioning method, device, system and mobile robot - Google Patents

Mobile robot recognition and positioning method, device, system and mobile robot

Info

Publication number
CN107076557A
Authority
CN
China
Prior art keywords
mobile robot
information
image information
led
image
Prior art date
Legal status: Pending
Application number
CN201680002465.8A
Other languages
Chinese (zh)
Inventor
包玉奇
贝世猛
戚晓林
苗向鹏
梁博
Current Assignee
Shenzhen Dajiang Innovations Technology Co Ltd
Original Assignee
Shenzhen Dajiang Innovations Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Dajiang Innovations Technology Co Ltd
Publication of CN107076557A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04 Interpretation of pictures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A mobile robot (1) recognition and positioning method, device, system and mobile robot (1). The method includes: obtaining image information of a light source (91) carried by the mobile robot (1) (S101); and recognizing and positioning the mobile robot (1) according to the image information of the light source (91) (S102). From the image information of the light source (91) carried by the mobile robot (1), the color and shape of the light source (91) and the position of the light source (91) relative to the image information are determined. The robot (1) can be recognized from the color and shape of the light source (91) in the image information, and the position of the mobile robot (1) in the site (80) can be determined from the position of the light source (91) relative to the image information, which improves the accuracy of recognizing the mobile robot (1) and, at the same time, the positioning accuracy of the mobile robot (1).

Description

Mobile robot recognition and positioning method, device, system and mobile robot
Technical field
Embodiments of the present invention relate to the field of robotics, and in particular to a mobile robot recognition and positioning method, device, system and mobile robot.
Background art
In robot combat competitions, multiple mobile robots are divided into friendly and opposing sides that fight each other. Each mobile robot can be controlled by wireless remote control, and each robot carries a combat weapon, for example a BB-pellet launcher or a light-beam gun.
However, when multiple robots are present on the same site at the same time, each robot needs to be identified and positioned. In the prior art, in order to distinguish the robots, a different signboard is mounted on each robot, one signboard identifying one robot. In addition, in order to position each robot, every robot is fitted with a positioning device; the positioning device can receive multiple wireless signals and sends the strength information of each wireless signal to a server, and the server determines the location of the positioning device, i.e. the location of the corresponding robot, from the strength information of the wireless signals.
However, while a robot is moving or fighting other robots, the signboard it carries easily falls off, making the robot's identity difficult to recognize. Moreover, the location of the positioning device determined from the strength information of the received wireless signals can deviate considerably from the device's actual location, so the positioning accuracy of the robot is low.
Summary of the invention
Embodiments of the present invention provide a mobile robot recognition and positioning method, device, system and mobile robot, so as to improve the accuracy with which a mobile robot is recognized and positioned.
One aspect of the embodiments of the present invention provides a mobile robot recognition and positioning method, including:
obtaining image information of a light source carried by a mobile robot;
recognizing and positioning the mobile robot according to the image information of the light source.
Another aspect of the embodiments of the present invention provides a mobile robot recognition and positioning system, including:
one or more processors, working individually or collectively, the processor being configured to:
obtain image information of a light source carried by a mobile robot;
recognize and position the mobile robot according to the image information of the light source.
Another aspect of the embodiments of the present invention provides a camera provided with a lens module, the camera further including:
a housing, the outer surface of which is provided with a lamp window;
a plurality of LED lamps arranged inside the housing, the light they emit passing through the lamp window;
a controller electrically connected to the plurality of LED lamps;
wherein the controller drives the LED lamps to light up and controls the working state of the plurality of LED lamps.
Another aspect of the embodiments of the present invention provides a mobile robot, including:
a fuselage;
a moving device connected to the fuselage and configured to provide the power for moving the fuselage;
a camera mounted on top of the fuselage, the camera being provided with a lens module and further including:
a housing, the outer surface of which is provided with a lamp window;
a plurality of LED lamps arranged inside the housing, the light they emit passing through the lamp window;
a controller electrically connected to the plurality of LED lamps;
wherein the controller drives the LED lamps to light up and controls the working state of the plurality of LED lamps.
With the mobile robot recognition and positioning method, device, system and mobile robot provided by the embodiments of the present invention, the color and shape of the light source and the position of the light source relative to the image information are determined from the image information of the light source carried by the mobile robot. The robot can be recognized from the color and shape of the light source in the image information, and the position of the mobile robot in the site can be determined from the position of the light source relative to the image information, which improves the accuracy of recognizing the mobile robot and, at the same time, its positioning accuracy.
The camera provided by the embodiments of the present invention is provided with a plurality of LED lamps; the controller of the camera drives the LED lamps to light up and controls their working state, so that the LED lamps emit light in a preset pattern. By recognizing image information of the preset pattern, the mobile robot on which the camera is mounted can be recognized, so that mobile robots can be recognized conveniently. At the same time, since the camera serves as a module of the recognition system, it is easy to install and remove.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flow chart of the mobile robot recognition and positioning method provided by Embodiment 1 of the present invention;
Fig. 1A is a network architecture diagram to which the mobile robot recognition and positioning method provided by Embodiment 1 is applicable;
Fig. 1B is a schematic diagram of image information provided by Embodiment 1;
Fig. 1C is a schematic diagram of image information provided by Embodiment 1;
Fig. 2 is a flow chart of the mobile robot recognition and positioning method provided by Embodiment 2;
Fig. 3 is a flow chart of the mobile robot recognition and positioning method provided by Embodiment 3;
Fig. 3A is a schematic diagram of the editable LED array provided by Embodiment 3;
Fig. 3B is a schematic diagram of the first image information provided by Embodiment 3;
Fig. 3C is a schematic diagram of the first image information provided by Embodiment 3;
Fig. 3D is a schematic diagram of the first image information provided by Embodiment 3;
Fig. 4 is a flow chart of the mobile robot recognition and positioning method provided by Embodiment 4;
Fig. 4A is a network architecture diagram to which the mobile robot recognition and positioning method provided by Embodiment 4 is applicable;
Fig. 5 is a structural diagram of the mobile robot recognition and positioning system provided by Embodiment 5;
Fig. 6 is a structural diagram of the mobile robot recognition and positioning system provided by Embodiment 8;
Fig. 7 is a structural diagram of the camera provided by Embodiment 9;
Fig. 8 is a structural diagram of the camera provided by Embodiment 10;
Fig. 9A is an exploded view of the camera provided by Embodiment 11;
Fig. 9B is a left view of the camera provided by Embodiment 11;
Fig. 9C is a front view of the camera provided by Embodiment 11;
Fig. 9D is a top view of the camera provided by Embodiment 11;
Fig. 9E is an axonometric view of the camera provided by Embodiment 11;
Fig. 10 is a structural diagram of the mobile robot provided by Embodiment 12.
Reference numerals:
1 - mobile robot; 21 - capture apparatus; 22 - server;
23 - display screen; 51 - processor; 30 - lens protection glass;
31 - lens; 32 - LED lamp window; 33 - LED lamp;
34 - image transmission board; 35 - line; 36 - wire casing clamp;
37 - upper cover; 38 - base; 39 - camera board;
40 - 5.8G antenna; 41 - power panel; 42 - fan;
43 - top cover; 52 - image sensor; 53 - wireless receiving device;
54 - wireless transmitting device; 55 - display; 71 - image of robot B;
72 - image of robot A; 81 to 86 - first cameras; 89 - lens module;
90 - housing; 91 - LED lamps; 92 - controller;
93 - lamp window; 94 - image transmission device; 100 - lamp chamber;
101 - lens chamber; 1001 - moving device; 1002 - camera; 1003 - fuselage
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art on the basis of the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
It should be noted that when a component is said to be "fixed to" another component, it may be directly on the other component or an intermediate component may be present. When a component is considered to be "connected to" another component, it may be directly connected to the other component or an intermediate component may be present at the same time.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the technical field of the present invention. The terms used in the description of the present invention are only for the purpose of describing specific embodiments and are not intended to limit the present invention. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Some embodiments of the present invention are described in detail below with reference to the accompanying drawings. The features of the following embodiments may be combined with each other as long as they do not conflict.
Embodiment 1
Embodiment 1 of the present invention provides a mobile robot recognition and positioning method. Fig. 1 is a flow chart of the mobile robot recognition and positioning method provided by Embodiment 1 of the present invention. As shown in Fig. 1, the method in this embodiment may include:
Step S101: obtain image information of the light source carried by the mobile robot.
In the embodiments of the present invention, the mobile robot carries a light source, and the light source includes at least one of the following: a plurality of LED lamps, a fluorescent lamp, and an infrared source. For example, a plurality of LED lamps may be arranged on the fuselage of the mobile robot, the lit LED lamps among them presenting different colors and arrangements; or fluorescent lamps of different colors and shapes may be arranged on the fuselages of different mobile robots; or a plurality of infrared light-emitting points may be arranged on the fuselage of each mobile robot, the infrared points being arranged differently so as to present different luminous shapes.
Fig. 1A is a network architecture diagram to which the mobile robot recognition and positioning method provided by Embodiment 1 is applicable. As shown in Fig. 1A, there are two mobile robots A and B within the shooting range of the capture apparatus 21, and both carry any of the light sources described above. A photosensitive material is provided on the lens of the capture apparatus 21 and may be connected to the processor of the capture apparatus 21. When the photosensitive material senses the light emitted by a light source, it sends an electric signal to the processor, so that the processor triggers the shutter of the capture apparatus 21, automatically captures the lit light source, and sends the image information of the light source to the server 22. The embodiment of the present invention does not limit the number of robots within the shooting range of the capture apparatus 21.
Step S102: recognize and position the mobile robot according to the image information of the light source.
If the two mobile robots shown in Fig. 1A each carry a plurality of LED lamps, the lit LED lamps can present different colors and arrangements. Fig. 1B is a schematic diagram of the image information provided by Embodiment 1. As shown in Fig. 1B, the image information is the image of the light sources captured by the capture apparatus 21 and contains a red numeral 1 and a blue numeral 2, each numeral being formed by the lit LED lamps. If it is agreed in advance that red corresponds to the red team and blue to the blue team, the server 22 can determine from the colors and arrangements of the lit LED lamps in the image information that the two mobile robots are No. 1 of the red team and No. 2 of the blue team, respectively, thereby achieving friend-or-foe identification. In addition, if the capture apparatus 21 photographs only one mobile robot, that mobile robot can also be identified by the method of this step.
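To make this identification rule concrete, the following is a minimal Python sketch (not part of the patent) of the mapping just described: the color of the lit LEDs selects the team and the displayed numeral selects the robot number. The function name and the assumption that an upstream image-processing step has already extracted the color and numeral are illustrative only.

```python
# Hypothetical sketch: map the color and digit shown by a robot's lit LEDs
# to a team and robot number, following the convention described above.

TEAM_BY_COLOR = {"red": "red team", "blue": "blue team"}  # pre-agreed mapping

def identify_robot(led_color: str, led_digit: int) -> str:
    """Return an identity label such as 'red team No. 1'.

    led_color and led_digit are assumed to come from an upstream image
    processing step (color segmentation plus digit recognition) that the
    patent does not specify in detail.
    """
    team = TEAM_BY_COLOR.get(led_color)
    if team is None:
        raise ValueError(f"unknown LED color: {led_color}")
    return f"{team} No. {led_digit}"

print(identify_robot("red", 1))   # -> red team No. 1
print(identify_robot("blue", 2))  # -> blue team No. 2
```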
If the two mobile robots shown in Fig. 1A each carry a plurality of infrared light-emitting points, the points being arranged differently so as to present different luminous shapes, Fig. 1C is a schematic diagram of the image information provided by Embodiment 1. As shown in Fig. 1C, the image information contains a circle and a square, each formed by a plurality of infrared light-emitting points. If it is agreed in advance that the circle corresponds to the red team and the square to the blue team, the server 22 can determine from the specific shapes formed by the infrared light-emitting points in the image information that the two mobile robots belong to the red team and the blue team, respectively, thereby achieving friend-or-foe identification. Likewise, if the capture apparatus 21 photographs only one mobile robot, that mobile robot can be identified by the method of this step.
If the two mobile robots shown in Fig. 1A each carry fluorescent lamps of different colors and shapes, the server 22 recognizes the mobile robots according to the colors and shapes of the fluorescent lamps presented in the image information captured by the capture apparatus 21.
In addition, the image information shown in Fig. 1B or Fig. 1C may specifically be image information of the site where the mobile robots are located, and the server 22 determines the position of a mobile robot in the site according to the position of its light source in the image. For example, the position of the numeral 2 in the image of Fig. 1B can represent the position of No. 2 of the blue team in the site photographed by the capture apparatus 21.
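As an illustration of this positioning step, the sketch below maps the pixel position of a detected light source to site coordinates. It assumes a fixed overhead camera whose image covers a known rectangular area of the site and a simple linear scaling; neither assumption is specified by the patent.

```python
# Hypothetical sketch: convert the pixel position of a detected light source
# into site coordinates, assuming the camera image covers a known rectangular
# area of the site and scales linearly.

def pixel_to_site(px: float, py: float,
                  image_size: tuple[int, int],
                  site_size: tuple[float, float]) -> tuple[float, float]:
    """Map an image pixel (px, py) to site coordinates in meters."""
    img_w, img_h = image_size
    site_w, site_h = site_size
    return px / img_w * site_w, py / img_h * site_h

# A numeral "2" detected at pixel (960, 270) in a 1920x1080 image of a
# 10 m x 6 m area maps to roughly (5.0, 1.5) in the site frame.
print(pixel_to_site(960, 270, (1920, 1080), (10.0, 6.0)))
```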
In addition, the position of a mobile robot changes in real time, and the capture apparatus 21 can obtain the image information of the light source carried by the mobile robot in real time, so that the server 22 determines the position of the mobile robot in real time. As shown in Fig. 1A, the server 22 is also connected to a display screen 23, which displays the position information, identity information, motion track information, etc. of the mobile robots.
It is worth noting that Figs. 1B and 1C merely illustrate the image information of the light sources carried by the mobile robots and do not limit the specific form presented by a luminous light source.
In this embodiment, the color, shape and position of the light source in the image information are determined from the image information of the light source carried by the mobile robot. The robot can be recognized from the color and shape of the light source in the image information, and the position of the mobile robot in the site can be determined from the position of the light source relative to the image information, which improves the accuracy of recognizing the mobile robot and, at the same time, its positioning accuracy.
Embodiment 2
Embodiment 2 of the present invention provides a mobile robot recognition and positioning method. On the basis of the technical solution provided by Embodiment 1, the light source is a plurality of LED lamps, and the image information of the light source includes at least one of the following: the color information of the plurality of LED lamps, the arrangement information of the plurality of LED lamps, and the position information of the plurality of LED lamps. Fig. 2 is a flow chart of the mobile robot recognition and positioning method provided by Embodiment 2 of the present invention. As shown in Fig. 2, the method in this embodiment may include:
Step S201: obtain image information of the plurality of LED lamps carried by the mobile robot.
In this embodiment, the light source carried by the mobile robot is specifically a plurality of LED lamps, and the image information of the LED lamps carried by each mobile robot captured by the capture apparatus 21 includes at least one of the following: the color information of the LED lamps, the arrangement information of the LED lamps, and the position information of the LED lamps, where the position information of the LED lamps is specifically their position in the image information.
The executing entity of this embodiment may be the server 22 in Fig. 1A. The server 22 obtains the image information of the plurality of LED lamps carried by the mobile robot from the capture apparatus 21; the specific acquisition process is the same as the method in Embodiment 1 and is not repeated here.
Step S202: recognize the mobile robot according to the color information and arrangement information of the plurality of LED lamps.
Step S203: position the mobile robot according to the position information of the plurality of LED lamps.
The way the server 22 recognizes the mobile robot according to the color information and arrangement information of the LED lamps, and positions the mobile robot according to the position information of the LED lamps, is the same as in Embodiment 1 and is not repeated here.
In this embodiment, the color, shape and position of the light source in the image information are determined from the image information of the light source carried by the mobile robot. The robot can be recognized from the color and shape of the light source in the image information, and the position of the mobile robot in the site can be determined from the position of the light source relative to the image information, which improves the accuracy of recognizing the mobile robot and, at the same time, its positioning accuracy.
Embodiment 3
Embodiment 3 of the present invention provides a mobile robot recognition and positioning method. On the basis of the technical solution provided by Embodiment 2, the plurality of LED lamps form an editable LED array. Fig. 3 is a flow chart of the mobile robot recognition and positioning method provided by Embodiment 3 of the present invention. As shown in Fig. 3, the method in this embodiment may include:
Step S301: obtain first image information of the mobile robot, where the first image information includes image information of the LED array.
In this embodiment, the first image information is the image of the mobile robot captured by the capture apparatus 21. Each mobile robot carries an editable LED array, and the first image information therefore includes not only the mobile robot but also the editable LED array.
Fig. 3A is a schematic diagram of the editable LED array provided by Embodiment 3. As shown in Fig. 3A, the editable LED array includes multiple rows and columns, with one LED lamp at each row-column intersection; the LED lamp in the first row and first column is denoted 11. Each LED lamp can be switched on and off and its color controlled, and the arrangement of the lit LED lamps in the array is also controllable. If an unlit LED lamp is represented by a white circle and a lit LED lamp by a black circle, the lit LED lamps can present different shapes, such as the Arabic numeral 1 or other characters and digits, and the lit LED lamps can show either the same color or different colors. In addition, the embodiment of the present invention does not limit the size of the LED array.
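A minimal sketch of how such an editable array might be represented in software is given below: a grid of on/off cells plus a single color for the lit LEDs. The 5x3 grid size, the digit pattern and the text rendering are illustrative only; as stated above, the patent does not limit the size of the array.

```python
# Hypothetical sketch of an "editable LED array" pattern: a grid of on/off
# cells plus one color shared by the lit LEDs.

DIGIT_2 = [
    [1, 1, 1],
    [0, 0, 1],
    [1, 1, 1],
    [1, 0, 0],
    [1, 1, 1],
]

def render(pattern: list[list[int]], color: str) -> str:
    """Show lit LEDs as the color's initial and unlit LEDs as '.'."""
    mark = color[0].upper()
    return "\n".join("".join(mark if cell else "." for cell in row)
                     for row in pattern)

# Blue numeral 2, as carried by robot B in Fig. 3B.
print(render(DIGIT_2, "blue"))
```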
Fig. 3B is a schematic diagram of the first image information provided by Embodiment 3. As shown in Fig. 3B, the first image information of the mobile robots captured by the capture apparatus 21 includes an image 71 of robot B and an image 72 of robot A. Image 71 and image 72 each contain an editable LED array, and the color information and arrangement information presented by the two LED arrays are different: specifically, the editable LED array in image 71 presents a blue numeral 2, and the editable LED array in image 72 presents a red numeral 1.
Step S302: recognize the mobile robot according to the color information and arrangement information of the LED array corresponding to the first image information.
In the embodiment of the present invention, the group to which a mobile robot belongs can be determined from the color of the lit LED lamps in the LED array. For example, if the lit LED lamps show red, the mobile robot carrying that LED array belongs to the red team; if they show blue, it belongs to the blue team. When there are more competing groups, the group can also be determined from a combination of colors of the lit LED lamps, for example red plus blue for group 1, red plus yellow for group 2, and so on. In addition, the lit LED lamps can be arranged in many ways, for example presenting an Arabic numeral, a specific figure or a character; the number of the mobile robot is determined from the Arabic numeral, or its identity is determined from the specific figure or character. From the color information and arrangement information of the lit LED lamps in the LED array it can therefore be determined which member of which group the mobile robot is, thereby achieving accurate identification of the mobile robot.
The first image information shown in Fig. 3B contains a blue numeral 2 and a red numeral 1. Since robot B corresponds to image 71 and robot A to image 72, it can be determined that robot B is No. 2 of the blue team and robot A is No. 1 of the red team.
Step S303: determine the position of the mobile robot in the site corresponding to the first image information according to the position information of the LED array in the first image information.
The server 22 may determine the position of robot A in the site photographed by the capture apparatus 21 according to the position of the red numeral 1 in the first image information; that is, the position of the red numeral 1 relative to the first image information represents the position of robot A in the site. Similarly, the server 22 determines the position of robot B in the site according to the position of the blue numeral 2 in the first image information; that is, the position of the blue numeral 2 relative to the first image information represents the position of robot B in the site.
Step S304: determine the orientation of the mobile robot in the site corresponding to the first image information according to the position information of each LED lamp of the LED array in the first image information.
Each lit LED lamp in the LED array carries two pieces of information, its position and its color, so each lit LED lamp can be represented as (X, Y, C), where X is the abscissa of the lit LED lamp in the LED array, Y its ordinate in the LED array, and C its color. It is reasonable to assume that the LED arrays carried by the mobile robots are of the same size. The position of each LED lamp in the first image information can be determined from the position of LED lamp 11 (first row, first column) in the first image information and the offset of each LED lamp relative to LED lamp 11, so that the position and direction of the figure formed by the lit LED lamps in the first image information can be determined.
Fig. 3C is a schematic diagram of the first image information provided by Embodiment 3. As shown in Fig. 3C, the orientation of the mobile robot is determined from the direction of the figure formed by the lit LED lamps in the LED array, with north up, south down, west left and east right. In Fig. 3C the blue numeral 2 points north, indicating that robot B faces north in the site, and the red numeral 1 points north, indicating that robot A faces north in the site.
The position of a mobile robot in the site may change, and so may its orientation in the site; for example, a mobile robot may turn around on the spot and change its own orientation. Fig. 3D is a schematic diagram of the first image information provided by Embodiment 3. As shown in Fig. 3D, compared with Fig. 3C, the position of LED lamp 11 (first row, first column) of each LED array in the first image information has changed, indicating that the direction of the figure formed by the lit LED lamps of each array has changed in the first image information: specifically, the blue numeral 2 now points east and the red numeral 1 points west, meaning that robot B faces east in the site and robot A faces west.
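A minimal sketch of this orientation computation follows, assuming the LED positions have already been detected in image coordinates and that LED lamp 11 and a second reference lamp in the same row can be distinguished; these assumptions, the heading convention and the function name are illustrative and not specified by the patent.

```python
import math

# Hypothetical sketch: each lit LED is (X, Y, C) in array coordinates; after
# detection, LED 11 (first row, first column) and another LED of the same row
# are located in image coordinates, and the heading of the array (and hence of
# the robot) is taken from the direction of that row in the image.
# With north "up" in the image, image +x is east and image -y is north.

def array_heading(led11_img: tuple[float, float],
                  ref_img: tuple[float, float]) -> str:
    """Return a compass direction from LED 11 toward a reference LED in its row."""
    dx = ref_img[0] - led11_img[0]
    dy = led11_img[1] - ref_img[1]            # flip y: image y grows downward
    angle = math.degrees(math.atan2(dy, dx))  # 0 = east, 90 = north
    directions = ["east", "north", "west", "south"]
    return directions[int(((angle + 45) % 360) // 90)]

# Robot B in Fig. 3D: the first row of its array runs to the right in the
# image, so the figure (the numeral 2) points east.
print(array_heading((100.0, 200.0), (160.0, 200.0)))  # -> east
```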
In the embodiment of the present invention, the mobile robot carries an LED array, and image information of the mobile robot carrying the LED array is obtained. The mobile robot can be recognized accurately from the color and arrangement of the lit LED lamps of the array in the image information, and positioned accurately from the position information of the LED array in the image information; in addition, the orientation of the mobile robot in the image information can be further determined from the direction of the figure formed by the lit LED lamps of the array.
Embodiment 4
Embodiment 4 of the present invention provides a mobile robot recognition and positioning method. On the basis of the technical solution provided by Embodiment 3, the site where the mobile robots are located is divided into multiple sub-sites, and one first camera is arranged above each sub-site. Fig. 4 is a flow chart of the mobile robot recognition and positioning method provided by Embodiment 4 of the present invention. As shown in Fig. 4, the method in this embodiment may include:
Step S401: obtain, through the first camera above each sub-site, the first image information of the mobile robots in that sub-site.
Fig. 4A is a network architecture diagram to which the mobile robot recognition and positioning method provided by Embodiment 4 is applicable. As shown in Fig. 4A, the site 80 where the mobile robots are located can be divided into six sub-sites, the regions shown by dashed lines; one first camera is arranged above each sub-site, the first cameras 81 to 86 each corresponding to one sub-site, and the shooting range of each first camera being the sub-site pointed to by the dashed arrow. The capture apparatus 21 referred to in the above embodiments may be any one of the first cameras 81 to 86, and the image information referred to in the above embodiments may be the image of the sub-site below any one of the first cameras 81 to 86. As shown in Fig. 4A, there are mobile robots in every sub-site.
In addition, in the embodiment of the present invention, the LED array is arranged on top of each mobile robot, and each first camera, pointing down at its sub-site when capturing images, can directly photograph the LED array on top of a mobile robot, so that the mobile robots can be positioned over the whole site. This embodiment does not limit the way the sub-sites are divided, nor their number.
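As an illustration of whole-site positioning from per-sub-site views, the sketch below combines a detection from one first camera with that sub-site's offset in the site. It assumes the six sub-sites of Fig. 4A form a 3x2 grid of equal rectangles and that each first camera's image covers exactly its sub-site; the grid layout, resolutions and dimensions are assumptions, not figures from the patent.

```python
# Hypothetical sketch: map a detection in one sub-site camera to coordinates
# in the whole-site frame, assuming a 3x2 grid of equal sub-sites.

SITE_W, SITE_H = 12.0, 8.0          # whole site, meters (illustrative values)
COLS, ROWS = 3, 2                   # 6 sub-sites
SUB_W, SUB_H = SITE_W / COLS, SITE_H / ROWS
IMG_W, IMG_H = 1920, 1080           # resolution of each first camera

def to_site_coords(camera_index: int, px: float, py: float) -> tuple[float, float]:
    """Map a detection at pixel (px, py) seen by camera (81 + camera_index)
    to coordinates in the whole-site frame."""
    col, row = camera_index % COLS, camera_index // COLS
    local_x = px / IMG_W * SUB_W
    local_y = py / IMG_H * SUB_H
    return col * SUB_W + local_x, row * SUB_H + local_y

# A robot seen in the middle of camera 84's image (index 3, i.e. second grid
# row, first column) is at roughly (2.0, 6.0) in the site frame.
print(to_site_coords(3, 960, 540))
```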
Step S402: recognize the mobile robot according to the color information and arrangement information of the LED array corresponding to the first image information.
Step S403: determine the position of the mobile robot in the site corresponding to the first image information according to the position information of the LED array in the first image information.
Step S404: determine the orientation of the mobile robot in the site corresponding to the first image information according to the position information of each LED lamp of the LED array in the first image information.
Steps S402 to S404 are the same as steps S302 to S304 above, and the specific methods are not repeated here.
Step S405: obtain second image information captured by the second camera carried by the mobile robot.
In the embodiment of the present invention, each mobile robot also carries a second camera. The second camera may be arranged together with the light source or separately from it; that is, the second camera may be arranged at the top, the middle or the bottom of the mobile robot. The second camera is used to capture images of the surroundings of the mobile robot and generate the second image information. Preferably, the second image information is uncompressed image information: the second camera is connected to a wireless transmitting device and, after capturing the second image information of the surroundings, sends it directly to the server without compression. If the distance between the mobile robot and the server exceeds the wireless transmission range of the wireless transmitting device, the mobile robot may send the second image information to the server through a relay device.
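As a sketch of the uncompressed transmission described above, a sender might look like the following. The host address, port and 12-byte length-prefixed header are illustrative assumptions; the patent only requires that the second image information be sent without compression.

```python
import socket
import struct

# Hypothetical sketch: send one raw (uncompressed) camera frame to the server
# over TCP, prefixed with its dimensions and byte length so the receiver can
# reassemble it. No encoding or compression step is applied to the frame.

def send_raw_frame(sock: socket.socket, frame: bytes, width: int, height: int) -> None:
    header = struct.pack("!III", width, height, len(frame))  # 12-byte header
    sock.sendall(header + frame)

if __name__ == "__main__":
    frame = bytes(640 * 480 * 3)                 # placeholder RGB frame
    with socket.create_connection(("192.168.1.10", 9000)) as sock:
        send_raw_frame(sock, frame, 640, 480)
```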
Step S406: determine the surrounding environment information of the mobile robot according to the second image information.
After receiving the second image information, the server performs image processing on it and determines the surrounding environment information of the mobile robot, for example whether there is an obstacle around the mobile robot, whether there is an enemy robot around it, whether the mobile robot has run out of the site, and so on.
Step S407: control the direction of motion of the mobile robot according to its surrounding environment information.
The server receives the surrounding environment information of the mobile robot and displays it on the display screen 23. The display screen 23 may also display the site 80, the sub-sites after division, and the mobile robots in each sub-site. A user can input a control instruction by operating a mobile robot on the display screen 23, and the server sends the control instruction input by the user to the mobile robot to control its direction of motion. For example, if the display screen 23 shows an obstacle around a certain robot, the user inputs a control instruction on the display screen 23 by operating that robot so that it goes around the obstacle; the server sends the control instruction to the robot in the site so that the robot goes around the actual obstacle, thereby achieving remote control of the robot by the user.
Step S408: receive the electrical parameter information of the power supply and/or the remaining life information sent by the mobile robot.
While the mobile robot is fighting or moving in the site, its power supply continuously consumes electricity; the electrical parameter information of the power supply includes at least one of the following: current, voltage, power, and remaining battery level.
The mobile robot may be provided with a pressure sensor. When the magnitude of the pressure sensed by the pressure sensor exceeds a threshold, it indicates that the mobile robot has received a large external impact and may have been heavily struck by an opposing robot. The processor inside the mobile robot determines, from the location of the pressure sensor and the magnitude of the pressure it sensed, the severity of the damage suffered by the mobile robot, and determines the remaining life information of the mobile robot from that severity.
Alternatively, the mobile robot may be provided with a photosensitive material, and the robots of each team may carry infrared beam guns. When the photosensitive material senses an infrared irradiation intensity or irradiation time exceeding a threshold, it indicates that the mobile robot has been hit by the infrared beam gun held by an opposing robot. The processor inside the mobile robot determines the injury or damage severity of the mobile robot from the position at which the photosensitive material sensed the infrared light, the infrared irradiation intensity and the irradiation time, and determines the remaining life information of the mobile robot from that severity.
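The following sketch illustrates one way remaining life could be derived from sensed hits, using a simple linear damage model. The zone weights, thresholds and numeric values are assumptions for illustration; the patent only states that severity is derived from where and how strongly a hit was sensed and that remaining life is derived from that severity.

```python
# Hypothetical sketch: derive a robot's remaining "life" from sensed hits.

ZONE_WEIGHT = {"front": 1.0, "side": 0.7, "top": 1.5}   # assumed vulnerability per zone

def hit_severity(zone: str, intensity: float, threshold: float) -> float:
    """Severity of one hit: 0 if below threshold, else proportional to the excess."""
    if intensity <= threshold:
        return 0.0
    return ZONE_WEIGHT.get(zone, 1.0) * (intensity - threshold)

def remaining_life(initial_life: float, hits: list[tuple[str, float, float]]) -> float:
    life = initial_life
    for zone, intensity, threshold in hits:
        life -= hit_severity(zone, intensity, threshold)
    return max(life, 0.0)

# A strong strike on the top-mounted sensor plus a weak infrared graze on the side.
print(remaining_life(100.0, [("top", 40.0, 20.0), ("side", 5.0, 20.0)]))  # -> 70.0
```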
The mobile robot may send the electrical parameter information of its power supply and/or its remaining life information to the server in real time.
Step S409: display the surrounding image of the mobile robot, the position information of the mobile robot and the status information of the mobile robot.
The server 22 receives the surrounding environment information of the mobile robot and displays it on the display screen 23; in addition, the server 22 may also display the position information and the status information of the mobile robot on the display screen 23.
The position information of the mobile robot and the status information of the mobile robot are displayed in embedded form within the surrounding image of the mobile robot.
The position information of the mobile robot includes at least one of the following: the positioning information of the mobile robot and the motion track information of the mobile robot.
The status information of the mobile robot includes at least one of the following: the identification information of the mobile robot, the orientation information of the mobile robot, the electrical parameter information of the power supply of the mobile robot, and the remaining life information of the mobile robot.
In the embodiment of the present invention, the server receives the second image information captured by the second camera carried by the mobile robot. The second image information is uncompressed, which reduces the transmission delay of the image information and ensures that the server can receive the second image information quickly; the server can also control the direction of motion of the mobile robot according to the surrounding environment information in the second image information. In addition, the server is connected to a display and shows the surrounding image, position information and status information of the mobile robot on the display, so that the user can view the positioning information, motion track information and status information of the mobile robot in real time and then remotely control its direction of motion.
Embodiment 5
Embodiment 5 of the present invention provides a mobile robot recognition and positioning system. Fig. 5 is a structural diagram of the mobile robot recognition and positioning system provided by Embodiment 5 of the present invention. As shown in Fig. 5, the mobile robot recognition and positioning system 50 includes one or more processors 51, which may work individually or collectively. The processor 51 is configured to: obtain image information of the light source carried by a mobile robot; and recognize and position the mobile robot according to the image information of the light source.
In the embodiment of the present invention, the mobile robot recognition and positioning system 50 also includes an image sensor 52 in communication connection with the processor 51. The image sensor 52 is used to capture the image information of the light source and transmit the image information of the light source to the processor 51.
In addition, the light source includes at least one of the following: a plurality of LED lamps, a fluorescent lamp, and an infrared source.
The specific principle and implementation of the mobile robot recognition and positioning system provided by Embodiment 5 are similar to those of Embodiment 1 and are not repeated here.
In this embodiment, the color, shape and position of the light source in the image information are determined from the image information of the light source carried by the mobile robot. The robot can be recognized from the color and shape of the light source in the image information, and the position of the mobile robot in the site can be determined from the position of the light source relative to the image information, which improves the accuracy of recognizing the mobile robot and, at the same time, its positioning accuracy.
Embodiment 6
Embodiment 6 of the present invention provides a mobile robot recognition and positioning system. On the basis of the technical solution provided by Embodiment 5, the light source is a plurality of LED lamps, and the image information of the light source includes at least one of the following: the color information of the plurality of LED lamps, the arrangement information of the plurality of LED lamps, and the position information of the plurality of LED lamps.
The processor 51 is specifically configured to recognize the mobile robot according to the color information and arrangement information of the plurality of LED lamps.
The processor 51 is specifically configured to position the mobile robot according to the position information of the plurality of LED lamps.
The specific principle and implementation of the mobile robot recognition and positioning system provided by Embodiment 6 are similar to those of Embodiment 2 and are not repeated here.
In this embodiment, the color, shape and position of the light source in the image information are determined from the image information of the light source carried by the mobile robot. The robot can be recognized from the color and shape of the light source in the image information, and the position of the mobile robot in the site can be determined from the position of the light source relative to the image information, which improves the accuracy of recognizing the mobile robot and, at the same time, its positioning accuracy.
Embodiment 7
Embodiment 7 of the present invention provides a mobile robot recognition and positioning system. On the basis of the technical solution provided by Embodiment 6, the plurality of LED lamps form an editable LED array.
The image sensor 52 is specifically used to capture the first image information of the mobile robot, where the first image information includes image information of the LED array.
Accordingly, the processor 51 is specifically configured to recognize the mobile robot according to the color information and arrangement information of the LED array corresponding to the first image information.
The processor 51 is also specifically configured to determine the position of the mobile robot in the site corresponding to the first image information according to the position information of the LED array in the first image information.
The processor 51 is also configured to determine the orientation of the mobile robot in the site corresponding to the first image information according to the position information of each LED lamp of the LED array in the first image information.
The specific principle and implementation of the mobile robot recognition and positioning system provided by Embodiment 7 are similar to those of Embodiment 3 and are not repeated here.
In the embodiment of the present invention, the mobile robot carries an LED array, and image information of the mobile robot carrying the LED array is obtained. The mobile robot can be recognized accurately from the color and arrangement of the lit LED lamps of the array in the image information, and positioned accurately from the position information of the LED array in the image information; in addition, the orientation of the mobile robot in the image information can be further determined from the direction of the figure formed by the lit LED lamps of the array.
Embodiment 8
Embodiment 8 of the present invention provides a mobile robot recognition and positioning system. On the basis of the technical solution provided by Embodiment 7, the site is divided into multiple sub-sites, and one first camera is arranged above each sub-site; accordingly, the processor 51 obtains, through the first camera above each sub-site, the first image information of the mobile robots in that sub-site.
Fig. 6 is a structural diagram of the mobile robot recognition and positioning system provided by Embodiment 8 of the present invention. As shown in Fig. 6, the mobile robot recognition and positioning system 50 also includes a wireless receiving device 53 in communication connection with the processor 51; the wireless receiving device 53 is used to receive the second image information captured and sent by the second camera carried by the mobile robot.
The processor 51 is also configured to determine the surrounding environment information of the mobile robot according to the second image information, and to generate, according to the surrounding environment information of the mobile robot, a control command for controlling the direction of motion of the mobile robot.
As shown in Fig. 6, the mobile robot recognition and positioning system 50 also includes a wireless transmitting device 54 in communication connection with the processor 51; the wireless transmitting device 54 is used to send the control command to the mobile robot, so that the mobile robot changes its direction of motion according to the control command.
Further, the second image information is uncompressed image information.
The wireless receiving device 53 is also used to receive the electrical parameter information of the power supply and/or the remaining life information sent by the mobile robot.
As shown in Fig. 6, the mobile robot recognition and positioning system 50 also includes a display 55 in communication connection with the processor 51; the display 55 is used to display the surrounding image of the mobile robot, the position information of the mobile robot and the status information of the mobile robot.
Further, the position information of the mobile robot and the status information of the mobile robot are displayed in embedded form within the surrounding image of the mobile robot.
The position information of the mobile robot includes at least one of the following: the positioning information of the mobile robot and the motion track information of the mobile robot.
The status information of the mobile robot includes at least one of the following: the identification information of the mobile robot, the orientation information of the mobile robot, the electrical parameter information of the power supply of the mobile robot, and the remaining life information of the mobile robot.
The specific principle and implementation of the mobile robot recognition and positioning system provided by Embodiment 8 are similar to those of Embodiment 4 and are not repeated here.
In the embodiment of the present invention, the server receives the second image information captured by the second camera carried by the mobile robot. The second image information is uncompressed, which reduces the transmission delay of the image information and ensures that the server can receive the second image information quickly; the server can also control the direction of motion of the mobile robot according to the surrounding environment information in the second image information. In addition, the server is connected to a display and shows the surrounding image, position information and status information of the mobile robot on the display, so that the user can view the positioning information, motion track information and status information of the mobile robot in real time and then remotely control its direction of motion.
Embodiment 9
Embodiment 9 of the present invention provides a camera. Fig. 7 is a structural diagram of the camera provided by Embodiment 9 of the present invention. As shown in Fig. 7, the camera is provided with a lens module 89 and further includes a housing 90, a plurality of LED lamps 91 and a controller 92, where the outer surface of the housing 90 is provided with a lamp window 93; the plurality of LED lamps 91 are arranged inside the housing 90, and the light they emit passes through the lamp window 93; the controller 92 is electrically connected to the plurality of LED lamps 91; and the controller 92 drives the LED lamps to light up and controls the working state of the plurality of LED lamps 91.
The working state includes at least one of the following: the luminous color of the LED lamps and the arrangement of the lit LED lamps.
The LED lamps are RGB LED lamps, and the controller controls the luminous color of the RGB LED lamps.
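A minimal sketch of the controller's role follows: it drives an array of RGB LEDs and sets their working state, i.e. luminous color plus arrangement of lit LEDs. The write_pixel placeholder and the grid addressing are assumptions; a pattern such as the digit sketch shown under Embodiment 3 could be passed in as the arrangement.

```python
# Hypothetical sketch: the controller owns an array of RGB LEDs and sets its
# working state (luminous color and arrangement of lit LEDs). write_pixel()
# stands in for the actual LED drive circuitry, which the patent does not detail.

class LedArrayController:
    def __init__(self, rows: int, cols: int):
        self.rows, self.cols = rows, cols

    def write_pixel(self, row: int, col: int, rgb: tuple[int, int, int]) -> None:
        print(f"LED({row},{col}) <- {rgb}")   # placeholder for hardware access

    def set_working_state(self, pattern: list[list[int]],
                          color: tuple[int, int, int]) -> None:
        """Light the LEDs marked 1 in `pattern` in `color`; switch the rest off."""
        for r in range(self.rows):
            for c in range(self.cols):
                lit = r < len(pattern) and c < len(pattern[r]) and pattern[r][c]
                self.write_pixel(r, c, color if lit else (0, 0, 0))

# Show a blue arrangement (here the same numeral-2 pattern sketched earlier).
controller = LedArrayController(rows=5, cols=3)
controller.set_working_state([[1, 1, 1], [0, 0, 1], [1, 1, 1], [1, 0, 0], [1, 1, 1]],
                             (0, 0, 255))
```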
The camera provided by the embodiments of the present invention is provided with a plurality of LED lamps; the controller of the camera drives the LED lamps to light up and controls their working state, so that the LED lamps emit light in a preset pattern. By recognizing image information of the preset pattern, the mobile robot on which the camera is mounted can be recognized, so that mobile robots can be recognized conveniently. At the same time, since the camera serves as a module of the recognition system, it is easy to install and remove.
Embodiment 10
Embodiment 10 of the present invention provides a camera. Fig. 8 is a structural diagram of the camera provided by Embodiment 10 of the present invention. As shown in Fig. 8, on the basis of the embodiment shown in Fig. 7, the camera also includes an image transmission device 94 connected to the lens module 89; the image transmission device 94 is used to transmit the original image captured by the lens module 89 directly, without compression.
In addition, the internal cavity of the housing 90 is separated into a lamp chamber 100 and a lens chamber 101; the plurality of LED lamps 91 are arranged in the lamp chamber 100, and the lens module 89 is arranged in the lens chamber 101.
In the embodiment of the present invention, the original image captured by the lens module is transmitted directly without compression, which saves the time needed for image compression and decompression, reduces image transmission delay and improves image transmission efficiency.
Embodiment 11
Embodiment 11 of the present invention provides a camera. Fig. 9A is an exploded view of the camera provided by Embodiment 11 of the present invention. As shown in Fig. 9A, the camera includes a top cover 43, an upper cover 37 and a base 38; the upper cover 37 is located between the top cover 43 and the base 38, the upper cover 37 and the top cover 43 together form the lamp chamber 100 shown in Fig. 8, and the upper cover 37 and the base 38 together form the lens chamber 101 shown in Fig. 8.
As shown in Fig. 9A, the lens protection glass 30 and the lens 31 form the lens module 89 of Fig. 7 or Fig. 8; the LED windows 32 are specifically the lamp window 93 of Fig. 7 or Fig. 8, one LED lamp 33 corresponding to one LED window 32, and the plurality of LED lamps forming the editable LED array; the image transmission board 34 is specifically the image transmission device 94 of Fig. 8; the line 35 can be used to fix the camera to the fuselage of the mobile robot; the wire casing clamps 36 can be used to connect the upper cover 37 and the base 38; the camera board 39 can be used to fix the lens 31; the 5.8G antenna 40 can be used to wirelessly transmit the image information, position information, status information, etc. of the mobile robot to the server; the power panel 41 can be used to supply power to the camera; and the fan 42 can be used to cool the LED lamps 33 so that they do not burn out from continuous lighting, allowing the mobile robot to be recognized and positioned accurately.
Fig. 9B is a left view of the camera provided by Embodiment 11; as shown in Fig. 9B, the left view of the camera shows the lens protection glass 30 of Fig. 9A. Fig. 9C is a front view of the camera provided by Embodiment 11; Fig. 9D is a top view of the camera provided by Embodiment 11; Fig. 9E is an axonometric view of the camera provided by Embodiment 11.
In the embodiment of the present invention, the internal cavity of the camera housing is divided into a lamp chamber and a lens chamber, with the plurality of LED lamps arranged in the lamp chamber and the lens module in the lens chamber. Separating the plurality of LED lamps from the lens module prevents the light at the lens module from interfering with the light emitted by the plurality of LED lamps, further improving the accuracy of recognizing the mobile robot.
Embodiment 12
Embodiment 12 of the present invention provides a mobile robot. Fig. 10 is a structural diagram of the mobile robot provided in Embodiment 12. In the present embodiment, the mobile robot is described by taking a remote-controlled chassis vehicle as an example.
As shown in Fig. 10, the mobile robot 1 includes a fuselage 1003, a moving device 1001, and a camera 1002. The moving device 1001 is connected to the fuselage and is configured to provide the power for moving the fuselage; the camera 1002 is mounted on the top of the fuselage and may specifically be the camera described in any one of Embodiment 9, Embodiment 10, or Embodiment 11. As shown in Fig. 7, the camera is provided with a lens module 89 and further includes a housing 90, multiple LEDs 91, and a controller 92. A lamp window 93 is provided on the outer surface of the housing 90; the multiple LEDs 91 are arranged in the housing 90, and the light they emit passes through the lamp window 93; the controller 92 is electrically connected to the multiple LEDs 91, drives the LEDs 91 to light up, and controls the working state of the multiple LEDs 91.
The working state includes at least one of the following: the glow color of the LEDs, and the arrangement of the lit LEDs among the multiple LEDs.
The LEDs are RGB LEDs, and the controller controls the glow color of the RGB LEDs.
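To make the idea of an "editable LED array" concrete, the following Python sketch shows how a controller might light a distinct color-and-arrangement pattern so that each mobile robot carries its own visual identity. The LedArrayDriver class and its set_pixel/show methods are a hypothetical abstraction of the controller 92; the patent does not specify any particular driver interface, and the example pattern values are purely illustrative.

```python
# Minimal sketch of driving an editable RGB LED array so a mobile robot
# displays a unique color/arrangement pattern. LedArrayDriver is a
# hypothetical interface; the patent does not define the controller's API.
from typing import List, Tuple

Color = Tuple[int, int, int]  # (R, G, B), each 0-255


class LedArrayDriver:
    """Assumed low-level driver for an addressable RGB LED array."""

    def __init__(self, num_leds: int):
        self.num_leds = num_leds
        self._buffer: List[Color] = [(0, 0, 0)] * num_leds

    def set_pixel(self, index: int, color: Color) -> None:
        # Stage the color of one LED in the output buffer.
        self._buffer[index] = color

    def show(self) -> None:
        # A real controller would push this buffer to the LEDs over
        # SPI/PWM; here it only prints the staged state.
        print(self._buffer)


def apply_pattern(driver: LedArrayDriver, lit_indices: List[int], color: Color) -> None:
    """Light only the LEDs listed in lit_indices with the given color."""
    for i in range(driver.num_leds):
        driver.set_pixel(i, color if i in lit_indices else (0, 0, 0))
    driver.show()


if __name__ == "__main__":
    driver = LedArrayDriver(num_leds=8)
    # Example identity pattern for one robot: red LEDs at positions 0, 3 and 5.
    apply_pattern(driver, lit_indices=[0, 3, 5], color=(255, 0, 0))
```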
As shown in Fig. 8, the camera further includes an image transmission device 94 connected to the lens module 89. The image transmission device 94 is used to transmit the original image captured by the lens module 89 directly, without compression.
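As a rough illustration of forwarding a captured frame without compression, the sketch below sends the raw pixel buffer of a frame over a TCP socket preceded by a small size header. The framing format, the server address, and the choice of TCP are assumptions for illustration only; the patent does not specify how the image transmission device 94 encodes or carries the data.

```python
# Minimal sketch of sending an uncompressed frame, assuming a hypothetical
# header-plus-raw-bytes protocol; not the patent's actual transmission scheme.
import socket
import struct

import numpy as np


def send_raw_frame(sock: socket.socket, frame: np.ndarray) -> None:
    """Send an uncompressed BGR frame as a size header followed by raw pixels."""
    height, width, channels = frame.shape
    payload = frame.tobytes()  # no compression: raw pixel data as captured
    header = struct.pack("!IIII", height, width, channels, len(payload))
    sock.sendall(header + payload)


if __name__ == "__main__":
    # Hypothetical address of the ground-station receiver.
    with socket.create_connection(("192.168.1.10", 9000)) as sock:
        dummy = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in captured frame
        send_raw_frame(sock, dummy)
```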
In addition, the internal cavity of the housing 90 is separated into a lamp chamber 100 and a lens chamber 101; the multiple LEDs 91 are arranged in the lamp chamber 100, and the lens module 89 is arranged in the lens chamber 101.
As shown in Fig. 9A, the camera includes a top cover, an upper cover, and a base, the upper cover being located between the top cover and the base. The upper cover and the top cover together enclose the lamp chamber 100 shown in Fig. 8, and the upper cover and the base together enclose the lens chamber 101 shown in Fig. 8.
As shown in Fig. 9A, the lens protection glass 30 and the lens 31 constitute the lens module 89 of Fig. 7 or Fig. 8. The LED windows 32 are specifically the lamp windows 93 of Fig. 7 or Fig. 8, one LED 33 corresponding to one LED window 32, and the multiple LEDs forming an editable LED array. The image transmission board 34 is specifically the image transmission device 94 of Fig. 8. The line 35 can be used to fix the camera to the fuselage of the mobile robot 1; the wire-slot clamp 36 can be used to connect the upper cover 37 and the base 38; the camera board 39 can be used to fix the lens 31; the 5.8 GHz antenna 40 can be used to wirelessly transmit image information, position information of the mobile robot 1, status information, and the like to the server; the power board 41 can be used to supply power to the camera; and the fan 42 can be used to cool the LEDs 33, preventing the LEDs 33 from burning out due to continuous illumination, so that the mobile robot 1 can be accurately recognized and positioned.
In the present embodiment, from the image information of the light source carried by the mobile robot 1, the color and shape of the light source in the image information and the position of the light source relative to the image information are determined. The robot can be recognized from the color and shape of the light source in the image information, and the position of the mobile robot 1 in the venue can be determined from the position of the light source relative to the image information, which improves the precision of recognizing the mobile robot 1 and, at the same time, improves the positioning precision of the mobile robot 1.
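The recognition and positioning logic described above can be illustrated with the following Python sketch, which assumes OpenCV (cv2) and NumPy are available. The HSV color thresholds, the linear pixel-to-venue scale, and the input file name are illustrative assumptions; the patent does not prescribe a specific detection algorithm or calibration model.

```python
# Minimal sketch of recognizing/positioning a robot from LED image information.
# Assumes OpenCV and NumPy; thresholds and scale are illustrative only.
import cv2
import numpy as np


def detect_red_leds(image_bgr: np.ndarray) -> list:
    """Return pixel centroids of bright red blobs (candidate LEDs)."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two ranges (assumed thresholds).
    mask = cv2.bitwise_or(
        cv2.inRange(hsv, (0, 120, 120), (10, 255, 255)),
        cv2.inRange(hsv, (170, 120, 120), (180, 255, 255)),
    )
    # findContours returns 2 values in OpenCV 4 and 3 values in OpenCV 3.
    found = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = found[0] if len(found) == 2 else found[1]
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids


def pixel_to_venue(centroids, metres_per_pixel: float = 0.01):
    """Map image coordinates to venue coordinates with an assumed linear scale."""
    return [(u * metres_per_pixel, v * metres_per_pixel) for u, v in centroids]


if __name__ == "__main__":
    frame = cv2.imread("overhead_frame.png")  # hypothetical overhead-camera frame
    if frame is not None:
        leds = detect_red_leds(frame)
        print("LED pixel positions:", leds)
        print("Estimated venue position (mean of LED centroids):",
              np.mean(pixel_to_venue(leds), axis=0) if leds else None)
```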
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the device embodiments described above are merely schematic; the division into units is only a division by logical function, and other divisions are possible in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection of devices or units through some interfaces, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform some of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Those skilled in the art can clearly understand that, for convenience and brevity of description, only the division into the above functional modules is used as an example. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. For the specific working process of the device described above, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent substitutions for some or all of the technical features; such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (53)

1. A mobile robot recognition and positioning method, characterized by comprising:
obtaining image information of a light source carried by a mobile robot;
recognizing and positioning the mobile robot according to the image information of the light source.
2. The method according to claim 1, characterized in that the light source comprises at least one of the following: multiple LEDs, a fluorescent lamp, and infrared light.
3. The method according to claim 2, characterized in that the light source is multiple LEDs, and the image information of the light source comprises at least one of the following: color information of the multiple LEDs, arrangement information of the multiple LEDs, and position information of the multiple LEDs.
4. The method according to claim 3, characterized in that recognizing and positioning the mobile robot according to the image information of the light source comprises:
recognizing the mobile robot according to the color information and the arrangement information of the multiple LEDs.
5. The method according to claim 3, characterized in that recognizing and positioning the mobile robot according to the image information of the light source comprises:
positioning the mobile robot according to the position information of the multiple LEDs.
6. The method according to claim 4 or 5, characterized in that the multiple LEDs form an editable LED array.
7. The method according to claim 6, characterized in that obtaining the image information of the light source carried by the mobile robot comprises:
obtaining first image information of the mobile robot, wherein the first image information comprises image information of the LED array;
correspondingly, recognizing the mobile robot according to the color information and the arrangement information of the multiple LEDs comprises:
recognizing the mobile robot according to the color information and the arrangement information of the LED array corresponding to the first image information.
8. The method according to claim 7, characterized in that positioning the mobile robot according to the position information of the multiple LEDs comprises:
determining the position of the mobile robot in the venue corresponding to the first image information according to position information of the LED array in the first image information.
9. The method according to claim 8, characterized in that, after determining the position of the mobile robot in the venue corresponding to the first image information according to the position information of the LED array in the first image information, the method further comprises:
determining the orientation of the mobile robot in the venue corresponding to the first image information according to position information of each LED of the LED array in the first image information.
10. The method according to claim 8 or 9, characterized in that the venue is divided into multiple sub-venues, and a first camera is correspondingly arranged above each sub-venue;
obtaining the first image information of the mobile robot comprises:
obtaining, through the first camera above each sub-venue, the first image information of the mobile robot in that sub-venue.
11. The method according to any one of claims 1-5, 8 and 9, characterized by further comprising:
obtaining second image information captured by a second camera carried by the mobile robot;
determining ambient environment information of the mobile robot according to the second image information;
controlling the movement direction of the mobile robot according to the ambient environment information of the mobile robot.
12. The method according to claim 11, characterized in that the second image information is uncompressed image information.
13. The method according to claim 12, characterized by further comprising:
receiving electrical parameter information and/or remaining life information of the power source, sent by the mobile robot.
14. The method according to claim 13, characterized by further comprising:
displaying the environment image of the mobile robot, the position information of the mobile robot, and the status information of the mobile robot.
15. The method according to claim 14, characterized in that the position information of the mobile robot and the status information of the mobile robot are displayed in embedded form in the environment image of the mobile robot.
16. The method according to claim 15, characterized in that the position information of the mobile robot comprises at least one of the following: location information of the mobile robot, and motion track information of the mobile robot.
17. The method according to claim 16, characterized in that the status information of the mobile robot comprises at least one of the following: identification information of the mobile robot, orientation information of the mobile robot, electrical parameter information of the power source of the mobile robot, and remaining life information of the mobile robot.
18. A mobile robot recognition and positioning system, characterized by comprising one or more processors, working individually or jointly, the processor being configured to:
obtain image information of a light source carried by a mobile robot;
recognize and position the mobile robot according to the image information of the light source.
19. The mobile robot recognition and positioning system according to claim 18, characterized by further comprising:
an image sensor in communication connection with the processor, the image sensor being configured to capture the image information of the light source and to send the image information of the light source to the processor.
20. The mobile robot recognition and positioning system according to claim 19, characterized in that the light source comprises at least one of the following: multiple LEDs, a fluorescent lamp, and infrared light.
21. The mobile robot recognition and positioning system according to claim 20, characterized in that the light source is multiple LEDs, and the image information of the light source comprises at least one of the following: color information of the multiple LEDs, arrangement information of the multiple LEDs, and position information of the multiple LEDs.
22. The mobile robot recognition and positioning system according to claim 21, characterized in that the processor is specifically configured to recognize the mobile robot according to the color information and the arrangement information of the multiple LEDs.
23. The mobile robot recognition and positioning system according to claim 21, characterized in that the processor is specifically configured to position the mobile robot according to the position information of the multiple LEDs.
24. The mobile robot recognition and positioning system according to claim 22 or 23, characterized in that the multiple LEDs form an editable LED array.
25. The mobile robot recognition and positioning system according to claim 24, characterized in that the image sensor is specifically configured to capture first image information of the mobile robot, wherein the first image information comprises image information of the LED array;
correspondingly, the processor is specifically configured to recognize the mobile robot according to the color information and the arrangement information of the LED array corresponding to the first image information.
26. The mobile robot recognition and positioning system according to claim 25, characterized in that the processor is further specifically configured to determine the position of the mobile robot in the venue corresponding to the first image information according to position information of the LED array in the first image information.
27. The mobile robot recognition and positioning system according to claim 26, characterized in that the processor is further configured to determine the orientation of the mobile robot in the venue corresponding to the first image information according to position information of each LED of the LED array in the first image information.
28. The mobile robot recognition and positioning system according to claim 26 or 27, characterized in that the venue is divided into multiple sub-venues, and a first camera is correspondingly arranged above each sub-venue;
correspondingly, the processor obtains, through the first camera above each sub-venue, the first image information of the mobile robot in that sub-venue.
29. The mobile robot recognition and positioning system according to any one of claims 18-23, 26 and 27, characterized by further comprising:
a radio receiver in communication connection with the processor, the radio receiver being configured to receive second image information captured and sent by a second camera carried by the mobile robot;
the processor being further configured to determine ambient environment information of the mobile robot according to the second image information, and to generate, according to the ambient environment information of the mobile robot, a control command for controlling the movement direction of the mobile robot.
30. The mobile robot recognition and positioning system according to claim 29, characterized by further comprising:
a wireless base station apparatus in communication connection with the processor, the wireless base station apparatus being configured to send the control command to the mobile robot, so that the mobile robot changes its movement direction according to the control command.
31. The mobile robot recognition and positioning system according to claim 30, characterized in that the second image information is uncompressed image information.
32. The mobile robot recognition and positioning system according to claim 31, characterized in that the radio receiver is further configured to receive electrical parameter information and/or remaining life information of the power source, sent by the mobile robot.
33. The mobile robot recognition and positioning system according to claim 32, characterized by further comprising:
a display in communication connection with the processor, the display being configured to display the environment image of the mobile robot, the position information of the mobile robot, and the status information of the mobile robot.
34. The mobile robot recognition and positioning system according to claim 33, characterized in that the position information of the mobile robot and the status information of the mobile robot are displayed in embedded form in the environment image of the mobile robot.
35. The mobile robot recognition and positioning system according to claim 34, characterized in that the position information of the mobile robot comprises at least one of the following: location information of the mobile robot, and motion track information of the mobile robot.
36. The mobile robot recognition and positioning system according to claim 35, characterized in that the status information of the mobile robot comprises at least one of the following: identification information of the mobile robot, orientation information of the mobile robot, electrical parameter information of the power source of the mobile robot, and remaining life information of the mobile robot.
37. A camera provided with a lens module, characterized in that the camera further comprises:
a housing, an outer surface of the housing being provided with a lamp window;
multiple LEDs arranged in the housing, the light they emit passing through the lamp window;
a controller electrically connected to the multiple LEDs;
wherein the controller drives the LEDs to light up and controls a working state of the multiple LEDs.
38. The camera according to claim 37, characterized in that the working state comprises at least one of the following: the glow color of the LEDs, and the arrangement of the lit LEDs among the multiple LEDs.
39. The camera according to claim 38, characterized in that the LEDs are RGB LEDs, and the controller controls the glow color of the RGB LEDs.
40. The camera according to claim 39, characterized by further comprising:
an image transmission device connected to the lens module, the image transmission device being configured to transmit the original image captured by the lens module directly, without compression.
41. The camera according to claim 40, characterized in that an internal cavity of the housing is separated into a lamp chamber and a lens chamber, the multiple LEDs being arranged in the lamp chamber and the lens module being arranged in the lens chamber.
42. The camera according to claim 41, characterized in that the housing comprises a top cover, an upper cover, and a base, the upper cover being located between the top cover and the base, the upper cover and the top cover together forming the lamp chamber, and the upper cover and the base together forming the lens chamber.
43. The camera according to any one of claims 37-42, characterized in that one LED corresponds to one lamp window.
44. The camera according to any one of claims 37-42, characterized in that the multiple LEDs form an editable LED array.
45. A mobile robot, characterized by comprising:
a fuselage;
a moving device connected to the fuselage and configured to provide power for moving the fuselage;
a camera mounted on the top of the fuselage, the camera being provided with a lens module and further comprising:
a housing, an outer surface of the housing being provided with a lamp window;
multiple LEDs arranged in the housing, the light they emit passing through the lamp window;
a controller electrically connected to the multiple LEDs;
wherein the controller drives the LEDs to light up and controls a working state of the multiple LEDs.
46. The mobile robot according to claim 45, characterized in that the working state comprises at least one of the following: the glow color of the LEDs, and the arrangement of the lit LEDs among the multiple LEDs.
47. The mobile robot according to claim 46, characterized in that the LEDs are RGB LEDs, and the controller controls the glow color of the RGB LEDs.
48. The mobile robot according to claim 47, characterized in that the camera further comprises an image transmission device connected to the lens module, the image transmission device being configured to transmit the original image captured by the lens module directly, without compression.
49. The mobile robot according to claim 48, characterized in that an internal cavity of the housing is separated into a lamp chamber and a lens chamber, the multiple LEDs being arranged in the lamp chamber and the lens module being arranged in the lens chamber.
50. The mobile robot according to claim 49, characterized in that the housing comprises a top cover, an upper cover, and a base, the upper cover being located between the top cover and the base, the upper cover and the top cover together forming the lamp chamber, and the upper cover and the base together forming the lens chamber.
51. The mobile robot according to any one of claims 45-50, characterized in that one LED corresponds to one lamp window.
52. The mobile robot according to any one of claims 45-50, characterized in that the multiple LEDs form an editable LED array.
53. The mobile robot according to any one of claims 45-50, characterized in that the mobile robot is a remote-controlled chassis vehicle.
CN201680002465.8A 2016-06-07 2016-06-07 Mobile robot recognition positioning method, device, system and mobile robot Pending CN107076557A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/085140 WO2017210866A1 (en) 2016-06-07 2016-06-07 Mobile robot identification and positioning method, device and system, and mobile robot

Publications (1)

Publication Number Publication Date
CN107076557A true CN107076557A (en) 2017-08-18

Family

ID=59623424

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680002465.8A Pending CN107076557A (en) 2016-06-07 2016-06-07 Mobile robot recognition positioning method, device, system and mobile robot

Country Status (2)

Country Link
CN (1) CN107076557A (en)
WO (1) WO2017210866A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109008806B (en) * 2018-06-25 2023-06-20 东莞市光劲光电有限公司 Floor sweeping robot positioning system and method based on LED intelligent lamp positioning

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1297394A (en) * 1999-03-24 2001-05-30 索尼公司 Robot
CN2613661Y (en) * 2003-03-17 2004-04-28 上海金星报警器材厂 LED guiding mark lamp
US20090048772A1 (en) * 2007-08-13 2009-02-19 Samsung Electronics Co., Ltd. Method and apparatus for searching target location
CN101537618A (en) * 2008-12-19 2009-09-23 北京理工大学 Visual system for ball picking robot in stadium
CN102306026A (en) * 2010-08-02 2012-01-04 重庆大学 Perception system of soccer robot
CN102542294A (en) * 2011-12-29 2012-07-04 河海大学常州校区 Centralized control type soccer robot identification system and identification method for double visual information fusion
CN102661796A (en) * 2012-04-17 2012-09-12 中北大学 Active photoelectric marking method for MEMS infrared light supply array
CN103413450A (en) * 2013-08-12 2013-11-27 成都谱视科技有限公司 Positioning device
CN103837147A (en) * 2014-03-13 2014-06-04 北京理工大学 Active infrared dot-matrix type artificial road sign, intelligent body locating system and intelligent body locating method
CN103970134A (en) * 2014-04-16 2014-08-06 江苏科技大学 Multi-mobile-robot system collaborative experimental platform and visual segmentation and positioning method thereof
CN105573316A (en) * 2015-12-01 2016-05-11 武汉科技大学 Autonomous-formation mobile swarm robot
CN105527960A (en) * 2015-12-18 2016-04-27 燕山大学 Mobile robot formation control method based on leader-follow
CN105352483A (en) * 2015-12-24 2016-02-24 吉林大学 Automotive body pose parameter detection system based on LED arrays
CN205660739U (en) * 2016-06-07 2016-10-26 深圳市大疆创新科技有限公司 But but camera that mobile robot used and mobile robot

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
周跃前: "基于多机并行的大场地机器人足球视觉系统的研究", 《中国优秀硕士学位论文全文数据库(电子期刊)》 *
徐德,谭民,李原: "《机器人视觉测量与控制》", 31 January 2016 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107967437A (en) * 2017-11-17 2018-04-27 深圳市易成自动驾驶技术有限公司 A kind of image processing method, device and computer-readable recording medium
CN107967437B (en) * 2017-11-17 2021-04-20 深圳市易成自动驾驶技术有限公司 Image processing method and device and computer readable storage medium
CN107958144A (en) * 2017-12-18 2018-04-24 王军 Unmanned plane identification system, recognition methods and control device
CN108781258A (en) * 2018-02-12 2018-11-09 深圳前海达闼云端智能科技有限公司 Environment information determination method, device, robot and storage medium
CN108781258B (en) * 2018-02-12 2021-05-28 达闼机器人有限公司 Environment information determination method, device, robot and storage medium
CN110297105A (en) * 2018-03-23 2019-10-01 丰田自动车株式会社 Moving body
CN110297105B (en) * 2018-03-23 2022-02-18 丰田自动车株式会社 Moving body
US11419193B2 (en) 2018-03-23 2022-08-16 Toyota Jidosha Kabushiki Kaisha Moving body
CN110352117A (en) * 2018-04-25 2019-10-18 深圳市大疆创新科技有限公司 Intelligent game place and system, system server, robot, control method
CN112305499A (en) * 2019-08-02 2021-02-02 华为技术有限公司 Method and device for positioning according to light source
CN111844041A (en) * 2020-07-23 2020-10-30 上海商汤临港智能科技有限公司 Positioning assistance device, robot and visual positioning system
CN111844041B (en) * 2020-07-23 2021-11-09 上海商汤临港智能科技有限公司 Positioning assistance device, robot and visual positioning system

Also Published As

Publication number Publication date
WO2017210866A1 (en) 2017-12-14

Similar Documents

Publication Publication Date Title
CN107076557A (en) Mobile robot recognition positioning method, device, system and mobile robot
US8818083B2 (en) System of drones provided with recognition beacons
US9635737B2 (en) Directional lighting system and method
CN100419376C (en) Machine vision detecting system and method
US20200389951A1 (en) Directional lighting system and method
JP3779308B2 (en) Camera calibration system and three-dimensional measurement system
CN110476148B (en) Display system and method for providing multi-view content
CN106595639A (en) Positioning system and positioning method and device thereof and robot
US20150241176A1 (en) Adaptive visual camouflage
US10171735B2 (en) Panoramic vision system
CN205660739U (en) But but camera that mobile robot used and mobile robot
US20180259171A1 (en) Light Ball Apparatus and Method
CN105517643B (en) For the reminding method and device and remote control battlebus of the information for pointing out substantial game role
CN110493569B (en) Monitoring target shooting tracking method and system
Soetedjo et al. Detecting laser spot in shooting simulator using an embedded camera
CN217745691U (en) Unmanned aerial vehicle fight system
CN206674127U (en) A kind of more camera lens photometric stereo camera devices of centralization
CN110430420A (en) A kind of five face CAVE display system integrated approaches based on small spacing LED
CN113082689B (en) Universal positioning system and method for firing target paper targets by ball firing and laser
EP3890300A1 (en) A self-propelled vehicle
CN109974531B (en) Video accurate target reading system and video accurate target reading method
CN108731677B (en) Robot navigation road sign and identification method
CN216448707U (en) Laser electronic target and target shooting system
CN213724805U (en) Wireless connection's infrared positioning system
CN109244272B (en) Terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170818

RJ01 Rejection of invention patent application after publication