CN205660739U - Camera for a mobile robot, and mobile robot - Google Patents

Camera for a mobile robot, and mobile robot

Info

Publication number
CN205660739U
CN205660739U
Authority
CN
China
Prior art keywords
mobile robot
led
camera
information
image information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201620552778.8U
Other languages
Chinese (zh)
Inventor
包玉奇
贝世猛
戚晓林
苗向鹏
梁博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Dajiang Innovations Technology Co Ltd
Original Assignee
Shenzhen Dajiang Innovations Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Dajiang Innovations Technology Co Ltd filed Critical Shenzhen Dajiang Innovations Technology Co Ltd
Priority to CN201620552778.8U priority Critical patent/CN205660739U/en
Application granted granted Critical
Publication of CN205660739U publication Critical patent/CN205660739U/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Manipulator (AREA)

Abstract

An embodiment of the utility model provides a camera for a mobile robot, and a mobile robot. The camera is provided with a lens module and further includes a housing, a plurality of LED lamps and a controller, where the plurality of LED lamps are arranged on the housing, the controller is installed in the housing and electrically connected with the plurality of LED lamps, and the controller drives the LED lamps to emit light and controls the operating state of the plurality of LED lamps so that the plurality of LED lamps emit light in a preset pattern. Because a plurality of LED lamps are arranged on the camera and the controller of the camera drives them to emit light in a preset pattern, the mobile robot carrying the camera can be identified by recognizing image information of the preset pattern, so the mobile robot can be identified conveniently. Moreover, since the camera serves as a module of the identification system, it is easy to install and remove.

Description

Camera for a mobile robot, and mobile robot
Technical field
Embodiments of the utility model relate to the field of robots, and in particular to a camera for a mobile robot and to a mobile robot.
Background technology
As devices that execute work automatically, robots can both accept human command and run pre-programmed routines, and they play an important role in manufacturing, construction and high-risk operations.
When multiple robots appear at the same site at the same time, each robot needs to be identified and located. In the prior art, in order to distinguish the robots, each robot is provided with a different sign board, one sign board identifying one robot. In addition, in order to locate each robot, a positioning device is installed on every robot; the positioning device receives multiple wireless signals and sends the strength information of each wireless signal to a server, and the server determines the location of the positioning device, i.e. the location of the corresponding robot, from that signal-strength information.
When a robot is moving or colliding with another robot, the sign board it carries easily falls off, making the robot's identity hard to recognize. Furthermore, the identification system of an existing mobile robot is integrated directly into the robot, so it is relatively complicated to install and remove.
Utility model content
Embodiments of the utility model provide a camera for a mobile robot and a mobile robot, to solve the technical problems in the prior art that a robot's identity is difficult to recognize and that the identification system of an existing mobile robot is difficult to install and remove.
One aspect of the embodiments of the utility model provides a camera provided with a lens module, the camera further including:
a housing;
a plurality of LED lamps arranged on the housing; and
a controller arranged in the housing and electrically connected with the plurality of LED lamps;
wherein the controller drives the LED lamps to emit light and controls the operating state of the plurality of LED lamps, so that the plurality of LED lamps emit light in a preset pattern.
Another aspect of the embodiments of the utility model provides a mobile robot, including:
a fuselage;
a mobile device connected with the fuselage and used to provide the power for moving the fuselage; and
the camera described above, the camera being arranged on the top of the fuselage.
Another aspect of the embodiments of the utility model provides a mobile robot identification and positioning system, including:
the mobile robot described above, and an identification and positioning device;
wherein the identification and positioning device includes one or more processors and an image sensor communicatively connected with the processor;
the image sensor is used to capture image information of the LED lamps of the camera and transmit the image information of the LED lamps of the camera to the processor;
the processor is used to identify and position the mobile robot according to the image information of the LED lamps of the camera.
In the camera provided by the embodiments of the utility model, a plurality of LED lamps are arranged on the camera, and the controller of the camera drives the LED lamps to emit light and controls their operating state so that they emit light in a preset pattern. By recognizing image information of the preset pattern, the mobile robot carrying the camera can be identified, so the mobile robot can be identified conveniently. Moreover, since the camera serves as a module of the identification system, it is easy to install and remove.
Brief description of the drawings
Fig. 1 is a flow chart of the mobile robot identification and positioning method provided by embodiment one of the utility model;
Fig. 1A is a network architecture diagram to which the mobile robot identification and positioning method provided by embodiment one of the utility model applies;
Fig. 1B is a schematic diagram of image information provided by embodiment one of the utility model;
Fig. 1C is a schematic diagram of image information provided by embodiment one of the utility model;
Fig. 2 is a flow chart of the mobile robot identification and positioning method provided by embodiment two of the utility model;
Fig. 3 is a flow chart of the mobile robot identification and positioning method provided by embodiment three of the utility model;
Fig. 3A is a schematic diagram of the editable LED array provided by embodiment three of the utility model;
Fig. 3B is a schematic diagram of first image information provided by embodiment three of the utility model;
Fig. 3C is a schematic diagram of first image information provided by embodiment three of the utility model;
Fig. 3D is a schematic diagram of first image information provided by embodiment three of the utility model;
Fig. 4 is a flow chart of the mobile robot identification and positioning method provided by embodiment four of the utility model;
Fig. 4A is a network architecture diagram to which a mobile robot identification and positioning method provided by embodiment four of the utility model applies;
Fig. 5 is a structural diagram of the camera provided by embodiment five of the utility model;
Fig. 6 is a structural diagram of the camera provided by embodiment six of the utility model;
Fig. 7A is an exploded view of the camera provided by embodiment seven of the utility model;
Fig. 7B is a left view of the camera provided by embodiment seven of the utility model;
Fig. 7C is a front view of the camera provided by embodiment seven of the utility model;
Fig. 7D is a top view of the camera provided by embodiment seven of the utility model;
Fig. 7E is an axonometric view of the camera provided by embodiment seven of the utility model;
Fig. 8 is a structural diagram of the mobile robot provided by embodiment eight of the utility model;
Fig. 9 is a structural diagram of the mobile robot identification and positioning system provided by embodiment nine of the utility model;
Fig. 10 is a structural diagram of the identification and positioning device provided by embodiment nine of the utility model;
Fig. 11 is a structural diagram of the identification and positioning device provided by embodiment nine of the utility model.
Reference numerals:
1 - mobile robot; 21 - capture apparatus; 22 - server;
23 - display screen; 51 - processor; 30 - lens protection glass;
31 - lens; 32 - LED lamp window; 33 - LED lamp;
34 - image transmission board; 35 - cable; 36 - wire-slot clamp;
37 - upper cover; 38 - base; 39 - camera board;
40 - 5.8G antenna; 41 - power board; 42 - fan;
43 - top cover; 52 - image sensor; 53 - wireless receiving unit;
54 - wireless transmitting unit; 55 - display; 71 - image of robot B;
72 - image of robot A; 81 to 86 - first cameras; 89 - lens module;
90 - housing; 91 - plurality of LED lamps; 92 - controller;
93 - lamp window; 94 - image transmission device; 100 - lamp chamber;
101 - lens chamber; 1001 - mobile device; 1002 - camera.
Detailed description of the invention
The technical solutions in the embodiments of the utility model will be described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the utility model. Based on the embodiments of the utility model, all other embodiments obtained by those of ordinary skill in the art without creative work fall within the protection scope of the utility model.
It should be noted that when a component is said to be "fixed on" another component, it can be directly on the other component, or an intermediate component may exist. When a component is considered to be "connected" to another component, it can be directly connected to the other component, or an intermediate component may also exist.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the technical field to which the utility model belongs. The terms used in the description of the utility model are only for the purpose of describing specific embodiments and are not intended to limit the utility model. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Some embodiments of the utility model are described in detail below with reference to the drawings. Where there is no conflict, the features in the following embodiments may be combined with one another.
Embodiment one
Embodiment one of the utility model provides a mobile robot identification and positioning method. Fig. 1 is a flow chart of the mobile robot identification and positioning method provided by embodiment one of the utility model. As shown in Fig. 1, the method in this embodiment may include:
Step S101: acquiring image information of a light source carried by a mobile robot.
In the embodiments of the utility model, the mobile robot carries a light source that includes at least one of the following: a plurality of LED lamps, a fluorescent lamp, or infrared light. For example, a plurality of LED lamps may be arranged on the fuselage of the mobile robot, and the lit LED lamps among them can present different colors and arrangements; or fluorescent lamps of different colors and shapes are arranged on the fuselages of different mobile robots; or a plurality of infrared emitting points are arranged on the fuselage of each mobile robot, and different arrangements of the infrared emitting points present different luminous shapes.
Fig. 1A is a network architecture diagram to which the mobile robot identification and positioning method provided by embodiment one of the utility model applies. As shown in Fig. 1A, two mobile robots A and B are within the shooting range of the capture apparatus 21, each carrying any one of the light sources described above. The lens of the capture apparatus 21 is provided with a photosensitive material connected to the processor of the capture apparatus 21. When the photosensitive material senses the light emitted by a light source, it sends an electrical signal to the processor, so that the processor triggers the shutter of the capture apparatus 21, automatically photographs the lit light source and sends the image information of the light source to the server 22. The embodiments of the utility model do not limit the number of robots within the shooting range of the capture apparatus 21.
Step S102: identifying and positioning the mobile robot according to the image information of the light source.
If the two mobile robots shown in Fig. 1A each carry a plurality of LED lamps, and the lit LED lamps can present different colors and arrangements, Fig. 1B is a schematic diagram of the image information provided by embodiment one of the utility model. As shown in Fig. 1B, the image information is that of the light sources photographed by the capture apparatus 21 and contains a red digit 1 and a blue digit 2, both formed by lit LED lamps. If it is specified in advance that red corresponds to the red team and blue corresponds to the blue team, the server 22 can determine from the colors and arrangements presented by the lit LED lamps in the image information that the two mobile robots are No. 1 of the red team and No. 2 of the blue team respectively, thereby achieving friend-or-foe identification. If the capture apparatus 21 photographs only one mobile robot, that robot can also be identified by the method of this step.
If the two mobile robots shown in Fig. 1A each carry a plurality of infrared emitting points, and different arrangements of the infrared emitting points present different luminous shapes, Fig. 1C is a schematic diagram of the image information provided by embodiment one of the utility model. As shown in Fig. 1C, the image information contains a circle and a square, both formed by a plurality of infrared emitting points. If it is specified in advance that the circle corresponds to the red team and the square corresponds to the blue team, the server 22 can determine from the specific shapes formed by the infrared emitting points in the image information that the two mobile robots come from the red team and the blue team respectively, thereby achieving friend-or-foe identification. Likewise, if the capture apparatus 21 photographs only one mobile robot, that robot can be identified by the method of this step.
If the two mobile robots shown in Fig. 1A carry fluorescent lamps of different colors and shapes, the server 22 identifies the mobile robots from the colors and shapes presented by the fluorescent lamps in the image information photographed by the capture apparatus 21.
In addition, the image information shown in Fig. 1B or Fig. 1C may specifically be the image information of the site where the mobile robots are located. The server 22 determines the position of a mobile robot on the site according to the position of its light source in the image. For example, the position of the digit 2 in Fig. 1B can represent the position of No. 2 mobile robot of the blue team on the site photographed by the capture apparatus 21.
Furthermore, the position of a mobile robot changes in real time, and the capture apparatus 21 can acquire the image information of the light source carried by the mobile robot in real time, so that the server 22 determines the position of the mobile robot in real time. As shown in Fig. 1A, the server 22 is also connected with a display screen 23, which can show the position information, identity information and motion-track information of the mobile robots.
It should be noted that Fig. 1B and Fig. 1C are only examples of the image information of the light sources carried by mobile robots, and do not limit the specific form presented by a lit light source.
In this embodiment, from the image information of the light source carried by the mobile robot, the color and shape of the light source and its position relative to the image are determined. The robot can be identified from the color and shape of the light source in the image information, and its position on the site can be determined from the position of the light source relative to the image, which improves the accuracy of identifying the mobile robot as well as the accuracy of positioning it.
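As a purely illustrative sketch of this identification-and-positioning step (not part of the utility model), the following Python fragment finds the team color and the image position of a lit LED pattern in one frame; the HSV thresholds and the minimum blob area are assumed example values.

```python
# Minimal illustration (not part of the utility model): identify team color and
# locate a lit LED pattern in one frame, assuming simple HSV thresholds.
import cv2
import numpy as np

TEAM_RANGES = {                        # hypothetical HSV ranges for the two teams
    "red":  [(np.array([0, 120, 120]),   np.array([10, 255, 255]))],
    "blue": [(np.array([100, 120, 120]), np.array([130, 255, 255]))],
}

def identify_and_locate(frame_bgr):
    """Return (team, centroid) pairs for every bright LED blob in the image."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    results = []
    for team, ranges in TEAM_RANGES.items():
        mask = np.zeros(hsv.shape[:2], dtype=np.uint8)
        for lo, hi in ranges:
            mask |= cv2.inRange(hsv, lo, hi)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if cv2.contourArea(c) < 50:          # ignore small noise blobs
                continue
            m = cv2.moments(c)
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            results.append((team, (cx, cy)))     # centroid stands for the position on the site image
    return results
```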
Embodiment two
Embodiment two of the utility model provides a mobile robot identification and positioning method. On the basis of the technical solution provided by embodiment one, in this embodiment the light source is a plurality of LED lamps, and the image information of the light source includes at least one of the following: color information of the plurality of LED lamps, arrangement information of the plurality of LED lamps, and position information of the plurality of LED lamps. Fig. 2 is a flow chart of the mobile robot identification and positioning method provided by embodiment two of the utility model. As shown in Fig. 2, the method in this embodiment may include:
Step S201: acquiring image information of the plurality of LED lamps carried by the mobile robot.
In this embodiment, the light source carried by the mobile robot is specifically a plurality of LED lamps. The image information of the plurality of LED lamps carried by each mobile robot, as photographed by the capture apparatus 21, includes at least one of the following: the color information of the plurality of LED lamps, the arrangement information of the plurality of LED lamps, and the position information of the plurality of LED lamps, where the position information is specifically the position of the plurality of LED lamps in the image information.
The executing body of this embodiment may be the server 22 in Fig. 1A; the server 22 obtains from the capture apparatus 21 the image information of the plurality of LED lamps carried by the mobile robot. The specific acquisition process is the same as the method in embodiment one above and is not repeated here.
Step S202: identifying the mobile robot according to the color information and arrangement information of the plurality of LED lamps.
Step S203: positioning the mobile robot according to the position information of the plurality of LED lamps.
The method by which the server 22 identifies the mobile robot from the color information and arrangement information of the plurality of LED lamps, and positions it from the position information of the plurality of LED lamps, is the same as the method in embodiment one above and is not repeated here.
In this embodiment, from the image information of the light source carried by the mobile robot, the color and shape of the light source and its position relative to the image are determined; the robot can be identified from the color and shape of the light source in the image information, and its position on the site can be determined from the position of the light source relative to the image, which improves both the identification accuracy and the positioning accuracy of the mobile robot.
Embodiment three
Embodiment three of the utility model provides a mobile robot identification and positioning method. On the basis of the technical solution provided by embodiment two, the plurality of LED lamps form an editable LED array. Fig. 3 is a flow chart of the mobile robot identification and positioning method provided by embodiment three of the utility model. As shown in Fig. 3, the method in this embodiment may include:
Step S301: acquiring first image information of the mobile robot, where the first image information includes the image information of the LED array.
In this embodiment, the first image information is the image information of the mobile robot photographed by the capture apparatus 21. Each mobile robot carries an editable LED array, so the first image information contains not only the mobile robot but also the editable LED array.
Fig. 3A is a schematic diagram of the editable LED array provided by embodiment three of the utility model. As shown in Fig. 3A, the editable LED array includes multiple rows and columns, with an LED lamp at each intersection of a row and a column; the LED lamp at the first row and first column is denoted LED lamp 11. The on/off state and color of each LED lamp are controllable, and the arrangement of the lit LED lamps in the array is also controllable. For example, a white circle represents an unlit LED lamp and a black circle a lit one; the lit LED lamps can present different shapes, such as the Arabic numeral 1 or other characters and digits, and they may show the same color or different colors. The embodiments of the utility model do not limit the size of the LED array.
Fig. 3B is a schematic diagram of the first image information provided by embodiment three of the utility model. As shown in Fig. 3B, the first image information photographed by the capture apparatus 21 includes image 71 of robot B and image 72 of robot A, each containing an editable LED array whose color information and arrangement information differ. Specifically, the editable LED array in image 71 presents a blue digit 2, and the editable LED array in image 72 presents a red digit 1.
Step S302: identifying the mobile robot according to the color information and arrangement information of the LED array in the first image information.
In the embodiments of the utility model, the group a mobile robot belongs to can be determined from the color of the lit LED lamps in its LED array. For example, if the lit LED lamps all show red, the mobile robot carrying that LED array belongs to the red team; if they all show blue, it belongs to the blue team. When there are more competing groups, the group can also be determined from a combination of colors of the lit LED lamps, for example red plus blue for group 1, red plus yellow for group 2, and so on. The arrangement of the lit LED lamps can likewise take many forms, such as Arabic numerals, specific figures or characters: a numeral determines the number of the mobile robot, and a specific figure or character determines its identity. From the color information and arrangement information of the lit LED lamps, it can therefore be determined which member of which competing group a mobile robot is, achieving accurate identification of the mobile robot.
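A simple way to picture this decoding is a pair of lookup tables mapping the observed color combination to a group and the observed character pattern to a member number. The sketch below is only an illustration; the tables and the pattern labels are invented examples and are not fixed by the embodiment.

```python
# Illustrative decoding of an LED-array observation into (group, member number).
# The color and pattern tables below are hypothetical examples only.
COLOR_TO_GROUP = {
    frozenset({"red"}): "red team",
    frozenset({"blue"}): "blue team",
    frozenset({"red", "blue"}): "group 1",
    frozenset({"red", "yellow"}): "group 2",
}

PATTERN_TO_NUMBER = {"digit_1": 1, "digit_2": 2}   # e.g. labels from template matching

def decode_identity(lit_colors, pattern_label):
    group = COLOR_TO_GROUP.get(frozenset(lit_colors), "unknown group")
    number = PATTERN_TO_NUMBER.get(pattern_label)
    return group, number

# Example: all lit LEDs blue, arranged as the digit 2 -> ("blue team", 2)
print(decode_identity({"blue"}, "digit_2"))
```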
The first image information shown in Fig. 3B contains a blue digit 2 and a red digit 1. Since robot B corresponds to image 71 and robot A corresponds to image 72, it can be determined that robot B is No. 2 of the blue team and robot A is No. 1 of the red team.
Step S303: determining the position of the mobile robot on the site corresponding to the first image information, according to the position information of the LED array in the first image information.
In addition, the server 22 can determine the position of robot A on the site photographed by the capture apparatus 21 from the position of the red digit 1 in the first image information, i.e. the position of the red digit 1 relative to the first image information represents the position of robot A on the site. Likewise, the server 22 determines the position of robot B on the site photographed by the capture apparatus 21 from the position of the blue digit 2 in the first image information, i.e. the position of the blue digit 2 relative to the first image information represents the position of robot B on the site.
Step S304: determining the orientation of the mobile robot on the site corresponding to the first image information, according to the position information of each LED lamp of the LED array in the first image information.
Each lit LED lamp in the LED array carries two kinds of information, position and color, and can therefore be represented as (X, Y, C), where X is the abscissa of the lit LED lamp in the LED array, Y is its ordinate, and C is its color. It is reasonable to assume that the LED arrays carried by the mobile robots are of the same size. The position of each LED lamp of the array in the first image information can then be determined from the position of LED lamp 11 (first row, first column) in the first image information and each lamp's offset relative to LED lamp 11, so that the position and direction of the figure formed by the lit LED lamps in the first image information can be determined.
Fig. 3C is a schematic diagram of the first image information provided by embodiment three of the utility model. As shown in Fig. 3C, the orientation of the mobile robot is determined from the direction of the figure formed by the lit LED lamps in the LED array, with north at the top, south at the bottom, west on the left and east on the right. In Fig. 3C the blue digit 2 points north, indicating that robot B faces north on the site, and the red digit 1 points north, indicating that robot A faces north on the site.
The position of a mobile robot on the site may change, and so may its orientation, for example when the mobile robot turns on the spot and changes its heading. Fig. 3D is a schematic diagram of the first image information provided by embodiment three of the utility model. As shown in Fig. 3D, compared with Fig. 3C the position of LED lamp 11 (first row, first column) of each LED array in the first image information has changed, indicating that the direction of the figure formed by the lit LED lamps of each array has changed in the first image information. Specifically, the blue digit 2 now points east and the red digit 1 points west, indicating that robot B faces east on the site and robot A faces west on the site.
In the embodiments of the utility model, the mobile robot carries an LED array, and image information of the mobile robot carrying the LED array is acquired. The mobile robot can be identified accurately from the colors and arrangement of the lit LED lamps of the array in the image information, and positioned accurately from the position of the LED array in the image information; in addition, the orientation of the mobile robot in the image information can be determined from the direction of the figure formed by the lit LED lamps of the array.
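The position and orientation computation of steps S303 and S304 can be sketched as follows. This is a schematic example only, under the stated assumption that all LED arrays have the same size; the helper name array_pose and the heading convention are invented for the illustration.

```python
import math

# Illustrative pose estimation from an observed LED array.
# Each lit LED is (X, Y, C): array coordinates plus color, as in step S304.

def array_pose(led11_img, lit_leds_img):
    """Return (position, heading_deg) of the robot in the first image information.

    led11_img   : (u, v) pixel position of LED lamp 11 (the array's reference corner)
    lit_leds_img: list of (u, v) pixel positions of the lit LEDs
    """
    # Position: centroid of the lit pattern stands for the robot's place on the site.
    cx = sum(u for u, _ in lit_leds_img) / len(lit_leds_img)
    cy = sum(v for _, v in lit_leds_img) / len(lit_leds_img)

    # Orientation: direction from LED lamp 11 towards the pattern centroid rotates
    # with the robot. With north at the top, 0 deg = north, 90 deg = east.
    du, dv = cx - led11_img[0], cy - led11_img[1]
    heading = math.degrees(math.atan2(du, -dv)) % 360.0
    return (cx, cy), heading
```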
Embodiment four
Embodiment four of the utility model provides a mobile robot identification and positioning method. On the basis of the technical solution provided by embodiment three, the site where the mobile robots are located is divided into multiple sub-sites, and one first camera is arranged above each sub-site. Fig. 4 is a flow chart of the mobile robot identification and positioning method provided by embodiment four of the utility model. As shown in Fig. 4, the method in this embodiment may include:
Step S401: acquiring, through the first camera above each sub-site, the first image information of the mobile robots on that sub-site.
Fig. 4A is a network architecture diagram to which a mobile robot identification and positioning method provided by embodiment four of the utility model applies. As shown in Fig. 4A, the site 80 where the mobile robots are located can be divided into six sub-sites, shown by the dashed regions; one first camera is arranged above each sub-site, for example the first cameras 81 to 86 each correspond to one sub-site, and the shooting range of each first camera is the sub-site pointed to by the dashed arrow. The capture apparatus 21 in the above embodiments may be any one of the first cameras 81 to 86, and the image information in the above embodiments may be an image of the sub-site below taken by any one of the first cameras 81 to 86. As shown in Fig. 4A, there is a mobile robot on each sub-site.
In addition, in the embodiments of the utility model the LED array is arranged on the top of each mobile robot, so that when each first camera shoots its sub-site from above it can directly photograph the LED array on top of the mobile robot, enabling whole-field positioning of the mobile robots. This embodiment limits neither how the sub-sites are divided nor their number.
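As an illustration of such whole-field positioning (the grid layout, sub-site size and camera identifiers below are assumptions made for the example, not fixed by the embodiment), each first camera can be registered with the sub-site it covers, and a detection in any camera image can then be mapped to coordinates on the whole site 80:

```python
# Illustrative whole-field positioning: six overhead cameras, one per sub-site,
# laid out here as an assumed 2 x 3 grid. Each camera reports detections in its
# own image frame; fixed offsets convert them into whole-site coordinates.
SUBSITE_W, SUBSITE_H = 5.0, 5.0           # assumed sub-site size in meters

CAMERA_ORIGIN = {                          # camera id -> (x, y) of its sub-site corner
    81: (0.0, 0.0), 82: (SUBSITE_W, 0.0), 83: (2 * SUBSITE_W, 0.0),
    84: (0.0, SUBSITE_H), 85: (SUBSITE_W, SUBSITE_H), 86: (2 * SUBSITE_W, SUBSITE_H),
}

def to_site_coords(camera_id, u_norm, v_norm):
    """Convert a detection given as normalized image coordinates (0..1) from one
    first camera into coordinates on the whole site."""
    ox, oy = CAMERA_ORIGIN[camera_id]
    return ox + u_norm * SUBSITE_W, oy + v_norm * SUBSITE_H
```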
Step S402: identifying the mobile robot according to the color information and arrangement information of the LED array in the first image information.
Step S403: determining the position of the mobile robot on the site corresponding to the first image information, according to the position information of the LED array in the first image information.
Step S404: determining the orientation of the mobile robot on the site corresponding to the first image information, according to the position information of each LED lamp of the LED array in the first image information.
Steps S402 to S404 are respectively the same as steps S302 to S304 above, and the specific methods are not repeated here.
Step S405: acquiring second image information photographed by a second camera carried by the mobile robot.
In the embodiments of the utility model, each mobile robot also carries a second camera. The second camera may be arranged together with the light source or separately from it, i.e. it may be mounted on the top of the mobile robot or on its trunk. The second camera is used to photograph the surroundings of the mobile robot and generate second image information. Preferably, the second image information is uncompressed image information: the second camera is connected to a wireless device, and after photographing the surroundings it sends the second image information directly to the server without compression. If the distance between the mobile robot and the server exceeds the wireless transmission range of the wireless base station apparatus, the mobile robot can send the second image information to the server through a relay device.
Step S406: determining the surrounding environment information of the mobile robot according to the second image information.
After receiving the second image information, the server performs image processing on it and determines the surrounding environment information of the mobile robot, for example whether there is an obstacle around the mobile robot, whether there is an enemy robot around it, or whether the mobile robot has run out of the site.
Step S407: controlling the direction of motion of the mobile robot according to its surrounding environment information.
The server receives the surrounding environment information of the mobile robot and shows it on the display screen 23, which may also show the site 80, the sub-sites after division, and the mobile robot on each sub-site. A user can input a control command by operating the mobile robot on the display screen 23, and the server sends the control command input by the user to the mobile robot to control its direction of motion. For example, if the display screen 23 shows an obstacle around a certain robot, the user inputs a control command by operating that robot on the display screen 23 so that it goes around the obstacle; the server sends the control command to the robot on the site, and the robot goes around the real obstacle, thereby achieving remote control of the robot by the user.
Step S408: receiving the electrical quantity information of the power source of the mobile robot and/or the remaining life information sent by the mobile robot.
While the mobile robot fights or moves on the site, its power source keeps consuming electricity; the electrical quantity information of the power source includes at least one of the following: current, voltage, power, and remaining capacity.
A pressure sensor may be arranged on the trunk of the mobile robot. When the pressure sensed by the pressure sensor exceeds a threshold, the external impact on the mobile robot is relatively large, and the mobile robot may have received a heavy strike from an opposing robot. The processor inside the mobile robot determines, from the location of the pressure sensor and the magnitude of the pressure it senses, the degree of fatal damage suffered by the mobile robot, and determines the remaining life information of the mobile robot from that degree.
Alternatively, a photosensitive material may be arranged on the trunk of the mobile robot, and the robots of each team may carry infrared beam guns. When the photosensitive material senses that the infrared irradiation intensity or irradiation time exceeds a threshold, the mobile robot has been hit by an infrared beam gun held by an opposing robot. The processor inside the mobile robot determines the degree of injury or fatal damage of the mobile robot from the position of the infrared irradiation, its intensity and its duration, and determines the remaining life information of the mobile robot from that degree.
The mobile robot can send the electrical quantity information of the power source and/or the remaining life information to the server in real time.
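A minimal sketch of how such an on-board estimate might be kept (the thresholds, damage weights and field names are invented example values, not specified by the embodiment):

```python
# Illustrative remaining-life bookkeeping on the robot's internal processor.
# Thresholds and damage weights are hypothetical example values.
PRESSURE_THRESHOLD = 50.0     # N; above this a strike is counted
IR_DOSE_THRESHOLD = 1.0       # arbitrary units of intensity * time

class LifeEstimator:
    def __init__(self, initial_life=100.0):
        self.remaining_life = initial_life

    def on_pressure(self, location, pressure):
        if pressure > PRESSURE_THRESHOLD:
            # heavier hits on more critical locations cost more life
            weight = {"head": 2.0, "trunk": 1.0}.get(location, 0.5)
            self.remaining_life -= weight * (pressure - PRESSURE_THRESHOLD)

    def on_infrared(self, intensity, duration):
        dose = intensity * duration
        if dose > IR_DOSE_THRESHOLD:
            self.remaining_life -= 10.0 * (dose - IR_DOSE_THRESHOLD)

    def report(self, battery):
        # battery: dict with current, voltage, power, remaining capacity
        return {"power_source": battery, "remaining_life": max(self.remaining_life, 0.0)}
```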
Step S409: displaying the surrounding image of the mobile robot, the position information of the mobile robot and the status information of the mobile robot.
The server 22 receives the surrounding environment information of the mobile robot and shows it on the display screen 23; in addition, the server 22 may also show the position information of the mobile robot and the status information of the mobile robot on the display screen 23.
The position information of the mobile robot and the status information of the mobile robot are shown embedded in the surrounding image of the mobile robot.
The position information of the mobile robot includes at least one of the following: the location information of the mobile robot, and the motion-track information of the mobile robot.
The status information of the mobile robot includes at least one of the following: the identification information of the mobile robot, the orientation information of the mobile robot, the electrical quantity information of the power source of the mobile robot, and the remaining life information of the mobile robot.
In the embodiments of the utility model, the server receives the second image information photographed by the second camera carried by the mobile robot; since the second image information is uncompressed, the transmission delay of the image information is reduced and the server is guaranteed to receive the second image information quickly, and the server can also control the direction of motion of the mobile robot according to the surrounding environment information in the second image information. In addition, the server is connected to a display and shows the surrounding image, position information and status information of the mobile robot on the display, allowing the user to view the location information, motion-track information and status information of the mobile robot in real time and then remotely control its direction of motion.
Embodiment five
Embodiment five of the utility model provides a camera. Fig. 5 is a structural diagram of the camera provided by embodiment five of the utility model. As shown in Fig. 5, the camera is provided with a lens module 89 and further includes a housing 90, a plurality of LED lamps 91 and a controller 92, where the plurality of LED lamps 91 are arranged on the housing 90; the controller 92 is arranged in the housing 90 and electrically connected with the plurality of LED lamps 91; and the controller 92 drives the LED lamps to emit light and controls the operating state of the plurality of LED lamps 91, so that the plurality of LED lamps 91 emit light in a preset pattern.
The operating state includes at least one of the following: the glow color of the LED lamps, and the arrangement of the lit LED lamps.
The LED lamps are RGB LED lamps, and the controller controls the glow color of the RGB LED lamps.
In the camera provided by the embodiments of the utility model, a plurality of LED lamps are arranged on the camera, and the controller of the camera drives the LED lamps to emit light and controls their operating state so that they emit light in a preset pattern. By recognizing image information of the preset pattern, the mobile robot carrying the camera can be identified, so the mobile robot can be identified conveniently. Moreover, since the camera serves as a module of the identification system, it is easy to install and remove.
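On the controller side, emitting a preset pattern amounts to writing a color to a chosen subset of the editable LED array. The fragment below is purely illustrative: the array size, the bitmap of the numeral 2 and the set_pixel driver call are assumptions made for the example and do not describe the controller's actual firmware.

```python
# Illustrative controller logic: light an editable RGB LED array in a preset pattern.
# `set_pixel(row, col, rgb)` is a hypothetical driver call for the LED array.
ROWS, COLS = 8, 8

DIGIT_2 = {  # assumed bitmap of the lit cells forming the numeral "2"
    (1, 2), (1, 3), (1, 4), (2, 5), (3, 4), (4, 3), (5, 2), (6, 2), (6, 3), (6, 4), (6, 5),
}

def show_preset_pattern(set_pixel, pattern=DIGIT_2, color=(0, 0, 255)):
    """Drive the array so only the cells of `pattern` glow in `color` (e.g. blue team, No. 2)."""
    for r in range(ROWS):
        for c in range(COLS):
            set_pixel(r, c, color if (r, c) in pattern else (0, 0, 0))
```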
Embodiment six
Embodiment six of the utility model provides a camera. Fig. 6 is a structural diagram of the camera provided by embodiment six of the utility model. As shown in Fig. 6, on the basis of the embodiment shown in Fig. 5, the camera further includes an image transmission device 94 connected with the lens module 89, and the image transmission device 94 is used to transmit the original image captured by the lens module 89 directly, without compression.
In addition, the internal cavity of the housing 90 is separated into a lamp chamber 100 and a lens chamber 101; the plurality of LED lamps 91 are arranged in the lamp chamber 100, and the lens module 89 is arranged in the lens chamber 101.
In the embodiments of the utility model, the original image captured by the lens module is transmitted out directly without compression, which saves the time of image compression and decompression, reduces the image transmission delay and improves the image transmission efficiency.
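As a rough illustration of this design choice (the UDP framing and chunk size below are invented for the example and are not the image transmission device's actual protocol), transmitting raw frames skips the encode and decode stages at the cost of higher bandwidth:

```python
# Illustrative uncompressed frame transmission over UDP; chunking and header layout
# are invented for the example only.
import socket
import struct

CHUNK = 1400  # payload bytes per datagram, assumed to fit a typical MTU

def send_raw_frame(sock, addr, frame_id, frame_bytes):
    """Send one uncompressed frame, split into sequence-numbered chunks."""
    total = (len(frame_bytes) + CHUNK - 1) // CHUNK
    for seq in range(total):
        payload = frame_bytes[seq * CHUNK:(seq + 1) * CHUNK]
        header = struct.pack("!IHH", frame_id, seq, total)   # frame id, chunk index, chunk count
        sock.sendto(header + payload, addr)

# usage sketch:
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_raw_frame(sock, ("192.0.2.10", 5000), 0, raw_rgb_bytes)
```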
Embodiment seven
Embodiment seven of the utility model provides a camera. Fig. 7A is an exploded view of the camera provided by embodiment seven of the utility model. As shown in Fig. 7A, the camera includes a top cover 43, an upper cover 37 and a base 38, the upper cover 37 being located between the top cover 43 and the base 38; the upper cover 37 and the top cover 43 are spliced together to form the lamp chamber 100 shown in Fig. 6, and the upper cover 37 and the base 38 are spliced together to form the lens chamber 101 shown in Fig. 6.
As shown in Fig. 7A, the lens protection glass 30 and the lens 31 form the lens module 89 of Fig. 5 or Fig. 6; the LED lamp windows 32 are specifically the lamp windows 93 of Fig. 5 or Fig. 6, one LED lamp 33 corresponding to one LED lamp window 32, and the plurality of LED lamps forming the editable LED array; the image transmission board 34 is specifically the image transmission device 94 of Fig. 6; the cable 35 can be used to fix the camera on the fuselage of the mobile robot; the wire-slot clamp 36 can be used to connect the upper cover 37 and the base 38; the camera board 39 can be used to fix the lens 31; the 5.8G antenna 40 can be used to wirelessly transmit the image information, position information, status information and the like of the mobile robot to the server; the power board 41 can be used to supply power to the camera; and the fan 42 can be used to cool the LED lamps 33, preventing the LED lamps 33 from burning out after long periods of illumination, so that the mobile robot can be identified and positioned accurately.
Fig. 7B is a left view of the camera provided by embodiment seven of the utility model; Fig. 7C is a front view of the camera provided by embodiment seven of the utility model; Fig. 7D is a top view of the camera provided by embodiment seven of the utility model; and Fig. 7E is an axonometric view of the camera provided by embodiment seven of the utility model.
In the embodiments of the utility model, the internal cavity of the camera housing is divided into a lamp chamber and a lens chamber; the plurality of LED lamps are mounted in the lamp chamber and the lens module is mounted in the lens chamber. Separating the plurality of LED lamps from the lens module prevents the light emitted by the plurality of LED lamps and the light handled by the lens module from interfering with each other, which further improves the accuracy of identifying the mobile robot.
Embodiment eight
Embodiment eight of the utility model provides a mobile robot. Fig. 8 is a structural diagram of the mobile robot provided by embodiment eight of the utility model. In the embodiments of the utility model, the mobile robot is described taking a remote-controlled chassis as an example.
As shown in Fig. 8, the mobile robot 1 includes a fuselage 1003, a mobile device 1001 and a camera 1002, where the mobile device 1001 is connected with the fuselage and is used to provide the power for moving the fuselage, and the camera 1002 is arranged on the top of the fuselage; the camera 1002 is specifically the camera described in any one of embodiments five, six and seven.
In the embodiments of the utility model, from the image information of the light source carried by the mobile robot, the color and shape of the light source and its position relative to the image are determined; the robot can be identified from the color and shape of the light source in the image information, and its position on the site can be determined from the position of the light source relative to the image, which improves both the identification accuracy and the positioning accuracy of the mobile robot.
Embodiment nine
Embodiment nine of the utility model provides a mobile robot identification and positioning system. Fig. 9 is a structural diagram of the mobile robot identification and positioning system provided by embodiment nine of the utility model. As shown in Fig. 9, the mobile robot identification and positioning system 110 includes the mobile robot 1 described in the above embodiments and an identification and positioning device 50, the identification and positioning device 50 being used to identify and position the mobile robot 1.
Fig. 10 is a structural diagram of the identification and positioning device provided by embodiment nine of the utility model. As shown in Fig. 10, the identification and positioning device 50 includes one or more processors 51, which can work individually or collectively, and an image sensor 52 communicatively connected with the processor 51. The image sensor 52 is used to capture the image information of the LED lamps of the camera 1002 and transmit it to the processor 51; the processor 51 is used to identify and position the mobile robot 1 according to the image information of the LED lamps of the camera 1002.
Further, the image sensor 52 is specifically used to capture the first image information of the mobile robot, the first image information including the image information of the LED array. Accordingly, the processor 51 identifies the mobile robot according to the color information and arrangement information of the LED array; determines the position of the mobile robot on the site corresponding to the first image information according to the position information of the LED array in the first image information; and determines the orientation of the mobile robot on the site corresponding to the first image information according to the position information of each LED lamp of the LED array in the first image information.
Fig. 11 is a structural diagram of the identification and positioning device provided by embodiment nine of the utility model. On the basis of Fig. 10, as shown in Fig. 11, the identification and positioning device 50 further includes a wireless receiving unit 53 communicatively connected with the processor 51; the wireless receiving unit 53 is used to receive the second image information photographed and sent by the second camera carried by the mobile robot, the second image information being the image information of the surroundings of the mobile robot.
Further, the identification and positioning device 50 also includes a wireless transmitting unit 54 communicatively connected with the processor 51; the wireless transmitting unit 54 is used to send control commands to the mobile robot, so that the mobile robot changes its direction of motion according to the control commands. Preferably, the second image information is uncompressed image information.
On the basis of Fig. 10, the identification and positioning device 50 further includes a display unit 55 communicatively connected with the processor 51; the display unit 55 is used to display the surrounding image of the mobile robot, the position information of the mobile robot and the status information of the mobile robot.
Further, the position information of the mobile robot includes at least one of the following: the location information of the mobile robot, and the motion-track information of the mobile robot. And/or, the status information of the mobile robot includes at least one of the following: the identification information of the mobile robot, the orientation information of the mobile robot, the electrical quantity information of the power source of the mobile robot, and the remaining life information of the mobile robot.
In the embodiments of the utility model, from the image information of the light source carried by the mobile robot, the color and shape of the light source and its position relative to the image are determined; the robot can be identified from the color and shape of the light source in the image information, and its position on the site can be determined from the position of the light source relative to the image, which improves both the identification accuracy and the positioning accuracy of the mobile robot.
In the several embodiments provided by the utility model, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the device embodiments described above are only schematic; the division into units is only a logical functional division, and other divisions are possible in actual implementation, for example multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, i.e. they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the utility model may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions that enable a computer device (which may be a personal computer, a server, a network device or the like) or a processor to perform part of the steps of the methods described in the embodiments of the utility model. The storage medium includes various media that can store program code, such as a USB flash drive, a portable hard drive, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
Those skilled in the art can clearly understand that, for convenience and brevity of description, only the division of the functional modules described above is used as an example; in practical applications, the functions may be assigned to different functional modules as needed, i.e. the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. For the specific working process of the devices described above, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the utility model, not to limit them. Although the utility model has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions described in the foregoing embodiments or make equivalent replacements of some or all of the technical features, and that such modifications or replacements do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the utility model.

Claims (14)

1. A camera provided with a lens module, characterised in that the camera further includes:
a housing;
a plurality of LED lamps arranged on the housing; and
a controller arranged in the housing and electrically connected with the plurality of LED lamps;
wherein the controller drives the LED lamps to emit light and controls the operating state of the plurality of LED lamps, so that the plurality of LED lamps emit light in a preset pattern.
2. The camera according to claim 1, characterised in that the operating state includes at least one of the following: the glow color of the LED lamps, and the arrangement of the lit LED lamps.
3. The camera according to claim 2, characterised in that the LED lamps are RGB LED lamps, and the controller controls the glow color of the RGB LED lamps.
4. The camera according to claim 3, characterised in that it further includes:
an image transmission device connected with the lens module and used to transmit the original image captured by the lens module directly, without compression.
5. The camera according to claim 4, characterised in that the internal cavity of the housing is separated into a lamp chamber and a lens chamber, the plurality of LED lamps being arranged in the lamp chamber and the lens module being arranged in the lens chamber.
6. The camera according to claim 5, characterised in that the housing includes a top cover, an upper cover and a base, the upper cover being located between the top cover and the base, the upper cover and the top cover being spliced together to form the lamp chamber, and the upper cover and the base being spliced together to form the lens chamber.
7. The camera according to any one of claims 1-6, characterised in that the housing is provided with a plurality of lamp windows, one LED lamp corresponding to one lamp window.
8. A mobile robot, characterised by including:
a fuselage;
a mobile device connected with the fuselage and used to provide the power for moving the fuselage; and
the camera according to any one of claims 1-7, the camera being arranged on the top of the fuselage.
9. A mobile robot identification and positioning system, characterised by including:
the mobile robot according to claim 8; and
an identification and positioning device, used to identify and position the mobile robot;
wherein the identification and positioning device includes one or more processors and an image sensor communicatively connected with the processor;
the image sensor is used to capture the image information of the LED lamps of the camera and transmit the image information of the LED lamps of the camera to the processor;
the processor is used to identify and position the mobile robot according to the image information of the LED lamps of the camera.
10. The mobile robot identification and positioning system according to claim 9, characterised in that the image sensor is specifically used to capture first image information of the mobile robot, the first image information including the image information of the LED array;
accordingly, the processor identifies the mobile robot according to the color information and arrangement information of the LED array; determines the position of the mobile robot on the site corresponding to the first image information according to the position information of the LED array in the first image information; and determines the orientation of the mobile robot on the site corresponding to the first image information according to the position information of each LED lamp of the LED array in the first image information.
11. The mobile robot identification and positioning system according to claim 10, characterised by further including:
a wireless receiving unit communicatively connected with the processor and used to receive second image information photographed and sent by a second camera carried by the mobile robot, the second image information being the image information of the surroundings of the mobile robot.
12. The mobile robot identification and positioning system according to claim 11, characterised by further including:
a wireless transmitting unit communicatively connected with the processor and used to send control commands to the mobile robot, so that the mobile robot changes its direction of motion according to the control commands;
and/or, the second image information is uncompressed image information.
13. The mobile robot identification and positioning system according to claim 11, characterised by further including:
a display unit communicatively connected with the processor and used to display the surrounding image of the mobile robot, the position information of the mobile robot and the status information of the mobile robot.
14. The mobile robot identification and positioning system according to claim 13, characterised in that the position information of the mobile robot includes at least one of the following: the location information of the mobile robot, and the motion-track information of the mobile robot;
and/or, the status information of the mobile robot includes at least one of the following: the identification information of the mobile robot, the orientation information of the mobile robot, the electrical quantity information of the power source of the mobile robot, and the remaining life information of the mobile robot.
CN201620552778.8U 2016-06-07 2016-06-07 Camera for a mobile robot, and mobile robot Expired - Fee Related CN205660739U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201620552778.8U CN205660739U (en) 2016-06-07 2016-06-07 Camera for a mobile robot, and mobile robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201620552778.8U CN205660739U (en) 2016-06-07 2016-06-07 Camera for a mobile robot, and mobile robot

Publications (1)

Publication Number Publication Date
CN205660739U true CN205660739U (en) 2016-10-26

Family

ID=57156750

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201620552778.8U Expired - Fee Related CN205660739U (en) 2016-06-07 2016-06-07 But but camera that mobile robot used and mobile robot

Country Status (1)

Country Link
CN (1) CN205660739U (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107076557A (en) * 2016-06-07 2017-08-18 深圳市大疆创新科技有限公司 Mobile robot recognition positioning method, device, system and mobile robot
WO2017210866A1 (en) * 2016-06-07 2017-12-14 深圳市大疆创新科技有限公司 Mobile robot identification and positioning method, device and system, and mobile robot
CN107341498A (en) * 2017-05-22 2017-11-10 深圳市奇脉电子技术有限公司 A kind of biological identification device based on CIS starts lighting apparatus and starts method
CN107958144A (en) * 2017-12-18 2018-04-24 王军 Unmanned plane identification system, recognition methods and control device

Similar Documents

Publication Publication Date Title
WO2017210866A1 (en) Mobile robot identification and positioning method, device and system, and mobile robot
US9635737B2 (en) Directional lighting system and method
CN205660739U (en) Camera for a mobile robot, and mobile robot
US20200389951A1 (en) Directional lighting system and method
CN109496042B (en) Light control method and device based on camera shooting assembly and light control equipment
JP2012510839A (en) Set of drone with recognition marker
KR102370732B1 (en) Method for performance directing and system using thereof
CN104144353B (en) Multizone environment light regime control method based on smart television
CN110023779B (en) Control device, wireless communication terminal, and position estimation system
CN104797311A (en) Ambient light control and calibration via console
WO2014132580A1 (en) Unit device and wireless power supply information providing system
CN114786311B (en) BIM-based visual basement light source arrangement method
CN102184007B (en) Interactive intelligent conference system based on pattern recognition and using method thereof
US20200257831A1 (en) Led lighting simulation system
KR100930950B1 (en) Lighting control system of miniature
WO2020088990A1 (en) Management of light effects in a space
Le Francois et al. Top-down illumination photometric stereo imaging using light-emitting diodes and a mobile device
CN212411073U (en) Follow spot lamp automatic control system
JP6370733B2 (en) Information transmission device and information acquisition device
CN109621402B (en) Universal positioning system and method for image ball firing and laser firing
CN112422931B (en) Optical communication device and method for transmitting and receiving information
CN113661357B (en) Lighting control system
CN217787789U (en) Novel face identification reliability test system
WO2023282092A1 (en) Lighting system, processing terminal, and lighting control data generation method
CN215338827U (en) Light source board device and terminal testing device

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20161026

CF01 Termination of patent right due to non-payment of annual fee