CN108318050A - Central controller, and system and method for mobile navigation using the central controller - Google Patents

Central controller, and system and method for mobile navigation using the central controller

Info

Publication number
CN108318050A
CN108318050A
Authority
CN
China
Prior art keywords
robot
central controller
point
location point
virtual route
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711342765.3A
Other languages
Chinese (zh)
Other versions
CN108318050B (en)
Inventor
荣乐天
黄强
李铭
代怀荣
罗为
谢恺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Fulian Fugui Precision Industry Co Ltd
Original Assignee
Fuhua Precision Industry (Shenzhen) Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuhua Precision Industry (Shenzhen) Co Ltd
Priority to CN201711342765.3A
Publication of CN108318050A
Application granted
Publication of CN108318050B
Active legal status
Anticipated expiration legal status


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements for on-board computers
    • G01C21/3602 - Input other than that of destination, using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles, using a camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a central controller, comprising: an input/output interface for receiving starting-point and end-point information; a network element for providing communication connections between the central controller and a robot and cameras; a memory for storing data; and a processor for generating a virtual path on a virtual map of a navigation space according to the starting-point and end-point information, wherein the virtual path comprises a plurality of location points. The processor is further configured to send movement commands to the robot so as to control the robot to walk along the virtual path according to the movement commands; to judge, by means of the cameras, whether the robot has reached the next location point on the virtual path; and, when the robot reaches the next location point on the virtual path, to judge whether that location point is the end point, and to end navigation when it is. The invention also provides a system and a method for mobile navigation using the central controller.

Description

Central controller, and system and method for mobile navigation using the central controller
Technical field
The present invention relates to the field of navigation, and more particularly to a central controller and to a system and method for mobile navigation using the central controller.
Background technology
With the rapid development of modern production, robots are used on many occasions to assist or replace humans in tedious, repetitive and dangerous work. Robot navigation methods in the prior art fall into four kinds. The first guides the robot by a guidance tape laid on the ground; the tape may be a magnetic rubber strip or a coloured adhesive tape, and the robot walks along it. The second is laser navigation: a rotating laser transceiver is mounted on top of the robot, and reflectors are installed in the environment along the guidance path, for example on walls, pillars or stationary machines. The laser transceiver automatically calculates the angle and distance returned by each reflector within visible range and compares them with the reflector layout stored in the robot to obtain the robot's current position; that position is then compared with the positions on the guidance path to confirm whether the robot is walking on the path. The third is inertial navigation: transponders mounted in the navigation space confirm whether the robot walks along the guidance path, while a gyroscope mounted on the robot detects changes in the robot's direction of travel and corrects it so that the robot keeps to the guidance path. The fourth is vision navigation: a camera on the robot records feature information along a travelled route, and when the robot travels the route again, the recorded feature information guides it. In this method a specially designed stereo camera shoots 360° images to obtain the feature information, from which a 3D map is built, allowing the robot to navigate along the guidance path without the help of artificial or other feature information, road signs or positioning systems.
Each of the four methods above has drawbacks. The first requires a magnetic or coloured guidance tape to be laid on the ground, and changing the route requires re-laying the tape, so changes are inconvenient. The laser navigation of the second requires an expensive laser to be installed on the robot, along with reflectors installed in the environment along the route. The accumulated error of inertial navigation grows over time. Vision navigation relies on the camera mounted on the robot, its navigation process is comparatively complex, and it is not yet mature in real use. In addition, the usable environments of laser navigation and vision navigation are limited: they can only navigate in visible environments.
Summary of the invention
In view of the foregoing, it is necessary to provide a central controller, and a system and method for mobile navigation using the central controller, in which the central controller can assist in controlling robot navigation by means of cameras in the navigation space.
A mobile navigation system comprises a central controller, a robot and cameras, the central controller being communicatively connected to the robot and to the cameras. In the system:
the central controller is configured, after identifying the robot, to receive input starting-point and end-point information;
the central controller is further configured to generate a virtual path on a virtual map of a navigation space according to the starting-point and end-point information, wherein the virtual path comprises a plurality of location points;
the robot is configured to receive movement commands sent by the central controller and to walk along the virtual path according to the movement commands;
the central controller is further configured to judge, by means of the cameras, whether the robot has reached the next location point on the virtual path; and
when the robot reaches the next location point on the virtual path, the central controller judges whether that location point is the end point, and ends navigation when it is.
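The claimed control flow can be sketched in code. This is a minimal illustrative sketch, not the patent's implementation: the class, the method names and the list-of-points representation of the virtual path are all assumptions, and the camera-based arrival check is stubbed out.

```python
# Minimal sketch of the claimed navigation loop. All names are
# illustrative assumptions; the camera-based arrival check is stubbed.

class CentralController:
    def __init__(self, path):
        self.path = path      # virtual path: ordered list of location points
        self.log = []         # record of commands sent, for illustration

    def send_move_command(self, point):
        self.log.append(("move", point))

    def robot_reached(self, point):
        # In the patent this is judged from camera images; stubbed here.
        return True

    def navigate(self):
        for point in self.path[1:]:              # path[0] is the starting point
            self.send_move_command(point)
            while not self.robot_reached(point):
                pass                             # wait for camera confirmation
        self.log.append(("end", self.path[-1]))  # end point reached: stop

controller = CentralController([(0, 0), (0, 5), (4, 5)])
controller.navigate()
print(controller.log)  # -> [('move', (0, 5)), ('move', (4, 5)), ('end', (4, 5))]
```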
Preferably, in the system the central controller judges whether the robot needs to turn at the next location point and, when the robot needs to turn there, sends a turn command to the robot.
Preferably, the central controller comprises a memory that pre-stores the position coordinates of each location point on the virtual path, location images of the robot at each location point, and the robot's position coordinates in each location image, wherein the position coordinates of each location point on the virtual path are coordinates in a first coordinate system established over the navigation space region, and the position coordinates in each location image are coordinates in a second coordinate system established on that location image.
Preferably, the memory also pre-stores the correspondence between the first coordinate system and the second coordinate system. According to this correspondence, the central controller converts the robot's position coordinates in a location image into the robot's coordinates in the first coordinate system, thereby determining the robot's location point on the virtual path.
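The stored correspondence between the two coordinate systems can be illustrated in code. The patent does not specify the form of the correspondence; a simple per-camera scale-and-offset mapping is assumed here purely for illustration.

```python
# Sketch of converting image (second-coordinate-system) coordinates into
# navigation-space (first-coordinate-system) coordinates. The per-camera
# scale (metres per pixel) and world origin are assumed stored parameters.

def image_to_world(px, py, scale, origin):
    """Map a pixel position (px, py) to first-coordinate-system coordinates."""
    ox, oy = origin
    return (ox + px * scale, oy + py * scale)

# A robot seen at pixel (100, 40) by a camera whose image origin sits at
# world position (2.0, 3.0), at 0.05 m per pixel:
print(image_to_world(100, 40, 0.05, (2.0, 3.0)))  # -> (7.0, 5.0)
```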
Preferably, the central controller calculates the time the robot needs to reach the next location point from the distance between the starting point and the next location point on the virtual path and the robot's movement speed, and judges from that time whether the robot is about to reach the next location point.
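The time estimate described here is a plain distance-over-speed calculation; the following sketch (function and parameter names are assumptions) shows it for points expressed in the first coordinate system.

```python
# Sketch of the arrival-time estimate: distance to the next location
# point divided by the robot's movement speed. Names are illustrative.

import math

def eta_seconds(current, nxt, speed):
    """Seconds for the robot to reach the next location point at constant speed."""
    dist = math.hypot(nxt[0] - current[0], nxt[1] - current[1])
    return dist / speed

print(eta_seconds((0.0, 0.0), (3.0, 4.0), 0.5))  # 5 m at 0.5 m/s -> 10.0
```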
Preferably, when the robot is about to reach the next location point, the central controller controls the camera corresponding to that location point to shoot a location image of the robot and to send the location image to the central controller.
Preferably, when the central controller judges that the position coordinates in the shot location image match the pre-stored position coordinates of the robot in the location image corresponding to the next location point, it determines that the robot has reached the next location point.
A mobile navigation method is applied to a central controller that is communicatively connected to a robot and to cameras. The method comprises:
after the central controller identifies the robot, receiving input starting-point and end-point information;
the central controller generating a virtual path on a virtual map of a navigation space according to the starting-point and end-point information;
the robot receiving movement commands sent by the central controller and walking along the virtual path according to the movement commands;
the central controller judging whether the robot has reached the next location point on the virtual path;
when the robot reaches the next location point on the virtual path, the central controller judging whether the robot needs to turn at that location point; and
when the robot needs to turn at the next location point, the robot receiving a turn command sent by the central controller, turning according to the turn command, and continuing to walk along the virtual path until it reaches the end point.
Preferably, in the method the step of the central controller judging whether the robot has reached the next location point on the virtual path comprises:
the central controller calculating, from the distance between the starting point and the next location point on the virtual path and the robot's movement speed, the time the robot needs to reach the next location point, and judging from that time whether the robot is about to reach it;
when the robot is about to reach the next location point, the central controller controlling the camera corresponding to that location point to shoot a location image of the robot and send it to the central controller; and
when the position coordinates in the shot location image match the pre-stored position coordinates of the robot in the location image corresponding to the next location point, the central controller determining that the robot has reached the next location point.
A central controller for controlling the mobile navigation of a robot comprises:
an input/output interface for receiving input starting-point and end-point information;
a network element for providing communication connections between the central controller and a robot and cameras;
a memory for storing data; and
a processor for generating a virtual path on a virtual map of a navigation space according to the starting-point and end-point information, wherein the virtual path comprises a plurality of location points.
The processor is further configured to send movement commands to the robot, so as to control the robot to walk along the virtual path according to the movement commands.
The processor is further configured to judge, by means of the cameras, whether the robot has reached the next location point on the virtual path; and, when the robot reaches the next location point on the virtual path, to judge whether that location point is the end point and to end navigation when it is.
Preferably, the processor is further configured to judge whether the robot needs to turn at the next location point and, when the robot needs to turn there, to send a turn command to the robot.
Preferably, the memory pre-stores the position coordinates of each location point on the virtual path, location images of the robot at each location point, and the robot's position coordinates in each location image, wherein the position coordinates of each location point on the virtual path are coordinates in a first coordinate system established over the navigation space region, and the position coordinates in each location image are coordinates in a second coordinate system established on that location image.
Preferably, the memory also pre-stores the correspondence between the first coordinate system and the second coordinate system, and the central controller uses this correspondence to convert the robot's position coordinates in a location image into the robot's coordinates in the first coordinate system, thereby determining the robot's location point on the virtual path.
Preferably, the processor calculates, from the distance between the starting point and the next location point on the virtual path and the robot's movement speed, the time the robot needs to reach the next location point, and the central controller judges from that time whether the robot is about to reach it.
Preferably, when the robot is about to reach the next location point, the processor controls the camera corresponding to that location point to shoot a location image of the robot and send it to the central controller.
Preferably, when the processor judges that the position coordinates in the shot location image match the pre-stored position coordinates of the robot in the location image corresponding to the next location point, it determines that the robot has reached that location point.
Compared with the prior art, the central controller of the present application, and the system and method for mobile navigation using it, can use the multiple cameras already present in the navigation space to assist the central controller in controlling robot navigation. This solves the cost problem of prior-art robot navigation, which requires the assistance of devices such as navigation sensors, floor magnetic strips and laser sensors.
Description of the drawings
Fig. 1 is a schematic diagram of a mobile navigation system in an embodiment of the present invention.
Fig. 2 is a schematic diagram of the central controller in the mobile navigation system of an embodiment of the present invention.
Fig. 3 is a schematic diagram of the robot in the mobile navigation system of an embodiment of the present invention.
Fig. 4 is a schematic diagram of the mobile navigation system navigating in a navigation space in an embodiment of the present invention.
Fig. 5 is a flow chart of a mobile navigation method in an embodiment of the present invention.
Main element symbol description
The following detailed description will further illustrate the present invention in conjunction with the above drawings.
Detailed description of the embodiments
Referring to Fig. 1, a schematic diagram of a mobile navigation system 100 in an embodiment of the present invention is shown. The mobile navigation system 100 includes, but is not limited to, a central controller 10, a robot 20 and a plurality of cameras 30. In the present embodiment, the central controller 10 may be communicatively connected to the plurality of cameras 30 by wired or wireless means, and communicatively connected to the robot 20 wirelessly. The central controller 10 can assist in controlling the navigation of the robot 20 through the plurality of cameras 30 in the navigation space. When the central controller 10 receives a user-input instruction to control the robot 20 to navigate in the navigation space, the central controller 10 generates a virtual path according to the user-input end point and the current position of the robot 20, and sends movement commands to the robot 20 to control it to walk along the virtual path. In the present embodiment, the navigation space may be an interior space (such as a workshop) or an exterior space.
Referring to Fig. 2, a schematic diagram of the central controller 10 in the mobile navigation system 100 of an embodiment of the present invention is shown. In the present embodiment, the central controller 10 includes, but is not limited to, an input/output interface 110, a network element 111, a memory 112 and a processor 113, which are electrically connected to one another.
In the present embodiment, the user can interact with the central controller 10 through the input/output interface 110. The input/output interface 110 may use a contactless input mode, such as gesture input or voice control, or an external remote-control unit that sends control commands to the processor 113 by wired or wireless communication. The input/output interface 110 may also be a capacitive touch screen, a resistive touch screen or another optical touch screen, or a mechanical input device such as a keyboard, a joystick or jog-wheel keys.
In the present embodiment, the network element 111 provides network communication for the central controller 10 through wired or wireless transmission, so that the central controller 10 can be network-connected to the robot 20 and the plurality of cameras 30.
The wired network may be of any conventional wired communication type, such as the Internet or a local area network. The wireless network may be of any conventional wireless communication type, such as radio, Wireless Fidelity (WIFI), cellular, satellite or broadcast. Wireless communication technologies may include, but are not limited to, Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband CDMA (W-CDMA), CDMA2000, IMT Single Carrier, Enhanced Data Rates for GSM Evolution (EDGE), Long-Term Evolution (LTE), LTE-Advanced, Time-Division LTE (TD-LTE), fifth-generation mobile communications (5G), High Performance Radio Local Area Network (HiperLAN), High Performance Radio Wide Area Network (HiperWAN), Local Multipoint Distribution Service (LMDS), Worldwide Interoperability for Microwave Access (WiMAX), ZigBee, Bluetooth, Flash Orthogonal Frequency-Division Multiplexing (Flash-OFDM), High Capacity Spatial Division Multiple Access (HC-SDMA), Universal Mobile Telecommunications System (UMTS), UMTS Time-Division Duplexing (UMTS-TDD), Evolved High Speed Packet Access (HSPA+), Time Division Synchronous Code Division Multiple Access (TD-SCDMA), Evolution-Data Optimized (EV-DO), Digital Enhanced Cordless Telecommunications (DECT) and others.
In the present embodiment, the memory 112 stores the software programs and data installed in the central controller 10. The memory 112 may be an internal storage unit of the central controller 10, such as its hard disk or internal memory. In other embodiments, the memory 112 may be an external storage device of the central controller 10, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a Flash Card.
In the present embodiment, the memory 112 also stores a virtual map (such as an electronic map) of the navigation space. The virtual map includes a plurality of virtual paths. A virtual path is composed of location points and the connection relations between them. The virtual path defines the multiple location points of the robot 20 on the entire path, the position coordinates corresponding to each location point, and the order in which the robot 20 passes through the location points. The location points include turning points and dwell points. In the present embodiment, a dwell point is defined as a point on the virtual path at which the robot 20 needs to stop for a preset time, for example the starting point and the end point. A turning point is defined as a point on the virtual path at which the robot 20 can change its direction of travel.
In one embodiment, the memory 112 also pre-stores images of the robot 20 at each location point (for convenience of description, hereinafter "location images") and the position coordinates of the robot 20 in each location image. The virtual path is the shortest path generated by the central controller 10 on the virtual map from the starting point where the robot 20 is located to the end point input by the user. The central controller 10 plans the shortest path using a path-finding algorithm. The virtual path includes a plurality of dwell points and a plurality of turning points.
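The description says only that a path-finding algorithm plans the shortest path over the location points; it does not name one. As an illustration, Dijkstra's algorithm over a small assumed location-point graph (the point labels and distances are invented for the example):

```python
# Illustrative shortest-path planning over a location-point graph with
# Dijkstra's algorithm. The graph layout and distances are assumptions.

import heapq

def shortest_path(graph, start, end):
    """graph: {point: [(neighbour, distance), ...]}. Returns a point list."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == end:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nbr, d in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(queue, (cost + d, nbr, path + [nbr]))
    return None

graph = {"S1": [("T1", 2.0)], "T1": [("T2", 3.0), ("S2", 10.0)],
         "T2": [("S2", 4.0)], "S2": []}
print(shortest_path(graph, "S1", "S2"))  # -> ['S1', 'T1', 'T2', 'S2']
```

The detour via T2 costs 9.0 against 12.0 for the direct T1-to-S2 edge, so the algorithm returns the longer-looking but shorter path.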
In the present embodiment, the position coordinates of each location point on the virtual path are coordinates in a first coordinate system (XOY) established over the entire navigation space region, while the position coordinates of the robot 20 in each location image are coordinates in a second coordinate system (X'O'Y') established on that location image. The coordinates in the second coordinate system (X'O'Y') correspond to pixels of the location image. In the present embodiment, the memory 112 also pre-stores the position coordinates corresponding to the scanning range of each camera 30, where these position coordinates are coordinates in the first coordinate system.
In the present embodiment, the memory 112 also stores the installation positions of the plurality of cameras 30 in the navigation space, and information on which location points of the virtual path each camera 30 can scan from its position.
In the present embodiment, the processor 113 may be a Central Processing Unit (CPU), or another microprocessor or data processing chip capable of performing control functions. The processor 113 executes software program code and processes data.
In the present embodiment, the plurality of cameras 30 are mounted at high positions in the navigation space, which may be an interior or an exterior space. When the navigation space is an interior space, the cameras 30 may be mounted on the indoor ceiling; when it is an exterior space, they may be mounted on outdoor pillars. It will be understood that the range the cameras 30 can scan needs to cover the navigation space, so that the robot 20 can be scanned anywhere in it.
In the present embodiment, the central controller 10 may be a computer, a smartphone, a tablet computer, a personal digital assistant, a notebook computer or the like.
Referring to Fig. 3, a schematic diagram of the robot in the mobile navigation system of an embodiment of the present invention is shown. In the present embodiment, there may be one or more robots 20. The robot 20 includes, but is not limited to, a battery 210, a walking unit 211, a wireless unit 212 and a controller 213, which are electrically connected to one another.
In the present embodiment, the battery 210 powers the walking unit 211, the wireless unit 212 and the controller 213. The walking unit 211 walks according to the movement commands received by the robot 20, and may be wheeled, tracked or legged. The wireless unit 212 provides the network connection between the robot 20 and the central controller 10. The controller 213 controls the walking unit 211 to walk along the virtual path, and can also control the walking speed and direction of the robot 20.
In the present embodiment, the robot 20 may also include a charging unit (not shown) for supplying electricity to the battery 210.
In the present embodiment, the central controller 10 identifies the robot 20 by shooting an image of it with a camera 30. Specifically, the robot 20 has a unique shape; after a camera 30 captures an image of the robot 20, it sends the image to the central controller 10, which uses image-recognition techniques to compare the image of the robot 20 pre-stored in the memory 112 with the image shot by the camera 30. When the similarity between the pre-stored image and the shot image of the robot 20 is greater than or equal to a preset value, the central controller 10 recognises the robot 20 and can send control commands to it; when the similarity is less than the preset value, the central controller 10 cannot recognise the robot 20.
In another embodiment, the central controller 10 identifies the robot 20 by scanning, with a camera 30, a QR code sprayed or pasted on the surface of the robot 20. Specifically, a camera 30 scans the QR code on the top surface of the robot 20 to obtain the robot's code information and sends it to the central controller 10. The central controller 10 compares the code information of the robot 20 pre-stored in the memory 112 with the code information obtained from the scanned QR code. When the pre-stored code information and the obtained code information are consistent, the central controller 10 recognises the robot 20 and can send control commands to it; when they are inconsistent, the central controller 10 cannot recognise the robot 20.
It will be understood that the surface of the robot 20 may instead carry sprayed or pasted markers of different colours or shapes, so that the central controller 10 can identify the robot 20 by scanning the markers with a camera 30.
It should be noted that the central controller 10 can both identify the robot 20 by scanning it with a camera 30 and position the robot 20 from the images scanned by the camera 30; the details are described below.
In the present embodiment, after the central controller 10 identifies the robot 20, it receives the starting-point and end-point information input by the user from the input/output interface 110. The central controller 10 generates a virtual path on the virtual map of the navigation space according to the input starting-point and end-point information. The robot 20 receives the movement commands sent by the central controller 10 and walks along the virtual path according to them. The central controller 10 judges whether the robot 20 has reached the next location point on the virtual path and, after confirming that it has, determines from the virtual path whether the robot 20 needs to turn at that location point. When it determines that the robot 20 needs to turn at the next location point, it sends turn information to the robot 20. After receiving the turn information and turning accordingly, the robot 20 continues to walk to the next location point on the virtual path, until the robot 20 reaches the end point.
In the present embodiment, the scheme by which the central controller 10 judges whether the robot 20 has reached the next location point on the virtual path is specifically as follows:
The central controller 10 establishes the second coordinate system (X'O'Y') with the lower-left corner of the image shot by the camera 30 that can scan the next location point as the origin O', the lateral direction as the X' axis and the longitudinal direction as the Y' axis. The central controller 10 determines the robot 20's position coordinates (X', Y') in the second coordinate system (X'O'Y'); these position coordinates correspond to pixels in the image. According to the correspondence between the first coordinate system (XOY) and the second coordinate system (X'O'Y'), the central controller 10 converts the robot 20's position coordinates (X', Y') in the image into the robot 20's coordinates (X, Y) in the first coordinate system (XOY), so that the central controller 10 can confirm whether the robot has reached the next location point.
Specifically, the central controller 10 obtains the image currently shot by the camera 30. When the robot 20's position coordinates (X'1, Y'1) in the current image match the robot 20's position coordinates (X'2, Y'2) in the location image corresponding to the next location point, the central controller 10 determines that the robot 20 has reached the next location point. The location image corresponding to the next location point, and the robot 20's position coordinates (X'2, Y'2) in it, are stored in advance in the memory 112.
In one embodiment, the position coordinates (X'1, Y'1) are considered to match the position coordinates (X'2, Y'2) when "X'1" lies in the interval [X'2-M, X'2+M] and "Y'1" lies in the interval [Y'2-N, Y'2+N]. The values of "M" and "N" can be preset; for example, M and N may each equal 2, 3, or another value.
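The tolerance test above amounts to two interval checks; a minimal sketch (the function name `coords_match` is illustrative):

```python
def coords_match(current, target, m=2, n=2):
    """Return True when the robot's current image coordinates lie within
    the preset tolerance box [X'2-M, X'2+M] x [Y'2-N, Y'2+N]."""
    (x1, y1), (x2, y2) = current, target
    return (x2 - m <= x1 <= x2 + m) and (y2 - n <= y1 <= y2 + n)

print(coords_match((101, 49), (100, 50)))  # inside the tolerance box -> True
print(coords_match((105, 50), (100, 50)))  # X' deviates by more than M -> False
```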
Referring to FIG. 4, a schematic diagram of the mobile navigation system 100 navigating in a navigation space in an embodiment of the present invention is shown. In the present embodiment, the navigation space is an indoor workshop containing six production lines, namely production line 1 to production line 6. Multiple cameras 30 are installed at an elevated position (such as the ceiling) in the navigation space. For example, in FIG. 4, one camera 30 is installed at each of the four corners of the ceiling of the indoor workshop. The range A1 that the camera 30 installed at the upper-left corner of the indoor workshop can cover includes production line 1 and production line 3; the range covered by the camera 30 installed at the upper-right corner of the indoor workshop includes production line 2; the range covered by the camera 30 installed at the lower-left corner of the indoor workshop includes production line 5; and the range A2 covered by the camera 30 installed at the lower-right corner of the indoor workshop includes production line 4 and production line 6.
In the present embodiment, the robot 20 needs to travel from the starting point (a dwell point at the upper-left corner of the navigation space) to the terminating point (a dwell point at the lower-right corner of the navigation space). After the central controller 10 identifies the robot 20, it receives the starting-point and terminating-point information entered by the user through the input/output interface 110. The central controller 10 generates a virtual route on the virtual map of the navigation space according to the starting-point and terminating-point information. As shown in FIG. 4, the virtual route includes two dwell points and four turning points. The two dwell points are the first dwell point S1 and the second dwell point S2; the four turning points are the first turning point T1, the second turning point T2, the third turning point T3, and the fourth turning point T4.
The robot 20 receives the move command sent by the central controller 10 and walks along the virtual route according to the command. The move command includes movement-speed information. The central controller 10 judges whether the robot 20 is about to reach the first turning point T1 according to the distance between the first dwell point S1 and the first turning point T1 and the speed at which the robot 20 moves. In the present embodiment, from the distance between the first dwell point S1 and the first turning point T1 and the movement speed of the robot 20, the central controller 10 can calculate the time the robot 20 needs to move to the first turning point T1, and can thereby judge from this time whether the robot 20 is about to reach the first turning point T1. For example, the central controller 10 starts a timer after sending the move command to the robot 20; when the difference between the timed duration and the calculated time is less than a preset value, the robot 20 is about to reach the first turning point T1.
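The distance-over-speed estimate plus the countdown comparison can be sketched as follows (a minimal example under the patent's description; the names `eta_seconds` and `about_to_arrive` and the numeric values are illustrative):

```python
def eta_seconds(distance_m: float, speed_m_s: float) -> float:
    """Time the robot needs to cover `distance_m` at speed `speed_m_s`."""
    return distance_m / speed_m_s

def about_to_arrive(elapsed_s: float, eta_s: float, preset_s: float = 1.0) -> bool:
    """True when the time elapsed since the move command is within
    `preset_s` seconds of the estimated arrival time."""
    return abs(eta_s - elapsed_s) < preset_s

eta = eta_seconds(distance_m=12.0, speed_m_s=0.5)  # 24 s from S1 to T1
print(about_to_arrive(elapsed_s=23.5, eta_s=eta))  # within the preset window -> True
print(about_to_arrive(elapsed_s=10.0, eta_s=eta))  # still far from T1 -> False
```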
When the robot 20 is about to reach the first turning point T1, the central controller 10 determines, from the pre-stored location information of the cameras 30 and the location information of the first turning point T1, the camera 30 corresponding to the first turning point T1 (the one installed at the upper-left corner of the indoor workshop), and controls that camera 30 to capture an image of the robot 20 and send the image of the robot 20 to the central controller 10. The central controller 10 compares the captured image with the position image of the robot 20 at the first turning point T1 pre-stored in the memory 112 to confirm whether the robot 20 has reached the first turning point T1. After the robot 20 has reached the first turning point T1, the central controller 10 sends left-turn steering information to the robot 20. After receiving the steering information, the robot 20 turns left accordingly and walks on toward the second turning point T2.
The central controller 10 judges whether the robot 20 is about to reach the second turning point T2 according to the distance between the first turning point T1 and the second turning point T2 and the speed at which the robot 20 moves. Because the second turning point T2 lies within the scanning range of the camera 30 installed at the lower-right corner, when the robot 20 is about to reach the second turning point T2, the camera 30 corresponding to the second turning point T2 (installed at the lower-right corner of the navigation space) captures an image of the robot 20 and sends the captured image of the robot 20 to the central controller 10. The central controller 10 compares the captured image with the position image of the robot 20 at the second turning point T2 pre-stored in the memory 112 to confirm whether the robot 20 has reached the second turning point T2. After the central controller 10 confirms that the robot 20 has reached the second turning point T2, it sends right-turn steering information to the robot 20. After receiving the steering information, the robot 20 turns right accordingly and walks on toward the third turning point T3. The cycle repeats until the robot 20 reaches the second dwell point S2 (the terminating point).
Referring to FIG. 5, a flowchart of the mobile navigation method in an embodiment of the present invention is shown. Depending on different demands, the order of the steps in the flowchart may change, and certain steps may be omitted or merged.
Step S51: after the central controller 10 identifies the robot 20, it receives the starting-point and terminating-point information input by the user.
In the present embodiment, the central controller 10 identifies the robot 20 by having the camera 30 capture an image of the robot 20. Specifically, the robot 20 has a unique shape; after capturing an image of the robot 20, the camera 30 sends the image to the central controller 10, and the central controller 10 identifies the robot 20 by comparing the image of the robot 20 pre-stored in the memory 112 with the image captured by the camera 30.
The central controller 10 can also identify the robot 20 by having the camera 30 scan a QR code sprayed or pasted on the surface of the robot 20.
It can be understood that markers of different colors or shapes can also be sprayed or pasted on the surface of the robot 20, so that the central controller 10 can identify the robot 20 by having the camera 30 scan the markers.
It should be noted that, while the central controller 10 identifies the robot 20 by scanning it through the camera 30, it simultaneously positions the robot 20 according to the image scanned by the camera 30.
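Identification by comparing a captured image against the stored image can be sketched as a pixel-difference check; a minimal grayscale example (the name `matches_template` and the threshold are illustrative assumptions, not from the patent — a practical system would use a robust template matcher or a QR decoder instead):

```python
def matches_template(captured, template, max_mean_diff=10.0):
    """Compare two same-sized grayscale images (lists of pixel rows);
    the robot is recognized when the mean absolute pixel difference
    stays below the preset threshold."""
    diffs = [abs(c - t)
             for crow, trow in zip(captured, template)
             for c, t in zip(crow, trow)]
    return sum(diffs) / len(diffs) < max_mean_diff

template = [[10, 20], [30, 40]]   # image of the robot stored in memory
captured = [[12, 19], [28, 41]]   # freshly captured image
print(matches_template(captured, template))  # small difference -> True
```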
Step S52: the central controller 10 generates a virtual route according to the starting-point and terminating-point information. The virtual route consists of location points and the connection relations between them. The location points include dwell points and turning points. In the present embodiment, a dwell point is defined as a point in the virtual route at which the robot 20 needs to stop for a preset time, for example the starting point and the terminating point. A turning point is defined as a point in the virtual route at which the robot 20 may change its direction of travel.
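The virtual route as defined above — location points plus connection relations — can be sketched as a small data structure (the names `LocationPoint` and `route`, and all coordinates, are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class LocationPoint:
    name: str
    x: float   # map coordinates in the first coordinate system (XOY)
    y: float
    kind: str  # "dwell" (robot stops for a preset time) or "turn"

# Route of FIG. 4: two dwell points and four turning points, connected in order.
route = [
    LocationPoint("S1", 0.0, 30.0, "dwell"),   # starting point
    LocationPoint("T1", 10.0, 30.0, "turn"),
    LocationPoint("T2", 10.0, 20.0, "turn"),
    LocationPoint("T3", 25.0, 20.0, "turn"),
    LocationPoint("T4", 25.0, 5.0, "turn"),
    LocationPoint("S2", 40.0, 5.0, "dwell"),   # terminating point
]
print([p.name for p in route if p.kind == "turn"])  # the four turning points
```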
Step S53: the robot 20 receives the move command sent by the central controller 10 and walks along the virtual route according to the command. The move command includes walking-speed information. In the present embodiment, the controller 213 of the robot 20 controls the walking unit 211 to walk along the virtual route according to the move command.
Step S54: the central controller 10 judges whether the robot 20 has reached the next location point in the virtual route. When the robot 20 has reached the next location point on the virtual route, the flow proceeds to step S55; when the robot 20 has not reached the next location point on the virtual route, the flow returns to step S53.
In the present embodiment, the central controller 10 calculates the time the robot 20 needs to move to the next location point according to the distance between the starting point and the next location point in the virtual route and the movement speed of the robot 20, and judges from this time whether the robot 20 is about to reach the next location point. For example, the central controller 10 starts a timer after sending the move command to the robot 20; when the difference between the timed duration and the calculated time is less than a preset value, the robot 20 is about to reach the next location point.
When the robot 20 is about to reach the next location point, the central controller 10 controls the camera 30 corresponding to the next location point to capture a position image of the robot 20 and send the position image to the central controller 10.
Taking the lower-left corner of the image captured by the camera 30 that covers the next location point as the origin O', the horizontal direction as the X' axis, and the vertical direction as the Y' axis, the central controller 10 establishes a second coordinate system (X'O'Y'). The central controller 10 determines the position coordinates (X', Y') of the robot 20 in the second coordinate system (X'O'Y'); the position coordinates (X', Y') correspond to a pixel in the image. According to the correspondence between the first coordinate system (XOY) and the second coordinate system (X'O'Y'), the central controller 10 converts the position coordinates (X', Y') of the robot 20 in the image into the coordinates (X, Y) of the robot 20 in the first coordinate system (XOY); the central controller 10 can thereby confirm whether the robot has reached the next location point.
Specifically, the central controller 10 obtains the image currently captured by the camera 30. When the position coordinates (X'1, Y'1) of the robot 20 in the currently captured image match the position coordinates (X'2, Y'2) of the robot 20 in the position image corresponding to the next location point, the central controller 10 can determine that the robot 20 has moved to the next location point. The position image corresponding to the next location point, and the position coordinates (X'2, Y'2) of the robot 20 in that position image, are stored in advance in the memory 112.
Step S55: the central controller 10 judges whether the robot 20 needs to turn at the next location point. When the robot 20 needs to turn at the next location point, the flow proceeds to step S56; when the robot 20 does not need to turn at the next location point, the flow returns to step S53.
In the present embodiment, the central controller 10 can judge from the virtual route whether the robot 20 needs to turn at the next location point. For example, as can be seen from FIG. 4, the robot 20 needs to turn at the first turning point T1, the second turning point T2, and the fourth turning point T4, but does not need to turn at the third turning point T3.
Step S56: the robot 20 receives the steering command sent by the central controller 10, turns according to the steering command, and then continues to walk along the virtual route.
Step S57: the central controller 10 judges whether the robot has reached the terminating point. When the robot 20 has reached the terminating point, the flow ends; when the robot 20 has not reached the terminating point, the flow returns to step S53.
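Steps S53 through S57 form a supervision loop in the central controller; a minimal sketch tying them together (the helper names `has_reached` and `needs_turn` are illustrative stand-ins for the camera-based arrival check and the route lookup described above):

```python
def navigate(route, has_reached, needs_turn, send):
    """Drive the robot point-to-point along `route` (steps S53-S57).
    `has_reached(point)` is the camera-based arrival check (S54);
    `needs_turn(point)` consults the virtual route (S55);
    `send(command)` forwards a command to the robot."""
    send("move")                        # S53: start walking along the route
    for point in route[1:]:             # every point after the starting point
        while not has_reached(point):   # S54: wait until arrival is confirmed
            pass
        if point == route[-1]:          # S57: terminating point reached
            send("stop")
            return
        if needs_turn(point):           # S55/S56: steer, then walk on
            send("turn")

# Dry run against the FIG. 4 route, pretending every point is reached at once.
route = ["S1", "T1", "T2", "T3", "T4", "S2"]
log = []
navigate(route, has_reached=lambda p: True,
         needs_turn=lambda p: p in {"T1", "T2", "T4"},
         send=log.append)
print(log)  # ['move', 'turn', 'turn', 'turn', 'stop']
```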
The above embodiments are merely illustrative of the technical solution of the present invention and not restrictive. Although the invention has been described in detail with reference to the above preferred embodiments, those of ordinary skill in the art will understand that the technical solution of the present invention may be modified or equivalently replaced without departing from the spirit and scope of the technical solution of the present invention.

Claims (16)

1. A mobile navigation system, comprising a central controller, a robot, and a camera, wherein the central controller is communicatively connected to the robot and the camera respectively, the system comprising:
the central controller, configured to receive entered starting-point and terminating-point information after identifying the robot;
the central controller, further configured to generate a virtual route on a virtual map of a navigation space according to the starting-point and terminating-point information, wherein the virtual route comprises multiple location points;
the robot, configured to receive a move command sent by the central controller and walk along the virtual route according to the move command;
the central controller, further configured to judge through the camera whether the robot has reached a next location point in the virtual route; and
when the robot reaches the next location point in the virtual route, the central controller judges whether the next location point is the terminating point, and ends the navigation when the next location point is the terminating point.
2. The mobile navigation system of claim 1, wherein the system further comprises: the central controller judges whether the robot needs to turn at the next location point, and when the robot needs to turn at the next location point, the central controller sends a steering command to the robot.
3. The mobile navigation system of claim 1, wherein the central controller comprises a memory in which are pre-stored the position coordinates of each location point on the virtual route, a position image of the robot at each location point, and position coordinates in each position image, wherein the position coordinates of each location point on the virtual route are coordinates in a first coordinate system established based on a region of the navigation space, and the position coordinates in each position image are coordinates in a second coordinate system established based on that position image.
4. The mobile navigation system of claim 3, wherein the memory also pre-stores the correspondence between the first coordinate system and the second coordinate system, and the central controller converts the position coordinates of the robot in the position image into the coordinates of the robot in the first coordinate system according to the correspondence between the first coordinate system and the second coordinate system, so as to determine the location point of the robot on the virtual route.
5. The mobile navigation system of claim 4, wherein the central controller calculates, according to the distance between the starting point and the next location point in the virtual route and the movement speed of the robot, the time the robot needs to move to the next location point, and judges from this time whether the robot is about to reach the next location point.
6. The mobile navigation system of claim 5, wherein, when the robot is about to reach the next location point, the central controller controls the camera corresponding to the next location point to capture a position image of the robot and send the position image to the central controller.
7. The mobile navigation system of claim 6, wherein, when the central controller judges that the position coordinates in the position image match the position coordinates in the pre-stored position image of the robot corresponding to the next location point, it determines that the robot has moved to the next location point.
8. A mobile navigation method, applied in a central controller communicatively connected to a robot and a camera respectively, wherein the method comprises:
after the central controller identifies the robot, receiving entered starting-point and terminating-point information;
generating, by the central controller, a virtual route on a virtual map of a navigation space according to the starting-point and terminating-point information;
receiving, by the robot, a move command sent by the central controller, and walking along the virtual route according to the move command;
judging, by the central controller, whether the robot has reached a next location point in the virtual route;
when the robot reaches the next location point in the virtual route, judging, by the central controller, whether the robot needs to turn at the next location point; and
when the robot needs to turn at the next location point, receiving, by the robot, a steering command sent by the central controller, turning according to the steering command, and then continuing to walk along the virtual route until the terminating point is reached.
9. The mobile navigation method of claim 8, wherein the step of "judging, by the central controller, whether the robot has reached a next location point in the virtual route" comprises:
calculating, by the central controller, according to the distance between the starting point and the next location point in the virtual route and the movement speed of the robot, the time the robot needs to move to the next location point, and judging from this time whether the robot is about to reach the next location point;
when the robot is about to reach the next location point, controlling, by the central controller, the camera corresponding to the next location point to capture a position image of the robot, and sending the position image to the central controller; and
when the position coordinates in the position image match the position coordinates in the pre-stored position image of the robot corresponding to the next location point, determining, by the central controller, that the robot has moved to the next location point.
10. A central controller for controlling the mobile navigation of a robot, wherein the central controller comprises:
an input/output interface, configured to receive entered starting-point and terminating-point information;
a network unit, configured to provide communication connections between the central controller and a robot and a camera;
a memory, configured to store data; and
a processor, configured to generate a virtual route on a virtual map of a navigation space according to the starting-point and terminating-point information, wherein the virtual route comprises multiple location points;
the processor, further configured to send a move command to the robot so as to control the robot to walk along the virtual route according to the move command; and
the processor, further configured to judge through the camera whether the robot has reached a next location point in the virtual route, and, when the robot reaches the next location point in the virtual route, to judge whether the next location point is the terminating point and end the navigation when the next location point is the terminating point.
11. The central controller of claim 10, wherein the processor is further configured to judge whether the robot needs to turn at the next location point, and, when the robot needs to turn at the next location point, to send a steering command to the robot.
12. The central controller of claim 10, wherein the memory pre-stores the position coordinates of each location point on the virtual route, a position image of the robot at each location point, and position coordinates in each position image, wherein the position coordinates of each location point on the virtual route are coordinates in a first coordinate system established based on a region of the navigation space, and the position coordinates in each position image are coordinates in a second coordinate system established based on that position image.
13. The central controller of claim 12, wherein the memory also pre-stores the correspondence between the first coordinate system and the second coordinate system, and the central controller converts the position coordinates of the robot in the position image into the coordinates of the robot in the first coordinate system according to the correspondence between the first coordinate system and the second coordinate system, so as to determine the location point of the robot on the virtual route.
14. The central controller of claim 13, wherein the processor calculates, according to the distance between the starting point and the next location point in the virtual route and the movement speed of the robot, the time the robot needs to move to the next location point, and judges from this time whether the robot is about to reach the next location point.
15. The central controller of claim 14, wherein, when the robot is about to reach the next location point, the processor controls the camera corresponding to the next location point to capture a position image of the robot and send the position image to the central controller.
16. The central controller of claim 15, wherein, when the processor judges that the position coordinates of the position image match the position coordinates in the pre-stored position image of the robot corresponding to the next location point, it determines that the robot has moved to the next location point.
CN201711342765.3A 2017-12-14 2017-12-14 Central controller and the system and method for utilizing the central controller mobile navigation Active CN108318050B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711342765.3A CN108318050B (en) 2017-12-14 2017-12-14 Central controller and the system and method for utilizing the central controller mobile navigation

Publications (2)

Publication Number Publication Date
CN108318050A true CN108318050A (en) 2018-07-24
CN108318050B CN108318050B (en) 2019-08-23

Family

ID=62892713

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711342765.3A Active CN108318050B (en) 2017-12-14 2017-12-14 Central controller and the system and method for utilizing the central controller mobile navigation

Country Status (1)

Country Link
CN (1) CN108318050B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110426030A (en) * 2019-08-14 2019-11-08 金同磊 Indoor navigation method and system
CN111198556A (en) * 2018-10-31 2020-05-26 富华科精密工业(深圳)有限公司 Automatic navigation method, central controller and storage medium
CN111435245A (en) * 2018-12-26 2020-07-21 富华科精密工业(深圳)有限公司 Automatic navigation system and method
CN111829510A (en) * 2019-04-15 2020-10-27 富华科精密工业(深圳)有限公司 Automatic navigation method, server and storage medium
CN113391631A (en) * 2021-05-11 2021-09-14 北京迈格威科技有限公司 Operation control method and device for mobile device, storage medium and mobile device

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101231174A (en) * 2007-01-25 2008-07-30 乐金电子(昆山)电脑有限公司 Method for guiding residual distance/estimate run time of mobile object in guidance system
CN101556647A (en) * 2009-05-20 2009-10-14 哈尔滨理工大学 mobile robot visual orientation method based on improved SIFT algorithm
US20110098923A1 (en) * 2009-10-26 2011-04-28 Electronics And Telecommunications Research Institute Method of and apparatus for creating map of artificial marks, and method and apparatus for measuring position of moving object using the map
CN103197679A (en) * 2013-03-22 2013-07-10 长沙理工大学 Accurate positioning method for orbit type routing-inspection robot
CN103717358A (en) * 2011-08-02 2014-04-09 索尼公司 Control system, display control method, and non-transitory computer readable storage medium
CN104199450A (en) * 2014-09-17 2014-12-10 上海畔慧信息技术有限公司 Swarm robot control system
CN105116886A (en) * 2015-08-11 2015-12-02 余路 Robot autonomous walking method
CN105352508A (en) * 2015-10-22 2016-02-24 深圳创想未来机器人有限公司 Method and device of robot positioning and navigation
CN105698784A (en) * 2016-03-22 2016-06-22 成都电科创品机器人科技有限公司 Indoor robot positioning system and method
CN106017458A (en) * 2016-05-18 2016-10-12 宁波华狮智能科技有限公司 Combined navigation method and device for mobile robot
CN106291517A (en) * 2016-08-12 2017-01-04 苏州大学 Indoor cloud robot angle positioning method based on position and visual information optimization
CN106863332A (en) * 2017-04-27 2017-06-20 广东工业大学 A kind of robot visual orientation method and system
CN106931945A (en) * 2017-03-10 2017-07-07 上海木爷机器人技术有限公司 Robot navigation method and system



Also Published As

Publication number Publication date
CN108318050B (en) 2019-08-23

Similar Documents

Publication Publication Date Title
CN108318050B (en) Central controller and the system and method for utilizing the central controller mobile navigation
WO2019144541A1 (en) Cleaning robot
US9075444B2 (en) Information input apparatus, information input method, and computer program
JP6659317B2 (en) Position and orientation estimation device, position and orientation estimation program, and vacuum cleaner system
CN106774310A (en) A kind of robot navigation method
CN107000839A (en) The control method of unmanned plane, device, the control system of equipment and unmanned plane
US20170203439A1 (en) System for operating mobile robot based on complex map information and operating method thereof
CN103135756A (en) Method and system for generating control instruction
CN105242670A (en) Robot having function of automatic return charging, system and corresponding method
CN105486311A (en) Indoor robot positioning navigation method and device
CN102902271A (en) Binocular vision-based robot target identifying and gripping system and method
CN105737820A (en) Positioning and navigation method for indoor robot
KR20180067724A (en) Interfacing with a mobile telepresence robot
KR20130031088A (en) Mobile robot and controlling method of the same
CN110477825A (en) Clean robot, recharging method, system and readable storage medium storing program for executing
CN110202573A (en) Full-automatic hand and eye calibrating, working face scaling method and device
US20230057965A1 (en) Robot and control method therefor
CN106970618A (en) A kind of unmanned boat control method and system
CN107167138A (en) A kind of intelligent Way guidance system and method in library
WO2019001237A1 (en) Mobile electronic device, and method in mobile electronic device
WO2018014420A1 (en) Light-emitting target recognition-based unmanned aerial vehicle tracking control system and method
CN110515383A (en) The method and mobile robot of recharging
KR102190743B1 (en) AUGMENTED REALITY SERVICE PROVIDING APPARATUS INTERACTING WITH ROBOT and METHOD OF THEREOF
WO2022017341A1 (en) Automatic recharging method and apparatus, storage medium, charging base, and system
CN108873911A (en) It is a kind of that luggage case and its control method are followed based on ROS automatically

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221205

Address after: The first floor, the second floor, the third floor and the fourth floor of the factory building No.1, f8d District, Foxconn science and Technology Industrial Park, east side of Minqing Road, Longhua street, Shenzhen City, Guangdong Province

Patentee after: Shenzhen Fulian Fugui Precision Industry Co.,Ltd.

Address before: 518109 3rd floor, building 1, F8B, Foxconn Science Park, No.2, Donghuan 2nd Road, Longhua street, Longhua District, Shenzhen City, Guangdong Province

Patentee before: FUHUAKE PRECISION INDUSTRY (SHENZHEN) Co.,Ltd.