CN207115193U - A mobile electronic device for processing a task in a task area - Google Patents

A mobile electronic device for processing a task in a task area

Info

Publication number
CN207115193U
CN207115193U CN201720917099.0U
Authority
CN
China
Prior art keywords
electronic device
mobile electronic
camera
module
path planning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201720917099.0U
Other languages
Chinese (zh)
Inventor
潘景良
陈灼
李腾
陈嘉宏
高鲁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ju Da Technology Co Ltd
Original Assignee
Ju Da Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ju Da Technology Co Ltd filed Critical Ju Da Technology Co Ltd
Priority to CN201720917099.0U priority Critical patent/CN207115193U/en
Application granted granted Critical
Publication of CN207115193U publication Critical patent/CN207115193U/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

A mobile electronic device for processing a task in a task area includes a first wireless signal transceiver, an image processor, a positioning module, a path planning module and a motion module. The first wireless signal transceiver obtains a photo of the task site taken by a user of a second mobile electronic device, together with a selected area marked on the photo. The image processor extracts feature information from the photo containing the selected area and, by comparing the extracted feature information with the feature information of a stored image map that contains position information, determines the actual coordinate range corresponding to the selected area in the photo. The positioning module records the distance between the current position of the mobile electronic device and the actual coordinate range of the task area. The path planning module generates a path planning scheme according to the actual coordinate range of the selected area. The motion module moves according to the path planning scheme.

Description

A mobile electronic device for processing a task in a task area
Technical field
The utility model relates to the field of electronic devices, and in particular to the field of intelligent robot systems.
Background technology
A traditional sweeping robot either positions itself autonomously and moves according to a scanned map, or wanders randomly and changes direction on collision, while cleaning the floor. Because its mapping and positioning technology is immature or inaccurate, a traditional sweeping robot cannot fully judge complex floor conditions during operation and easily loses track of its position and heading. In addition, some models have no positioning capability at all and can only change direction through the physics of collision, which may damage household articles or the robot itself, cause personal injury, and disturb the user.
Utility model content
The utility model proposes a technique in which a user delineates a target operating region (e.g. a cleaning region) with a mobile phone APP and sends a command to a robot, which then automatically completes the operation (e.g. cleaning) in the delineated region. To realize this delineation function, three ways of building an indoor environment map are proposed. In addition, a path planning algorithm is provided that accurately reaches the region delineated in the mobile phone APP and effectively covers it to complete the cleaning task.
One embodiment of the utility model discloses a mobile electronic device for processing a task in a task area, including a first wireless signal transceiver, an image processor, a positioning module, a path planning module and a motion module, wherein: the first wireless signal transceiver is communicatively connected to a second mobile electronic device and is configured to obtain a photo of the task site taken by a user of the second mobile electronic device and a selected area marked on the photo; the image processor is communicatively connected to the first wireless signal transceiver and is configured to extract feature information from the photo containing the selected area and, by comparing the extracted feature information with the feature information of a stored image map that contains position information, determine the actual coordinate range corresponding to the selected area in the photo; the positioning module is communicatively connected to the image processor and is configured to record the distance between the current position of the mobile electronic device and the actual coordinate range corresponding to the selected area; the path planning module is communicatively connected to the image processor and is configured to generate a path planning scheme according to the actual coordinate range corresponding to the selected area; and the motion module is communicatively connected to the path planning module and is configured to move according to the path planning scheme.
Brief description of the drawings
A more complete understanding of the utility model can be obtained from the detailed description given with reference to the accompanying drawings, in which like reference numerals refer to like parts.
Fig. 1 shows a schematic diagram of the system in which the mobile electronic device according to one embodiment of the utility model is located.
Fig. 2A and Fig. 2B respectively show a task area captured by the second camera of the second mobile electronic device according to one embodiment of the utility model, and the delineation of the task area on the second mobile electronic device.
Fig. 3 shows a schematic diagram of the system in which the mobile electronic device and the second mobile electronic device according to one embodiment of the utility model are located.
Fig. 4 shows a flow chart of a method in a mobile electronic device according to one embodiment of the utility model.
Fig. 5 shows a schematic diagram of the checkerboard of alternating black and white rectangles displayed on the display screen of the mobile electronic device 100.
Embodiment
Fig. 1 shows a schematic diagram of the system in which the mobile electronic device according to one embodiment of the utility model is located.
Referring to Fig. 1, the mobile electronic device 100 includes, but is not limited to, a sweeping robot, an industrial automation robot, a service robot, a rescue and disaster-relief robot, an underwater robot, a space robot, an unmanned aerial vehicle, and the like. It will be appreciated that, to distinguish it from the second mobile electronic device 140 described below, the mobile electronic device 100 is also referred to as the first mobile electronic device 100.
The second mobile electronic device 140 includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a remote controller, and the like. The mobile electronic device optionally includes an operating interface. In an optional embodiment, the second mobile electronic device is a mobile phone and the operating interface is a mobile phone APP.
Signal transmission between the mobile electronic device 100 and the charging pile 160 includes, but is not limited to, Bluetooth, WIFI, ZigBee, infrared, ultrasonic, ultra-wide band (Ultra-wide Bandwidth, UWB), and the like. In the present embodiment, WIFI is taken as the example signal transmission mode.
The task area is the place where the mobile electronic device 100 performs its task. For example, when the task of the mobile electronic device 100 is floor sweeping, the task area is the region that the sweeping robot needs to clean. As another example, when the task of the mobile electronic device 100 is rescue and disaster relief, the task area is the region where the rescue robot needs to operate. The task site is the place that contains the entire task area.
As shown in Fig. 1, the mobile electronic device for processing a task in a task area includes a first wireless signal transceiver 102, an image processor 104, a positioning module 106, a path planning module 108 and a motion module 110. The first wireless signal transceiver 102 can be communicatively connected to the second mobile electronic device 140 and is configured to obtain a photo of the task site taken by the user of the second mobile electronic device 140 and a selected area marked on the photo.
Fig. 2A and Fig. 2B respectively show a task area captured by the second camera 144 of the second mobile electronic device 140 according to one embodiment of the utility model, and the delineation of the selected area by the user of the second mobile electronic device 140.
Take the second mobile electronic device 140 being a mobile phone and the task area being a cleaning region as an example. As shown in Fig. 2A and Fig. 2B, when a cleaning task is started, the user of the second mobile electronic device 140 uses the mobile phone APP to photograph the spot that needs cleaning with the second camera 144 on the second mobile electronic device 140 (as shown in Fig. 2A) and delineates the target cleaning region on the photo (as shown in Fig. 2B). The photo (containing the delineated target cleaning region) is sent to the mobile electronic device 100 over a local wireless communication network (WIFI or the like) and stored in the memory 116.
The image processor 104 is communicatively connected to the first wireless signal transceiver 102 and is configured to extract feature information from the photo containing the selected area and, by comparing the extracted feature information with the feature information of the stored image map containing position information, determine the actual coordinate range corresponding to the selected area in the photo. The position information refers to the locations of the image feature points recorded while the map was being built, i.e. their real coordinate positions in the image map. The position information includes, for example, the position of the charging pile 180 and/or the position of the mobile electronic device 100 itself. For example, the image processor 104 may use the position of the charging pile 180 as the origin of coordinates.
The memory 116 of the mobile electronic device 100 stores the image map built during first use when the indoor environment map was established, for example indoor image map information including image feature points and their position information. The image processor 104 also extracts feature information and position information from the captured photo, and uses an image feature point matching algorithm (for example SIFT or SURF) to quickly compare it against the indoor image map (which contains position information) in the memory 116. Based on the pixel features and relative position of the area selected by the user of the second mobile electronic device 140 in the mobile phone APP, the image processor 104 in the mobile electronic device 100 compares the image feature points in the indoor image map and determines the coordinate range of the indoor actual area corresponding to the user's selected area in the photo. The indoor actual area corresponding to the user's selected area can be determined as follows. For example, the image processor 104 may enlarge the user's original selection, such as the region indicated by the circled finger pattern in Fig. 2B, by an appropriate percentage, for example 10%, so that the selected area indicated by the finger pattern is guaranteed to lie within the actual cleaning range, and take the enlarged region as the actual coordinate range. Alternatively, the image processor 104 may offset the original region outwards by a certain distance to determine the actual coordinate range. Alternatively, the image processor 104 may loosely construct a test pattern that contains the actual coordinate range; for example, the range indicated by the finger pattern in the figure is an irregular, roughly rectangular shape, and the image processor 104 can convert this approximate rectangle into the actual coordinate range of a corresponding rectangle, which makes it easier for the mobile electronic device to clean and complete the cleaning task. Image feature points can be identified with the Scale Invariant Feature Transform (SIFT) algorithm or the Speeded Up Robust Features (SURF) algorithm. When the SIFT algorithm is used, a reference image must be stored in the memory 116. The image processor 104 first identifies object key points in the reference image stored in the memory 116 and extracts their SIFT features, then compares the SIFT features of each key point in the memory with the SIFT features of the newly captured image, and recognizes objects in the new image by matching features based on the K-nearest-neighbour (KNN) method. The SURF algorithm is based on approximate 2D Haar wavelet responses, uses integral images for image convolution, employs a Hessian-matrix-based measure for the detector, and uses a distribution-based descriptor.
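As an illustration of the feature point matching step just described, the following is a minimal OpenCV sketch, not the patented implementation itself: it extracts SIFT features from the user's photo and a stored reference image of the map and keeps the KNN matches that pass Lowe's ratio test. The function name, the ratio threshold and the file names in the usage note are assumptions made for illustration only.

```python
import cv2

def match_photo_to_map(photo_gray, map_gray, ratio=0.75):
    """Return matched pixel coordinates (photo, map) between two grayscale images."""
    sift = cv2.SIFT_create()
    kp_photo, des_photo = sift.detectAndCompute(photo_gray, None)
    kp_map, des_map = sift.detectAndCompute(map_gray, None)

    # K-nearest-neighbour matching (k=2) with Lowe's ratio test to keep
    # only distinctive correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des_photo, des_map, k=2)
    good = [m for m, n in knn if m.distance < ratio * n.distance]

    pts_photo = [kp_photo[m.queryIdx].pt for m in good]
    pts_map = [kp_map[m.trainIdx].pt for m in good]
    return pts_photo, pts_map

# Usage (assuming these image files exist):
# photo = cv2.imread("user_photo.jpg", cv2.IMREAD_GRAYSCALE)
# ref = cv2.imread("map_reference.jpg", cv2.IMREAD_GRAYSCALE)
# pts_photo, pts_map = match_photo_to_map(photo, ref)
```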
Alternatively or additionally, the coordinate range of the indoor actual area corresponding to the user's selected area in the photo can be determined by coordinate mapping and conversion. Once the feature points in the image from the second mobile electronic device 140 have been matched with the feature points in the image map, the real coordinate positions of those feature points are known. At the same time, the matching allows the transformation to be computed between the camera coordinate system of the photo taken by the user's camera and the real-world coordinate system at the charging pile. The boundary line of the delineated region in the image can be discretized into a boundary line made up of points. From the positions of the discretized boundary points relative to the image feature points, the real coordinate positions of the image feature points, and the coordinate system transformation, the real coordinate positions of the discretized boundary points in the real-world coordinate system (i.e. the charging pile coordinate system) can be calculated, that is, the coordinate range of the indoor actual area corresponding to the boundary line.
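One way to realize this coordinate mapping is sketched below, under the assumption that at least four matched feature points lie on the floor plane so that a homography between photo pixels and floor-plane world coordinates can be estimated; the function name and argument layout are illustrative rather than taken from the patent.

```python
import numpy as np
import cv2

def boundary_to_world(pts_photo, pts_world_xy, boundary_px):
    """pts_photo: Nx2 pixel coordinates of matched feature points in the user photo.
    pts_world_xy: Nx2 floor-plane (X, Y) coordinates of the same points from the map.
    boundary_px: Mx2 pixel coordinates of the discretized delineation boundary."""
    # Needs at least 4 correspondences; RANSAC rejects mismatched pairs.
    H, _ = cv2.findHomography(np.float32(pts_photo), np.float32(pts_world_xy), cv2.RANSAC)
    boundary = cv2.perspectiveTransform(np.float32(boundary_px).reshape(-1, 1, 2), H)
    return boundary.reshape(-1, 2)  # world (X, Y) of each boundary point

# The axis-aligned bounding box of the returned points, optionally inflated by
# about 10 %, can then serve as the actual coordinate range of the cleaning region.
```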
The positioning module 106 is communicatively connected to the image processor 104 and is configured to record the distance between the current position of the mobile electronic device 100 and the actual coordinate range corresponding to the selected area. For example, the positioning module 106 uses the location of the charging pile 180 as the origin of coordinates, and each point in the image has a corresponding coordinate value (X, Y). The positioning module 106 and the encoder enable the mobile electronic device 100 to know its own current position. The positioning module 106 is the module that calculates the indoor position of the first electronic device 100; whenever the first electronic device 100 needs to know its indoor position during operation, it does so through the positioning module 106.
The path planning module 108 is communicatively connected to the image processor 104 and is configured to generate a path planning scheme according to the actual coordinate range corresponding to the selected area. Optionally, the path planning module 108 also uses a grid-based spanning tree path planning algorithm to plan a path for the selected area. For example, the path planning module 108 plans an optimized cleaning path for the coordinate range of the generated region (the target cleaning region delineated by the user). Grid-based spanning tree path planning is used to plan the cleaning path that covers the selected target cleaning region: the corresponding coordinate region is divided into grid cells, a spanning tree is built over the grid nodes, and the Hamiltonian circuit that circumnavigates the spanning tree is used as the optimized cleaning path for the region.
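The grid-based spanning tree idea can be sketched as follows. This is a deliberately simplified illustration: it only builds a depth-first spanning tree over the coarse grid cells of the target region and returns an around-the-tree visiting order, whereas the full spanning-tree coverage algorithm circumnavigates the tree at sub-cell resolution to obtain the Hamiltonian-like cleaning loop.

```python
def spanning_tree_coverage(free_cells):
    """free_cells: set of (row, col) coarse grid cells inside the target region.
    Returns a cell visiting order that walks around a depth-first spanning tree."""
    start = next(iter(free_cells))
    visited, order, stack = {start}, [], [start]
    while stack:
        cell = stack[-1]
        order.append(cell)               # visited again on each backtracking pass
        r, c = cell
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if nb in free_cells and nb not in visited:
                visited.add(nb)          # extend the spanning tree to a new cell
                stack.append(nb)
                break
        else:
            stack.pop()                  # dead end: backtrack along the tree
    return order                         # every tree edge is traversed twice
```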
In addition, the mobile electronic device 100 is initially located at the intelligent charging pile 180. To reach the coordinate range of the selected area from the intelligent charging pile 180, the path planning module 108 first reads either the path that the mobile electronic device 100 followed when tracking the user (if the mobile electronic device 100 uses the follow mode) or the walking path recorded while the user of the second mobile electronic device 140 built the map during first use (if the mobile electronic device 100 did not follow the user), and uses it as the path for reaching the region. This approach path is then combined with the optimized cleaning path of the selected area to form the cleaning task path. The combination can be a simple in-order connection of the two path segments: the first segment reaches the target cleaning region, and the second segment optimally covers the delineated cleaning region and completes the cleaning task.
The above task is then sent to the mobile electronic device 100 and executed automatically. For example, the motion module 110 is communicatively connected to the path planning module 108 and is configured to move according to the path planning scheme.
The various ways in which the mobile device 100 builds the indoor environment map during first use are described separately below.
Mode one: the mobile electronic device 100 (e.g. a robot) includes a camera, and the user of the second mobile electronic device 140 wears a positioning receiver.
Optionally, the mobile electronic device 100 also includes a first camera 112, the second mobile electronic device 140 also includes a second wireless signal transceiver 142, and the mobile electronic device 100 is configured to operate in a map-building mode. The first wireless signal transceiver 102 and the second wireless signal transceiver 142 are each communicatively connected to a plurality of reference radio signal sources and are configured to determine the positions of the mobile electronic device 100 and the second mobile electronic device 140 from the signal strengths obtained from the plurality of reference radio signal sources. For example, the signals received from the reference radio signal sources can be converted into distance information by any method known in the art, including but not limited to the Time of Flight (ToF) algorithm, the Angle of Arrival (AoA) algorithm, the Time Difference of Arrival (TDOA) algorithm, and the Received Signal Strength (RSS) algorithm.
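For example, the RSS readings can be converted to ranges with a log-distance path-loss model and the position solved by linearized least squares over the reference sources, as in the following sketch; the path-loss constants and function names are assumptions that would need per-site calibration.

```python
import numpy as np

def rss_to_distance(rss_dbm, rss_at_1m=-40.0, path_loss_exp=2.0):
    # Log-distance path-loss model; the constants are illustrative assumptions.
    return 10 ** ((rss_at_1m - rss_dbm) / (10 * path_loss_exp))

def trilaterate(anchors, distances):
    """anchors: list of (x, y) reference-source positions; distances: measured ranges."""
    (x0, y0), d0 = anchors[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        # Subtracting the first range equation from the others linearizes the problem.
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos  # estimated (x, y) in the reference-source coordinate system
```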
The motion module 110 is configured to follow the motion of the second mobile electronic device 140 according to the positions of the mobile electronic device 100 and the second mobile electronic device 140. For example, the mobile electronic device 100 includes a monocular camera 112, and the user of the second mobile electronic device 140 wears a wireless positioning receiver bracelet or carries a mobile phone equipped with a wireless positioning receiver peripheral. Using the monocular camera 112 reduces hardware cost and computation cost while achieving the same effect as a depth camera; image depth information is not required, because distance and depth information are perceived by the ultrasonic sensor and the laser sensor. The present embodiment is described with a monocular camera as an example, but those skilled in the art will understand that a depth camera or the like can also be used as the camera of the mobile electronic device 100. The mobile electronic device 100 follows the user through its own wireless positioning receiver. For example, during first use the user of the second mobile electronic device 140 interacts with the mobile electronic device 100 through the mobile phone APP to build the indoor map. A group of wireless signal transmitters placed at fixed indoor positions, for example UWB transmitters, serve as reference points; the mobile phone APP of the second mobile electronic device 140 and the wireless signal module in the mobile electronic device 100 read the signal strength (RSS) to each signal source to determine the indoor positions of the user of the second mobile electronic device 140 and of the mobile electronic device 100. The motion module 110 of the mobile electronic device 100 then completes user following according to the real-time position information (phone and robot positions) sent by the intelligent charging pile.
The first camera 112 is configured to capture a plurality of images while the motion module 110 is moving; the plurality of images contain feature information and the corresponding shooting position information. For example, map building is completed by the robot's monocular camera during following. While following, the mobile electronic device 100 uses the first camera 112, for example a monocular camera, to photograph the entire indoor layout, and sends the captured images containing a large number of features, together with their corresponding shooting positions and the follow-path coordinates of the mobile electronic device 100, in real time to the memory 116 over the local wireless communication network (WIFI, Bluetooth, ZigBee, etc.). In Fig. 1, the memory 116 is shown as included in the mobile electronic device 100. Alternatively, the memory 116 can be included in the intelligent charging pile 180, i.e. in the cloud.
The image processor 104 is communicatively connected to the first camera 112 and is configured to stitch the plurality of images together, extract the feature information and shooting position information in the images, and generate the image map. For example, according to the height and the intrinsic and extrinsic parameters of the first camera 112 of the mobile electronic device 100, the image processor 104 stitches the large number of images captured by the first camera 112 into a map, performs feature selection and extraction (for example with the SIFT or SURF algorithm), adds the position information of the feature points, and thereby generates the indoor image map information (containing a large number of image feature points); the processed image map information is then stored in the memory 116. The intrinsic parameters of a camera are the parameters related to the camera's own characteristics, such as the lens focal length and pixel size; the extrinsic parameters are the camera's parameters in the world coordinate system (the indoor actual coordinate system of the charging pile), such as the camera's position, rotation direction and angle. A photo taken by the camera has its own camera coordinate system, so the camera's intrinsic and extrinsic parameters are needed to convert between coordinate systems.
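The role of the intrinsic and extrinsic parameters can be illustrated with the standard pinhole projection below; the matrix entries and mounting values are placeholders, not calibrated values for this device.

```python
import numpy as np
import cv2

# Illustrative intrinsic matrix: focal lengths and principal point in pixels.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

# Illustrative extrinsics: camera rotation (Rodrigues vector) and translation
# expressed in the charging-pile (world) coordinate system.
rvec = np.zeros(3)
tvec = np.array([0.0, 0.0, 0.3])            # e.g. camera 0.3 m from the origin

world_point = np.array([[1.0, 0.5, 0.0]])   # a point on the floor plane
pixel, _ = cv2.projectPoints(world_point, rvec, tvec, K, distCoeffs=None)
print(pixel.ravel())                        # where that world point appears in the image
```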
Mode two: the mobile electronic device 100 (the robot) includes a camera and can display a black-and-white camera calibration checkerboard; the user of the second mobile electronic device 140 does not need to wear a positioning receiver.
Optionally, in another embodiment, the mobile electronic device 100 also includes a display screen 118 and is configured to operate in a map-building mode, and the second mobile electronic device 140 includes a second camera 144. The first wireless signal transceiver 102 is communicatively connected to a plurality of reference radio signal sources and is configured to determine the position of the mobile electronic device 100 from the signal strengths obtained from the plurality of reference radio signal sources.
The first camera 112 is configured to detect the position of the second mobile electronic device 140. Optionally, the mobile electronic device 100 also includes an ultrasonic sensor and a laser sensor, which can detect the distance between the mobile electronic device 100 and the second mobile electronic device 140.
The motion module 110 is configured to follow the motion of the second mobile electronic device 140 according to the positions of the mobile electronic device 100 and the second mobile electronic device 140. For example, during first use the user of the second mobile electronic device 140 interacts with the mobile electronic device 100 through the mobile phone APP to build the indoor map. A group of wireless signal transmitters (UWB, etc.) placed at fixed indoor positions serve as reference points, and the first wireless signal transceiver 102 in the mobile electronic device 100 reads the signal strength (RSS) to each signal source to determine the indoor position of the mobile electronic device 100. The first camera 112 of the mobile electronic device 100, for example a monocular camera, together with the ultrasonic sensor and laser sensor 114, locates and follows the user of the second mobile electronic device 140. For example, the user of the second mobile electronic device 140 can set a following distance through the mobile phone APP, so that the mobile electronic device 100 adjusts its distance and angle to the second mobile electronic device 140 according to that following distance and the angle to the second mobile electronic device 140 measured in real time. While following, the mobile electronic device 100 sends the follow-path coordinates to the intelligent charging pile 180 in real time.
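The follow behaviour can be sketched as a simple proportional controller that drives the measured distance toward the user-set following distance and steers toward the measured bearing; the gains, speed limit and function name are illustrative assumptions rather than values from the patent.

```python
def follow_step(measured_distance, measured_angle, target_distance,
                k_lin=0.8, k_ang=1.5, max_speed=0.4):
    """Return (linear_velocity, angular_velocity) commands for the motion module.
    measured_angle is the bearing to the user in radians (0 = straight ahead)."""
    linear = k_lin * (measured_distance - target_distance)
    linear = max(-max_speed, min(max_speed, linear))   # clamp forward/backward speed
    angular = k_ang * measured_angle                   # turn toward the user
    return linear, angular
```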
In addition, the display screen 118 of the mobile electronic device 100 is configured to display, for example, a black-and-white checkerboard. The image processor 104 is communicatively connected to the second camera 144 and is configured to receive the plurality of images captured by the second camera 144 while the motion module 110 is moving. For example, the image processor 104 can receive the images captured by the second camera 144 through the first wireless signal transceiver 102 and the second wireless signal transceiver 142. The plurality of images include images of the display screen 118 of the mobile electronic device 100 showing the black-and-white checkerboard. The image processor 104 is further configured to stitch the plurality of images together, extract the feature information and shooting position information in the images, and generate the image map. In this mode the user of the second mobile electronic device 140 does not need to wear a positioning receiver, so the extrinsic parameters of the camera of the second mobile device 140, for example the phone camera, must be obtained through camera calibration using a calibration image. The calibration image is a checkerboard of alternating black and white rectangles, as shown in Fig. 5.
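Calibrating the phone camera from photos of the displayed checkerboard can be done with OpenCV's standard calibration routine, as in the following sketch; the board dimensions and square size are assumptions, since the patent does not specify them.

```python
import numpy as np
import cv2

def calibrate_from_checkerboard(images_gray, board_cols=9, board_rows=6, square_m=0.025):
    """images_gray: grayscale photos that each contain the displayed checkerboard."""
    # 3D corner coordinates of the board in its own plane (Z = 0).
    objp = np.zeros((board_rows * board_cols, 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_cols, 0:board_rows].T.reshape(-1, 2) * square_m

    obj_pts, img_pts = [], []
    for img in images_gray:
        found, corners = cv2.findChessboardCorners(img, (board_cols, board_rows))
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)

    # Returns the intrinsic matrix, distortion coefficients and, per image, the
    # rotation/translation of the board relative to the camera (the extrinsics).
    ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, images_gray[0].shape[::-1], None, None)
    return K, dist, rvecs, tvecs
```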
For example, the mobile electronic device 100, i.e. the robot, includes the first camera 112, for example a monocular camera, and the display screen 118 that can display the black-and-white camera calibration checkerboard. The user does not need to wear a wireless positioning receiver bracelet or carry a phone with a wireless positioning receiver peripheral; the mobile electronic device 100 follows the user by vision, and the user of the second mobile electronic device 140 completes map building by taking photos with the mobile phone APP. For example, on entering each room, the user of the second mobile electronic device 140 starts the room map-building function in the mobile phone APP, and the LCD screen 118 of the mobile electronic device 100 then displays the classic black-and-white checkerboard used for camera calibration. At the same time, the mobile electronic device 100 sends its current coordinates and orientation information to the positioning module 106. The user of the second mobile electronic device 140 then photographs the room environment with the mobile phone APP; each photo must include the black-and-white checkerboard on the LCD screen of the mobile electronic device 100. The user of the second mobile electronic device 140 takes several photos according to the room layout (each photo must capture the black-and-white checkerboard on the robot's LCD screen) and, through the mobile phone APP, sends the captured images containing the room environment and the mobile electronic device 100, for example the robot 100, to the memory 116 over the local wireless communication network (WIFI, Bluetooth, ZigBee, etc.). According to the position and orientation information of the mobile electronic device 100, for example the robot, at that time and the height and intrinsic and extrinsic parameters of the camera 112, the image processor 104 stitches the large number of images taken by the user of the second mobile electronic device 140 into a map, performs feature selection and extraction, adds the position information of the feature points, and generates the indoor image feature point map information; the processed image map information is then stored in the memory 116.
Mode three: the mobile electronic device 100 (the robot) does not include a camera, and the user of the second mobile electronic device 140 wears a positioning receiver.
Optionally, in another embodiment, the second mobile electronic device 140 also includes a second wireless signal transceiver 142 and a second camera 144. The second wireless signal transceiver 142 is communicatively connected to a plurality of reference radio signal sources and is configured to determine the position of the second mobile electronic device 140 from the signal strengths obtained from the plurality of reference radio signal sources. The second camera 144 is configured to capture a plurality of images of the task site. The image processor 104 is communicatively connected to the second camera 144 and is configured to stitch the plurality of images together, extract the feature information and shooting position information in the images, and generate the image map.
For example, in this embodiment the mobile electronic device 100, for example the robot, does not include a monocular camera and does not follow the user of the second mobile electronic device 140. The user of the second mobile electronic device 140 wears a wireless positioning receiver bracelet, or carries a phone equipped with a wireless positioning receiver peripheral, and completes indoor map building with the mobile phone APP. For example, during first use the user of the second mobile electronic device 140 builds the indoor map through the mobile phone APP together with the wireless positioning receiver bracelet worn by the user or the wireless positioning receiver peripheral of the phone. The reference radio signal sources (UWB, etc.) placed at fixed indoor positions serve as reference points, and the wireless signal transceiver 142 in the second mobile electronic device 140 reads the Received Signal Strength (RSS) to each reference radio signal source to determine the indoor position of the user of the second mobile electronic device 140. On entering each room, the user of the second mobile electronic device 140 starts the room map builder in the mobile phone APP and photographs the room environment with the APP, for example taking several photos according to the room layout. For each shot, the mobile phone APP of the second mobile electronic device 140 records the pose information of the second camera 144 and the position of the second mobile electronic device 140, for example the phone's height above the ground and its indoor position, recorded by the second wireless signal transceiver 142, and sends them to the memory 116 over the local wireless communication network (WIFI, Bluetooth, ZigBee, etc.). According to the intrinsic and extrinsic parameter information of the second camera 144 and the pose, height and position information at the time of shooting, the image processor 104 stitches the large number of captured images into a map, performs feature selection and extraction, adds the position information of the feature points, and generates the indoor image feature point map information; the processed image map information is then stored in the memory 116.
Alternatively or additionally, the mobile electronic device 100, for example the robot 100, also includes an encoder and an inertial measurement unit (IMU) to assist the first camera 112 in obtaining the position and attitude of the mobile electronic device 100, for example the robot. For example, when the robot is occluded and not in the line of sight of the first camera 112, the encoder and the IMU can still provide the robot's position and attitude. The encoder can serve as an odometer: by recording the rotation of the robot's wheels, it calculates the trajectory the robot has travelled.
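The encoder-as-odometer idea can be sketched for an assumed differential-drive robot as follows; the wheel radius, wheel base and ticks-per-revolution values are illustrative, not taken from the patent.

```python
import math

def update_odometry(x, y, theta, left_ticks, right_ticks,
                    ticks_per_rev=360, wheel_radius=0.035, wheel_base=0.23):
    """Integrate one pair of encoder readings into the robot pose (x, y, theta)."""
    per_tick = 2 * math.pi * wheel_radius / ticks_per_rev
    d_left, d_right = left_ticks * per_tick, right_ticks * per_tick
    d_center = (d_left + d_right) / 2.0          # distance travelled by the robot centre
    d_theta = (d_right - d_left) / wheel_base    # change of heading

    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta
```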
Alternatively or additionally, the mobile electronic device 100 can also include a sensor 114, which sends information about obstacles around the mobile electronic device 100 to the motion module 110. The motion module 110 is further configured to adjust the motion direction of the mobile electronic device 100 to avoid the obstacles. It will be appreciated that, because they are installed at different heights, the first camera 112 on the mobile electronic device 100 and the sensor 114 on the mobile electronic device 100 may capture different obstacle information, since occlusion can occur. The first camera 112 can change its viewing direction by rotating, pitching and the like to obtain a wider visual range. In addition, the sensor 114 may be installed at a relatively low position, which is likely to be a blind area of the first camera 112; when an object is not within the field of view of the first camera 112, avoidance must rely on these traditional sensors 114. Optionally, the camera 112 can obtain obstacle information and combine it with the information from the ultrasonic and laser sensors 114: the images obtained by the monocular camera 112 are used for object recognition, and the ultrasonic and laser sensors 114 are used for ranging.
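A minimal sketch of how readings from the low-mounted range sensors might override steering when an obstacle sits in the camera's blind zone; the threshold, gain and function name are illustrative assumptions.

```python
def avoid_step(ultrasonic_m, laser_m, obstacle_bearing, safe_m=0.25, k_turn=1.2):
    """Return an angular-velocity correction for the motion module.
    obstacle_bearing: bearing (radians) of the nearest detected obstacle, or None."""
    nearest = min(ultrasonic_m, laser_m)
    if nearest < safe_m and obstacle_bearing is not None:
        return -k_turn * obstacle_bearing   # steer away from the side of the obstacle
    return 0.0                              # no correction; follow the planned path
```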
Alternatively or additionally, the sensor 114 includes an ultrasonic sensor and/or a laser sensor. The first camera 112 and the sensor 114 can assist each other. For example, when there is occlusion, the mobile electronic device 100 needs to rely on its own laser sensor, ultrasonic sensor 114 and the like to avoid obstacles in the occluded part.
For example, the laser sensor and ultrasonic sensor carried by the mobile electronic device 100 detect the static and dynamic environment around the mobile electronic device 100, helping it avoid static and dynamic obstacles and adjust to the optimal path.
Fig. 3 shows a schematic diagram of the system in which the mobile electronic device and the second mobile electronic device according to one embodiment of the utility model are located. Alternatively or additionally, the mobile electronic device 300 also includes a charging pile 380, and the charging pile 380 can include an image processor 386, a path planning module 388, a memory 384 (for example an internal data storage module), a first wireless transmitter 381 (for example UWB) and a second wireless signal receiver 382, for example implemented with WIFI. The body of the mobile electronic device 300, for example the robot, can include a first wireless signal receiver 302, for example implemented with UWB, a first camera 310, a map positioning module 304, an obstacle avoidance module 306, a motion module 308, sensors 314 and 316, an encoder 318 and a second wireless signal transmitter 320, for example implemented with WIFI. The second mobile electronic device 340, for example a mobile phone, also includes a mobile phone APP and a second camera. Alternatively, at least one of the image processor 386, the path planning module 388 and the memory 384 can also be included in the body of the mobile electronic device 300. As shown in Fig. 3, the first wireless transmitter 381 in the intelligent charging pile 380 is communicatively connected to the first wireless signal receiver 302 in the sweeping robot 300, and the second wireless signal transmitter 320 in the sweeping robot 300 is communicatively connected to the second wireless signal receiver 382 in the intelligent charging pile 380. The path planning module 388 in the intelligent charging pile 380 is communicatively connected to the motion module 308 in the sweeping robot 300. The mobile phone 340, as the second mobile electronic device, is communicatively connected to the second wireless signal receiver 382 in the intelligent charging pile 380. The path planning module 388 sends the generated path to the motion module 308 for execution.
Inside the sweeping robot 300, the first wireless signal receiver 302 is communicatively connected to the map positioning module 304. The map positioning module 304 is in turn communicatively connected to the second wireless signal transmitter 320 and the motion module 308. The first camera 310 is communicatively connected to the second wireless signal transmitter 320. The ultrasonic sensor 314, the laser sensor 316 and the encoder 318 are communicatively connected to the obstacle avoidance module 306. The obstacle avoidance module 306 is communicatively connected to the motion module 308. There is also information exchange between the positioning module 304 and the motion module 308: when executing the planned path, the motion module 308 needs the position information input by the positioning module 304.
Inside the intelligent charging pile 380, the second wireless signal receiver 382 is communicatively connected to the memory 384. The memory 384 is communicatively connected to the image processor 386. The image processor 386 is communicatively connected to the path planning module 388.
Fig. 4 shows a flow chart of a method 400 in a mobile electronic device according to one embodiment of the utility model. The method 400 is performed in a mobile electronic device and is for processing a task in a task area, where the mobile electronic device includes a first wireless signal transceiver, an image processor, a positioning module, a path planning module and a motion module. The method 400 includes: in block 410, obtaining, by the first wireless signal transceiver communicatively connected to a second mobile electronic device, a photo of the task site taken by a user of the second mobile electronic device and a selected area marked on the photo; in block 420, extracting, by the image processor communicatively connected to the first wireless signal transceiver, feature information from the photo containing the selected area, and determining the actual coordinate range corresponding to the selected area in the photo by comparing the extracted feature information with the feature information of a stored image map containing position information; in block 430, recording, by the positioning module communicatively connected to the image processor, the distance between the current position of the mobile electronic device and the actual coordinate range corresponding to the selected area; in block 440, generating, by the path planning module communicatively connected to the image processor, a path planning scheme according to the actual coordinate range corresponding to the selected area; and in block 450, moving, by the motion module communicatively connected to the path planning module, according to the path planning scheme.
Alternatively or additionally, the mobile electronic device also includes a first camera, the second mobile electronic device also includes a second wireless signal transceiver, and the mobile electronic device is configured to operate in a map-building mode. The method 400 also includes (not shown in the figure): determining, by the first wireless signal transceiver and the second wireless signal transceiver each communicatively connected to a plurality of reference radio signal sources, the positions of the mobile electronic device and the second mobile electronic device from the signal strengths obtained from the plurality of reference radio signal sources; following, by the motion module, the motion of the second mobile electronic device according to the positions of the mobile electronic device and the second mobile electronic device; capturing, by the first camera, a plurality of images while the motion module is moving, the plurality of images containing feature information and corresponding shooting position information; and stitching the plurality of images, by the image processor communicatively connected to the first camera, to extract the feature information and shooting position information in the plurality of images and generate the image map.
Alternatively or additionally, the mobile electronic device also includes a display screen, the mobile electronic device is configured to operate in a map-building mode, and the second mobile electronic device includes a second camera. The method 400 also includes (not shown in the figure): determining, by the first wireless signal transceiver communicatively connected to a plurality of reference radio signal sources, the position of the mobile electronic device from the signal strengths obtained from the plurality of reference radio signal sources; following, by the motion module, the motion of the second mobile electronic device according to the positions of the mobile electronic device and the second mobile electronic device; displaying a black-and-white checkerboard on the display screen of the mobile electronic device; receiving, by the image processor communicatively connected to the second camera, the plurality of images captured by the second camera while the motion module is moving, wherein the plurality of images include images of the display screen of the mobile electronic device showing the black-and-white checkerboard; and stitching the plurality of images, by the image processor, to extract the feature information and shooting position information in the plurality of images and generate the image map.
Alternatively or additionally, the second mobile electronic device also includes a second wireless signal transceiver and a second camera. The method 400 also includes (not shown in the figure): determining, by the second wireless signal transceiver communicatively connected to a plurality of reference radio signal sources, the position of the second mobile electronic device from the signal strengths obtained from the plurality of reference radio signal sources; capturing a plurality of images of the task site by the second camera; and stitching the plurality of images, by the image processor communicatively connected to the second camera, to extract the feature information and shooting position information in the plurality of images and generate the image map.
Alternatively or additionally, the method 400 also includes (not shown in the figure) planning, by the path planning module, a path for the selected area using a grid-based spanning tree path planning algorithm.
Alternatively or additionally, the method 400 also includes (not shown in the figure) assisting, by an encoder and an inertial measurement unit communicatively connected to the image processor, the first camera in obtaining the position and attitude of the mobile electronic device.
Alternatively or additionally, the mobile electronic device also includes a charging pile, wherein the charging pile includes the image processor, the path planning module and the positioning module.
Alternatively or additionally, the mobile electronic device can also include a sensor, and the method 400 also includes (not shown in the figure) sending, by the sensor, information about obstacles around the mobile electronic device to the motion module, and adjusting, by the motion module, the motion direction of the mobile electronic device to avoid the obstacles.
Alternatively or additionally, the sensor includes an ultrasonic sensor and/or a laser sensor.
In the description above, the utility model has been described with reference to specific exemplary embodiments; however, it should be understood that various modifications and variations can be made without departing from the scope of the utility model described herein. The specification and drawings should be regarded as illustrative rather than restrictive, and all such modifications are intended to be included within the scope of the utility model. Accordingly, the scope of the utility model should be determined by the general embodiments herein and their legal equivalents, rather than only by the specific embodiments described above. For example, the steps in any method or process embodiment may be performed in any order and are not limited to the explicit order presented in a particular embodiment. In addition, the components and/or elements in any device embodiment may be assembled or otherwise operatively configured in various arrangements to produce results substantially the same as those of the utility model, and are therefore not limited to the specific configurations in the specific embodiments.
Benefits, other advantages and solutions to problems have been described above with regard to specific embodiments; however, any benefit, advantage or solution to a problem, or any element that may cause any particular benefit, advantage or solution to occur or become more pronounced, is not to be construed as a critical, required or essential feature or component.
As used herein, the terms "comprising", "including" or any variation thereof are intended to refer to non-exclusive inclusion, so that a process, method, article, composition or device that includes a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, composition or device. Except where specifically stated otherwise, other combinations and/or modifications of the above structures, arrangements, applications, proportions, elements, materials or components used in the practice of the utility model may be varied or otherwise adapted to particular environments, manufacturing specifications, design parameters or other operating requirements without departing from its basic principles.
Although the utility model has been described herein with reference to certain preferred embodiments, those skilled in the art will readily appreciate that other applications may be substituted for those described herein without departing from the spirit and scope of the utility model. Accordingly, the utility model is limited only by the following claims.

Claims (9)

1. A mobile electronic device for processing a task in a task area, including a first wireless signal transceiver, an image processor, a positioning module, a path planning module and a motion module, wherein:
the first wireless signal transceiver is communicatively connected to a second mobile electronic device and is configured to obtain a photo of the task site taken by a user of the second mobile electronic device and a selected area marked on the photo;
the image processor is communicatively connected to the first wireless signal transceiver and is configured to extract feature information from the photo containing the selected area, and to determine the actual coordinate range corresponding to the selected area in the photo by comparing the extracted feature information with the feature information of a stored image map containing position information;
the positioning module is communicatively connected to the image processor and is configured to record the distance between the current position of the mobile electronic device and the actual coordinate range corresponding to the selected area;
the path planning module is communicatively connected to the image processor and is configured to generate a path planning scheme according to the actual coordinate range corresponding to the selected area;
the motion module is communicatively connected to the path planning module and is configured to move according to the path planning scheme.
2. The mobile electronic device according to claim 1, further including a first camera, wherein the second mobile electronic device also includes a second wireless signal transceiver and the mobile electronic device is configured to operate in a map-building mode,
the first wireless signal transceiver and the second wireless signal transceiver are each communicatively connected to a plurality of reference radio signal sources and are configured to determine the positions of the mobile electronic device and the second mobile electronic device from the signal strengths obtained from the plurality of reference radio signal sources;
the motion module is configured to follow the motion of the second mobile electronic device according to the positions of the mobile electronic device and the second mobile electronic device;
the first camera is configured to capture a plurality of images while the motion module is moving, the plurality of images containing feature information and corresponding shooting position information,
the image processor is communicatively connected to the first camera and is configured to stitch the plurality of images together, extract the feature information and shooting position information in the plurality of images, and generate the image map.
3. The mobile electronic device according to claim 1, further including a display screen, wherein the mobile electronic device is configured to operate in a map-building mode, the mobile electronic device includes a first camera, and the second mobile electronic device includes a second camera,
the first wireless signal transceiver is communicatively connected to a plurality of reference radio signal sources and is configured to determine the position of the mobile electronic device from the signal strengths obtained from the plurality of reference radio signal sources;
the first camera is configured to detect the position of the second mobile electronic device;
the motion module is configured to follow the motion of the second mobile electronic device according to the positions of the mobile electronic device and the second mobile electronic device;
the display screen of the mobile electronic device is configured to display a black-and-white checkerboard;
the image processor is communicatively connected to the second camera and is configured to receive the plurality of images captured by the second camera while the motion module is moving, wherein the plurality of images include images of the display screen of the mobile electronic device showing the black-and-white checkerboard; the image processor is further configured to stitch the plurality of images together, extract the feature information and shooting position information in the plurality of images, and generate the image map.
4. The mobile electronic device according to claim 1, wherein the second mobile electronic device also includes a second wireless signal transceiver and a second camera,
wherein the second wireless signal transceiver is communicatively connected to a plurality of reference radio signal sources and is configured to determine the position of the second mobile electronic device from the signal strengths obtained from the plurality of reference radio signal sources;
the second camera is configured to capture a plurality of images of the task site,
the image processor is communicatively connected to the second camera and is configured to stitch the plurality of images together, extract the feature information and shooting position information in the plurality of images, and generate the image map.
5. The mobile electronic device according to claim 1, wherein the path planning module is further configured to plan a path for the selected area using a grid-based spanning tree path planning algorithm.
6. The mobile electronic device according to claim 2, further including an encoder and an inertial measurement unit communicatively connected to the image processor and configured to assist the camera in obtaining the position and attitude of the mobile electronic device.
7. The mobile electronic device according to any one of claims 1-6, further including a charging pile, wherein the charging pile includes the image processor, the path planning module and the positioning module.
8. The mobile electronic device according to any one of claims 1-6, further including a sensor, wherein the sensor sends information about obstacles around the mobile electronic device to the motion module, and the motion module is further configured to adjust the motion direction of the mobile electronic device to avoid the obstacles.
9. The mobile electronic device according to claim 8, wherein the sensor includes an ultrasonic sensor and/or a laser sensor.
CN201720917099.0U 2017-07-26 2017-07-26 A mobile electronic device for processing a task in a task area Expired - Fee Related CN207115193U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201720917099.0U CN207115193U (en) 2017-07-26 2017-07-26 A mobile electronic device for processing a task in a task area

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201720917099.0U CN207115193U (en) 2017-07-26 2017-07-26 A mobile electronic device for processing a task in a task area

Publications (1)

Publication Number Publication Date
CN207115193U true CN207115193U (en) 2018-03-16

Family

ID=61579736

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201720917099.0U Expired - Fee Related CN207115193U (en) 2017-07-26 2017-07-26 A mobile electronic device for processing a task in a task area

Country Status (1)

Country Link
CN (1) CN207115193U (en)


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019001237A1 (en) * 2017-06-30 2019-01-03 炬大科技有限公司 Mobile electronic device, and method in mobile electronic device
CN108459597A (en) * 2017-07-26 2018-08-28 炬大科技有限公司 A mobile electronic device and method for processing a task in a task area
CN108459597B (en) * 2017-07-26 2024-02-23 炬大科技有限公司 Mobile electronic device and method for processing tasks in task area
WO2019019819A1 (en) * 2017-07-26 2019-01-31 炬大科技有限公司 Mobile electronic device and method for processing tasks in task region
CN113467448A (en) * 2018-06-07 2021-10-01 科沃斯机器人股份有限公司 Fixed-point working method, self-moving robot and storage medium
CN108827309A (en) * 2018-06-29 2018-11-16 炬大科技有限公司 A kind of robot path planning method and the dust catcher with it
CN112739244A (en) * 2018-07-13 2021-04-30 美国iRobot公司 Mobile robot cleaning system
US11669086B2 (en) 2018-07-13 2023-06-06 Irobot Corporation Mobile robot cleaning system
CN112739244B (en) * 2018-07-13 2024-02-09 美国iRobot公司 Mobile robot cleaning system
CN108958253A (en) * 2018-07-19 2018-12-07 北京小米移动软件有限公司 The control method and device of sweeping robot
TWI695243B (en) * 2019-01-25 2020-06-01 孟菁 Obstacle avoiding guidance system
CN114680741A (en) * 2020-12-30 2022-07-01 Oppo广东移动通信有限公司 Sweeping control method and device, storage medium and sweeping robot
CN114680741B (en) * 2020-12-30 2023-08-11 Oppo广东移动通信有限公司 Sweeping control method and device, storage medium and sweeping robot

Similar Documents

Publication Publication Date Title
CN207115193U (en) A kind of mobile electronic device for being used to handle the task of mission area
CN108459597A (en) A kind of mobile electronic device and method for handling the task of mission area
CN108762245B (en) Data fusion method and related equipment
US10482619B2 (en) Method and apparatus for combining data to construct a floor plan
CN110446159B (en) System and method for accurate positioning and autonomous navigation of indoor unmanned aerial vehicle
CN207488823U (en) A kind of mobile electronic device
CN207067803U (en) A kind of mobile electronic device for being used to handle the task of mission area
CN109901590B (en) Recharging control method of desktop robot
WO2019001237A1 (en) Mobile electronic device, and method in mobile electronic device
US11450102B2 (en) System and method for spatially mapping smart objects within augmented reality scenes
WO2011005783A2 (en) Image-based surface tracking
KR20220028042A (en) Pose determination method, apparatus, electronic device, storage medium and program
TW202115366A (en) System and method for probabilistic multi-robot slam
US10949579B2 (en) Method and apparatus for enhanced position and orientation determination
US20190156568A1 (en) System and method of scanning an environment and generating two dimensional images of the environment
CN112785682A (en) Model generation method, model reconstruction method and device
WO2018228258A1 (en) Mobile electronic device and method therein
Sohn et al. Localization system for mobile robot using wireless communication with IR landmark
JP7166446B2 (en) System and method for estimating pose of robot, robot, and storage medium
CN206833252U (en) A kind of mobile electronic device
Iqbal et al. A unified SLAM solution using partial 3D structure
US11009887B2 (en) Systems and methods for remote visual inspection of a closed space
Rekleitis et al. Automated calibration of a camera sensor network
CN108459598A (en) A kind of mobile electronic device and method for handling the task of mission area
Liu et al. Self-landmarking for robotics applications

Legal Events

Date Code Title Description
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180316

Termination date: 20200726

CF01 Termination of patent right due to non-payment of annual fee