CN206833252U - A mobile electronic device - Google Patents

A mobile electronic device

Info

Publication number
CN206833252U
Authority
CN
China
Prior art keywords
electronic device
mobile electronic
task area
image
locating module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201720702958.4U
Other languages
Chinese (zh)
Inventor
潘景良
陈灼
李腾
陈嘉宏
高鲁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ju Da Technology Co Ltd
Original Assignee
Ju Da Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ju Da Technology Co Ltd
Priority to CN201720702958.4U
Application granted
Publication of CN206833252U
Status: Expired - Fee Related
Anticipated expiration

Abstract

A first mobile electronic device includes a wireless signal transceiver, an image processor, a locating module and a motion module. The wireless signal transceiver obtains an image, captured by a camera, of a task area containing the first mobile electronic device. The image processor identifies the characteristic information of the first mobile electronic device in the image of the task area and establishes a coordinate system for the image. The locating module identifies the position and contour of the first mobile electronic device by comparing, on the coordinate system, the identified characteristic information of the first mobile electronic device with the coordinate origin, and draws a map of the task area. The locating module is also communicatively connected to a second mobile electronic device; it sends the map to the second mobile electronic device, and the second mobile electronic device receives a task location set by the user on the map. The motion module receives the task location from the second mobile electronic device, plans a path according to the task location and the identified position and contour of the first mobile electronic device, and moves along the path.

Description

A mobile electronic device
Technical field
The utility model relates to the field of electronic devices, and in particular to the field of intelligent robot systems.
Background art
A traditional sweeping robot either localizes itself on a scanned map and moves autonomously, or wanders randomly and changes direction on collision, cleaning the floor as it goes. Because its mapping and localization technology is immature or inaccurate, a traditional sweeping robot cannot fully judge the complicated state of the floor during operation and easily loses track of its position and heading. In addition, some models have no localization capability at all and can only change direction through the physics of collision, which can damage household articles or the robot itself, cause personal injury, and otherwise disturb the user.
Utility model content
The mobile electronic device system described in the embodiments of the utility model, such as a robot system, uses a fixed camera installed indoors to perform localization and mapping, determines the task area from user feedback, plans a path, and localizes the robot in real time, so that the robot walks along the planned path. This solves the problem that a robot cannot judge the floor condition, its own position, or the optimal moving route. The fixed camera replaces the robot's tracking sensors; the repetitive labor of the robot replaces human labor, saving the cost of developing and equipping an intelligent robot and the cost invested in the sweeping and dust-collecting mechanism. Moreover, because the camera is fixed, the accumulated error introduced by a camera mounted inside the robot is eliminated, so accurate localization and mapping can be achieved.
The robot system described in the embodiments provides accurate localization and mapping, which improves the working efficiency of the robot while reducing the workload of the user.
A first mobile electronic device according to one embodiment includes a wireless signal transceiver, an image processor, a locating module and a motion module, wherein: the wireless signal transceiver is communicatively connected to a camera installed indoors and is configured to obtain an image, captured by the camera, of a task area containing the first mobile electronic device; the image processor is communicatively connected to the wireless signal transceiver and is configured to identify the characteristic information of the first mobile electronic device in the image of the task area; the locating module is communicatively connected to the image processor and is configured to identify the position and contour of the first mobile electronic device by establishing a coordinate system and comparing the identified characteristic information of the first mobile electronic device with the coordinate origin, and to draw a map of the task area; the locating module is also communicatively connected to a second mobile electronic device and is configured to send the map to the second mobile electronic device, the second mobile electronic device receiving a task location set by the user on the map; and the motion module is communicatively connected to the locating module and the second mobile electronic device and is configured to receive the task location from the second mobile electronic device, plan a path according to the task location and the identified position and contour of the first mobile electronic device, and move along the path.
Alternatively or additionally, the image of the task area also contains at least one first obstacle. The image processor is further configured to identify the characteristic information of the at least one first obstacle in the image of the task area; the locating module is further configured to compare the characteristic information of the at least one first obstacle with the coordinate origin and identify the position and contour of the at least one obstacle; and the motion module is further configured to plan a path according to the task location, the located position of the first mobile electronic device and the identified position and contour of the at least one first obstacle, and to move along the path.
Alternatively or additionally, while the motion module moves, the camera captures the image of the task area containing the first mobile electronic device in real time; the image processor and the locating module are further used to dynamically determine the characteristic information, position and contour of the first mobile electronic device.
Alternatively or additionally, the camera is mounted vertically on the ceiling of the task area.
Alternatively or additionally, the first mobile electronic device further includes a charging pile, and the charging pile includes the image processor and the locating module.
Alternatively or additionally, the first mobile electronic device may further include a sensor that sends information about a second obstacle around the first mobile electronic device to the motion module, and the motion module is further configured to adjust the heading of the first mobile electronic device to avoid the second obstacle.
Alternatively or additionally, the sensor of the first mobile electronic device includes an ultrasonic sensor and/or a laser sensor.
According to another aspect of the utility model, a method in a first mobile electronic device is provided, the first mobile electronic device including a wireless signal transceiver, an image processor, a locating module and a motion module. The method includes: obtaining, through the wireless signal transceiver communicatively connected to at least one camera installed indoors, an image, captured by the at least one camera, of a task area containing the first mobile electronic device; identifying, by the image processor communicatively connected to the wireless signal transceiver, the characteristic information of the first mobile electronic device in the image of the task area, establishing a coordinate system for the image, and assigning a corresponding coordinate value to each point in the task area; identifying, by the locating module communicatively connected to the image processor, the position and contour of the first mobile electronic device by comparing, on the established coordinate system, the identified characteristic information of the first mobile electronic device with the coordinate origin, and drawing a map of the task area; sending, by the locating module communicatively connected to a second mobile electronic device, the map to the second mobile electronic device, the second mobile electronic device receiving a task location set by the user on the map; and receiving, by the motion module communicatively connected to the locating module and the second mobile electronic device, the task location from the second mobile electronic device, planning a path according to the task location and the identified position and contour of the first mobile electronic device, and moving along the path.
Brief description of the drawings
A more complete understanding of the utility model may be obtained by referring to the detailed description in connection with the accompanying drawings, in which like reference numerals refer to like parts.
Fig. 1 shows a schematic diagram of the system in which a first mobile electronic device and a second mobile electronic device are located, according to one embodiment of the utility model.
Fig. 2 shows a block diagram of the processor in the first mobile electronic device according to one embodiment of the utility model.
Figs. 3A-3D show schematic diagrams of the first mobile electronic device according to one embodiment of the utility model.
Fig. 4 shows a flow chart of a method in the first mobile electronic device according to one embodiment of the utility model.
Embodiment
Embodiment one
Fig. 1 shows a schematic diagram of the system in which a first mobile electronic device 100 and a second mobile electronic device 120 are located, according to one embodiment of the utility model.
Referring to Fig. 1, the first mobile electronic device 100 includes, but is not limited to, a sweeping robot, an industrial automation robot, a service robot, a rescue and disaster-relief robot, an underwater robot, a space robot, an autonomous vehicle, and the like. When the first mobile electronic device 100 is field equipment such as an autonomous vehicle, the camera may, for example, be carried by an unmanned aerial vehicle flying overhead. In one embodiment, the first mobile electronic device 100 itself does not include a camera.
The second mobile electronic device 120 includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a remote controller, and the like. The second mobile electronic device optionally includes an operating interface. In an optional embodiment, the second mobile electronic device is a mobile phone and the operating interface is a mobile phone app.
The form of signal transmission between the first mobile electronic device 100 and the second mobile electronic device 120 includes, but is not limited to, Bluetooth, WIFI, ZigBee, infrared, ultrasonic, UWB, and so on. In the present embodiment, the description takes WIFI as the example form of signal transmission.
The task area is the place where the first mobile electronic device 100 performs its task. For example, when the first mobile electronic device 100 is a sweeping robot, the task area is the region that the sweeping robot needs to clean. As another example, when the first mobile electronic device 100 is a rescue and disaster-relief robot, the task area is the place where the rescue robot needs to carry out rescue work.
As shown in Fig. 1, in one embodiment the first mobile electronic device 100 includes a wireless signal transceiver 102, an image processor 1040, a locating module 1042 and a motion module 106. The first mobile electronic device 100 also includes an encoder and an Inertial Measurement Unit (IMU), both of which are carried by the first mobile electronic device 100, that is, by the robot 100 itself. In addition, the first mobile electronic device 100 may also include a memory 110 and a data processor 1046; these two entities may be located in the robot 100 or inside the charging pile, and are optionally inside the charging pile. The memory 110 and the data processor 1046 store and process image information, data information, and so on. The memory 110 is communicatively connected to the processor 104. The data processor 1046 is located in the processor 104. As shown in Fig. 2, the data processor 1046 is connected to the image processor 1040 and the path planning module 1044.
The wireless signal transceiver 102 is communicatively connected to a camera 140 installed indoors and is configured to obtain an image, captured by the camera 140, of the task area containing the first mobile electronic device 100. For example, the camera 140 fixed on the ceiling shoots the indoor plane from a top-down angle. The wireless communication module in the camera 140 is connected to the wireless signal transceiver 102 in the first mobile electronic device 100 via WIFI.
The image processor 1040 is communicatively connected to the wireless signal transceiver 102 and is configured to identify the characteristic information of the first mobile electronic device 100 in the image of the task area. In addition, the image processor 1040 establishes a two-dimensional coordinate system for the plane image and assigns a corresponding coordinate value to each indoor point. The locating module 1042 is communicatively connected to the image processor 1040 and is configured to compare, on the established coordinate system, the identified characteristic information of the first mobile electronic device 100 with the coordinate origin, identify the position and contour of the first mobile electronic device 100, and draw a map of the task area.
For example, the indoor plane image captured by the overhead camera 140 is delivered to the image processor 1040 in the first mobile electronic device 100 for processing and analysis. The function of the image processor 1040 is to identify the features of the first mobile electronic device 100 in the image. Optionally, the image processor 1040 may identify these features using an algorithm based on the Scale Invariant Feature Transform (SIFT) or on Speeded Up Robust Features (SURF). When the SIFT algorithm is used, a reference image needs to be stored in the memory 110. The image processor 1040 first identifies the key points of the object in the reference image stored in the memory 110 and extracts their SIFT features; it then compares the SIFT features of each key point in the memory 110 with the SIFT features of the newly captured image and matches the features based on K-Nearest Neighbor (KNN) search, so as to recognize the object in the new image. The SURF algorithm is based on approximate 2D Haar wavelet responses, performs image convolution using integral images, employs a Hessian matrix-based measure for the detector, and uses a distribution-based descriptor. Those skilled in the art will understand that the image processor 1040 and the locating module 1042 may be included in the processor 104, which is further described below with reference to Fig. 2.
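By way of illustration only, and not as part of the utility model, a minimal Python/OpenCV sketch of the SIFT-plus-KNN matching step described above might look as follows; the file names, ratio threshold and minimum match count are assumptions made for the example.

```python
import cv2

def locate_robot_in_frame(reference_path="robot_reference.png",
                          frame_path="ceiling_frame.png",
                          ratio=0.75, min_matches=10):
    # Stored reference image of the robot and the newly captured overhead frame
    reference = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
    frame = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)

    # Extract SIFT key points and descriptors for both images
    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(reference, None)
    kp_new, des_new = sift.detectAndCompute(frame, None)

    # K-nearest-neighbour matching with Lowe's ratio test to keep distinctive matches
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn_matches = matcher.knnMatch(des_ref, des_new, k=2)
    good = [m for m, n in knn_matches if m.distance < ratio * n.distance]

    if len(good) < min_matches:
        return None  # the robot was not recognized in this frame

    # Matched pixel locations in the overhead frame; the locating module would
    # convert these into a position and contour on the map coordinate system
    return [kp_new[m.trainIdx].pt for m in good]
```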
The locating module 1042 is communicatively connected to the image processor 1040 and is configured to identify the position and contour of the first mobile electronic device 100 by establishing a coordinate system and comparing the identified characteristic information of the first mobile electronic device 100 with the coordinate origin, and to draw a map of the task area. The locating module 1042 is also communicatively connected to the second mobile electronic device 120 and is further configured to send the map to the second mobile electronic device 120, the second mobile electronic device 120 receiving a task location set by the user on the map. For example, the locating module 1042 establishes a planar rectangular coordinate system on the processed plane image, sets the location of the charging pile as the coordinate origin, assigns each point in the image a corresponding coordinate value (X, Y), and marks the position and contour of the first mobile electronic device 100, thereby producing the indoor map. The user's second mobile electronic device 120, for example a mobile phone 120, is connected to the wireless signal transceiver 102 of the first mobile electronic device 100 via WIFI. The app on the phone 120 reads the prepared indoor map, and the user sets the task location on the map through the app, that is, one or more positions to be cleaned (with corresponding X, Y coordinates); the locating module 1042 of the first mobile electronic device 100 records this position information. Meanwhile, the first mobile electronic device 100 uses the ceiling camera 140 and its internal image processor 1040 to find the image of the first mobile electronic device 100, for example the sweeping robot 100, and, together with the locating module 1042, locates the coordinates (X, Y) of the robot 100 on the map in real time.
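As an illustrative sketch only (the class and field names below are assumptions, not part of the utility model), the conversion from pixel coordinates in the overhead image to map coordinates whose origin is the charging pile could be written as:

```python
from dataclasses import dataclass

@dataclass
class MapFrame:
    origin_px: tuple         # pixel location (px, py) of the charging pile in the image
    metres_per_pixel: float  # scale factor, e.g. derived from the camera mounting height

    def to_map(self, px, py):
        """Convert a pixel coordinate to (X, Y) in metres relative to the charging pile."""
        x = (px - self.origin_px[0]) * self.metres_per_pixel
        y = (py - self.origin_px[1]) * self.metres_per_pixel
        return x, y

# Example: the robot detected at pixel (640, 380), charging pile at pixel (320, 240),
# 1 pixel = 1 cm on the floor -> the robot is at (3.2, 1.4) metres from the origin.
frame = MapFrame(origin_px=(320, 240), metres_per_pixel=0.01)
robot_xy = frame.to_map(640, 380)
```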
The motion module 106 is then communicatively connected to the locating module 1042 and the second mobile electronic device 120 and is configured to receive the task location from the second mobile electronic device 120, plan a path according to the task location and the identified position and contour of the first mobile electronic device, and move along the path.
In addition, the first mobile electronic device 100, for example the robot 100, also includes a robot encoder and an inertial measurement module (IMU), which assist the camera 140 in obtaining the position and posture of the robot. For example, when the robot is occluded and not in the camera's line of sight, the encoder and IMU can still provide the robot's position and posture. The encoder can serve as an odometer: by recording the rotation of the robot's wheels, it computes the trajectory that the robot 100 has travelled.
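Purely as an illustration of the odometer role of the encoder described above (the wheel parameters are assumptions, and the IMU correction of the heading is omitted), a dead-reckoning pose update for a differential-drive robot can be sketched as:

```python
import math

TICKS_PER_REV = 360    # encoder ticks per wheel revolution (assumed)
WHEEL_RADIUS = 0.035   # wheel radius in metres (assumed)
WHEEL_BASE = 0.23      # distance between the two drive wheels in metres (assumed)

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Dead-reckon the new pose (x, y, theta) from one interval of encoder ticks."""
    d_left = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    d_centre = (d_left + d_right) / 2.0         # distance travelled by the robot centre
    d_theta = (d_right - d_left) / WHEEL_BASE   # change in heading

    x += d_centre * math.cos(theta + d_theta / 2.0)
    y += d_centre * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2 * math.pi)
    return x, y, theta
```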
Alternatively or additionally, the image of the task area also contains at least one first obstacle. The image processor 1040 is further configured to identify the characteristic information of the at least one first obstacle in the image of the task area. As discussed above, the image processor 1040 may use the SIFT or SURF algorithm for image recognition. The locating module 1042 is further configured to compare the characteristic information of the at least one first obstacle with the coordinate origin and identify the position and contour of the at least one obstacle. The obstacle avoidance module 1048 and the motion module 106 are further configured to plan a path according to the task location selected by the user on the second mobile electronic device 120, for example the mobile phone 120, the located position of the first mobile electronic device 100 and the identified position and contour of the at least one first obstacle, and to move along the path. For example, once the cleaning position and the robot position are both confirmed, the path planning module 1044 in the first mobile electronic device 100 plans an optimal path that avoids the obstacles and delivers the path information over the wireless network to the motion module 106 of the robot 100, which then sets out along this path to the cleaning point and begins the task.
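The utility model does not prescribe a particular planning algorithm; as one illustrative possibility only, the "optimal path that avoids the obstacles" could be computed with A* search on an occupancy grid built from the camera map (0 = free cell, 1 = obstacle), sketched below.

```python
import heapq

def astar(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    open_set = [(0, start)]
    came_from, g = {}, {start: 0}

    def h(a, b):  # Manhattan-distance heuristic
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    while open_set:
        _, current = heapq.heappop(open_set)
        if current == goal:
            path = [current]
            while current in came_from:   # walk the chain back to the start
                current = came_from[current]
                path.append(current)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (current[0] + dr, current[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                tentative = g[current] + 1
                if tentative < g.get(nxt, float("inf")):
                    came_from[nxt] = current
                    g[nxt] = tentative
                    heapq.heappush(open_set, (tentative + h(nxt, goal), nxt))
    return None
```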
Alternatively or additionally, while the motion module 106 moves, the camera 140 captures the image of the task area containing the first mobile electronic device in real time. The image processor 1040 and the locating module 1042 are further used to dynamically determine the characteristic information, position and contour of the first mobile electronic device 100.
Alternatively or additionally, the first mobile electronic device 100 may further include a sensor 108 that sends information about a second obstacle around the first mobile electronic device 100 to the motion module 106. The motion module 106 is further configured to adjust the heading of the first mobile electronic device 100 to avoid the second obstacle. It will be appreciated that, because of the different mounting heights, the ceiling-mounted camera 140 captures a top-down view while the sensor on the first mobile electronic device 100 captures a front view, so the obstacles captured by the ceiling-mounted camera 140 and those detected by the robot's sensor may differ, because occlusion may occur. The motion module 106 combines the first obstacle information captured by the ceiling camera 140 with the second obstacle information perceived by the sensor to plan an obstacle-avoidance path. Alternatively or additionally, the sensor includes an ultrasonic sensor and/or a laser sensor. It will be appreciated that the first obstacle and the second obstacle may be the same or different. The camera 140 and the sensor 108 can therefore assist each other: for example, where there is occlusion, the robot 100 relies on its own laser sensor, ultrasonic sensor and the like to avoid obstacles in the occluded area.
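A minimal sketch of the combination described above, under assumed names: the occupancy grid derived from the ceiling camera and the cells currently reported blocked by the onboard ultrasonic/laser sensors are merged before the avoidance path is (re)planned.

```python
def fuse_obstacles(camera_grid, sensor_blocked_cells):
    """Return a copy of the camera grid with sensor-detected cells marked as occupied (1)."""
    fused = [row[:] for row in camera_grid]
    for r, c in sensor_blocked_cells:
        if 0 <= r < len(fused) and 0 <= c < len(fused[0]):
            fused[r][c] = 1
    return fused

# Example: an area occluded from the ceiling camera is reported blocked by the
# robot's laser sensor, so the planner is re-run on the fused grid, e.g.
#   new_path = astar(fuse_obstacles(camera_grid, {(4, 7), (4, 8)}), robot_cell, task_cell)
# using the A* sketch given earlier.
```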
For example, during the movement of the first mobile electronic device 100, for example the robot 100, the image processor 1040 and the locating module 1042 in the first mobile electronic device 100 use the ceiling camera 140 to track the robot 100 dynamically, ensuring that the position information of the robot 100 is accurate; meanwhile, the laser sensor and ultrasonic sensor carried by the robot 100 detect the static and dynamic environment around the robot, helping it avoid static and dynamic obstacles and adjust the optimal path.
Alternatively or additionally, the camera 140 is mounted vertically on the ceiling of the task area. The camera 140 may, for example, be a wide-angle camera such as a fisheye camera. When the image processor 1040 uses a photo taken by the camera 140, it therefore only needs to take the mounting height into account in order to scale the captured image and obtain the actual distance between targets in the image (for example, between the first mobile electronic device 100, such as the robot, and the first obstacle). The height information can be obtained by an altimeter or a ranging device such as a laser rangefinder or an infrared rangefinder; it can be measured by the first mobile electronic device 100 or stored in the first mobile electronic device 100 in advance.
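As an illustrative sketch of the scaling step only (the mounting height, field of view and resolution below are assumptions, and the distortion correction a fisheye lens would need is omitted), the metres-per-pixel factor for a downward-looking ceiling camera follows from the mounting height and the lens field of view:

```python
import math

def metres_per_pixel(height_m, horizontal_fov_deg, image_width_px):
    """Ground width covered by the image divided by the image width in pixels."""
    ground_width_m = 2 * height_m * math.tan(math.radians(horizontal_fov_deg) / 2)
    return ground_width_m / image_width_px

def pixel_distance_to_metres(px_dist, height_m=2.6, fov_deg=90.0, width_px=1280):
    return px_dist * metres_per_pixel(height_m, fov_deg, width_px)

# Example: the robot and an obstacle appear 300 pixels apart in the overhead image;
# with a 2.6 m ceiling, a 90-degree lens and a 1280 px image width this is
# roughly 1.22 m on the floor.
```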
Alternatively or additionally, the camera 140 may reuse an existing surveillance camera. For example, the surveillance camera 140 may be mounted on a wall, and the first mobile electronic device 100 then obtains the distance between targets using trigonometric functions and the like. To use an existing surveillance camera, the height of the camera above the ground and the lens angle must be known so that the scale of the image can be calculated and accurate dimensional values obtained.
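For the wall-mounted surveillance camera case, a minimal sketch of the trigonometric idea mentioned above (the height and angle values are assumptions): with the camera height and the downward angle to a floor target known, the horizontal distance follows from the tangent.

```python
import math

def horizontal_distance(camera_height_m, depression_angle_deg):
    """Distance along the floor from the wall to a target seen at the given depression angle."""
    return camera_height_m / math.tan(math.radians(depression_angle_deg))

# Example: a camera mounted 2.2 m up the wall sees the robot at a depression angle
# of 35 degrees -> about 2.2 / tan(35 deg) = 3.14 m from the wall.
```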
Alternatively or additionally, if the lens of a single camera has a limited viewing angle, multiple cameras mounted on the ceiling can be used at the same time to cover a wider region, or even the entire indoor space.
Alternatively or additionally, the first mobile electronic device 100 also includes a charging pile 160, and the charging pile 160 includes the processor 104, for example the image processor 1040 and the locating module 1042.
Embodiment two
Fig. 3 A-3D show the schematic diagram of the first mobile electronic device 100 according to one embodiment of the present utility model. Illustrated below so that the first mobile electronic device 100 is robot 300 as an example.
Reference picture 3A-3D, robot 300 is mainly by main body 310, wireless transceiver 320, microprocessor 330, sensor 340, and the part of charging pile 350 5 composition.User can be realized by wireless transceiver 320, microprocessor 330 and charging pile 350 Fixed point function in certain region, main body 310 is aided in complete the feature operation specified by sensor 340.
Reference picture 3A and Fig. 3 B, main body 310 are the carrier of microprocessor 330 and sensor 340, and microprocessor 330 can To assist the control unit of main body 310 to perform the function such as mobile.
Reference picture 3C and Fig. 3 D, wireless transceiver 320 can by mobile electronic devices such as mobile phone, computers, send bluetooth, The signal of communication such as WIFI, ZigBee, infrared, ultrasonic wave, row positional information can also be entered with microprocessor 330 and charging pile 350 Interactive function, and positional information is sent to microprocessor 330 in a manner of instructing.
Reference picture 3B, microprocessor 330 are located on main body 310, can receive the instruction of the transmission of wireless transceiver 320 simultaneously Enter line program processing to the instruction received, the result after processing is transmitted to the control unit of main body 310, perform corresponding appoint Business.Microprocessor 330 can also launch the signal of communication such as bluetooth, WIFI, ZigBee, infrared, ultrasonic wave to charging pile 350, real Stake charge function is now returned automatically.
Reference picture 3A, sensor 340 can detect surrounding environment in real time, realize intelligent barrier avoiding, signal of communication detection and hair The function of penetrating.
Reference picture 3D, charging pile 350 can receive the signal of the transmitting of wireless transceiver 320, while can also receive micro- place The signal of communication that reason device 330 is sent, realizes positional information interactive function, so as to guide 330 control main frame body of microprocessor 310 real Now return the function of stake charging.
Embodiment three
Fig. 4 shows a flow chart of a method in the first mobile electronic device according to one embodiment of the utility model.
Fig. 4 shows a method 400 in a first mobile electronic device, wherein the first mobile electronic device 100 includes a wireless signal transceiver 102, an image processor 1040, a locating module 1042 and a motion module 106. The method 400 includes: at block 410, obtaining, through the wireless signal transceiver 102 communicatively connected to the camera 140 installed indoors, an image, captured by the camera 140, of the task area containing the first mobile electronic device 100; at block 420, identifying, by the image processor 1040 communicatively connected to the wireless signal transceiver 102, the characteristic information of the first mobile electronic device 100 in the image of the task area; at block 430, identifying, by the locating module 1042 communicatively connected to the image processor 1040, the position and contour of the first mobile electronic device 100 by establishing a coordinate system and comparing the identified characteristic information of the first mobile electronic device 100 with the coordinate origin, and drawing a map of the task area; at block 440, sending, by the locating module 1042 communicatively connected to the second mobile electronic device, the map to the second mobile electronic device, the second mobile electronic device receiving a task location set by the user on the map; and at block 450, receiving, by the motion module 106 communicatively connected to the locating module 1042 and the second mobile electronic device, the task location from the second mobile electronic device, planning a path according to the task location and the identified position and contour of the first mobile electronic device 100, and moving along the path.
Alternatively or additionally, the image of the task area also contains at least one first obstacle, and the method 400 further includes (not shown in the figure): identifying, by the image processor 1040, the characteristic information of the at least one first obstacle in the image of the task area; comparing, by the locating module 1042, the characteristic information of the at least one first obstacle with the coordinate origin and identifying the position and contour of the at least one obstacle; and planning, by the motion module 106, a path according to the task location, the located position of the first mobile electronic device 100 and the identified position and contour of the at least one first obstacle, and moving along the path.
Alternatively or additionally, the method 400 further includes (not shown in the figure): moving by means of the motion module 106; capturing, by the camera, the image of the task area containing the first mobile electronic device 100 in real time; and dynamically determining, by the image processor 1040 and the locating module 1042, the characteristic information, position and contour of the first mobile electronic device 100.
Alternatively or additionally, the method 400 further includes (not shown in the figure): assisting, by an encoder and an inertial measurement module communicatively connected to the processor, the camera in obtaining the position and posture of the first mobile electronic device.
Alternatively or additionally, the camera is mounted vertically on the ceiling of the task area.
Alternatively or additionally, the first mobile electronic device 100 also includes a charging pile 160, and the charging pile 160 includes the image processor 1040 and the locating module 1042.
Alternatively or additionally, the first mobile electronic device 100 may further include a sensor 108, and the method 400 further includes (not shown in the figure): sending, by the sensor 108, information about a second obstacle around the first mobile electronic device 100 to the motion module 106; and adjusting, by the motion module 106, the heading of the first mobile electronic device 100 to avoid the second obstacle.
Alternatively or additionally, the sensor includes an ultrasonic sensor and/or a laser sensor.
In the description above, the utility model has been described with reference to specific exemplary embodiments; however, it should be understood that various modifications and variations can be made without departing from the scope of the utility model described herein. The specification and drawings are to be regarded in an illustrative rather than a restrictive manner, and all such modifications are intended to be included within the scope of the utility model. Accordingly, the scope of the utility model should be determined by the general embodiments described herein and their legal equivalents, and not merely by the specific embodiments described above. For example, the steps described in any method or process embodiment may be performed in any suitable order and are not limited to the exact order presented in a particular embodiment. In addition, the components and/or elements described in any device embodiment may be assembled in various arrangements or otherwise operatively configured to produce substantially the same result as the utility model, and are therefore not limited to the specific configuration described in a particular embodiment.
Benefits, other advantages and solutions to problems have been described above with respect to specific embodiments; however, no benefit, advantage or solution to a problem, and no element that may cause any particular benefit, advantage or solution to occur or become more pronounced, is to be construed as a critical, required or essential feature or component.
As used herein, the terms "comprising", "including" or any variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, composition or apparatus comprising a list of elements includes not only those recited elements but may also include other elements not expressly listed or inherent to such process, method, article, composition or apparatus. Except where specifically stated otherwise, other combinations and/or modifications of the above-described structures, arrangements, applications, proportions, elements, materials or components used in the practice of the utility model may be varied or otherwise particularly adapted to specific environments, manufacturing specifications, design parameters or other operating requirements without departing from its general principles.
Although the utility model has been described herein with reference to certain preferred embodiments, those skilled in the art will readily appreciate that other applications may be substituted for those described herein without departing from the spirit and scope of the utility model. Accordingly, the utility model is limited only by the following claims.

Claims (8)

1. A first mobile electronic device, comprising a wireless signal transceiver, an image processor, a locating module and a motion module, wherein:
the wireless signal transceiver is communicatively connected to at least one camera installed indoors and is configured to obtain an image, captured by the at least one camera, of a task area containing the first mobile electronic device;
the image processor is communicatively connected to the wireless signal transceiver and is configured to identify the characteristic information of the first mobile electronic device in the image of the task area, establish a coordinate system for the image, and assign a corresponding coordinate value to each point in the task area;
the locating module is communicatively connected to the image processor and is configured to compare, on the established coordinate system, the identified characteristic information of the first mobile electronic device with the coordinate origin, identify the position and contour of the first mobile electronic device, and draw a map of the task area;
the locating module is also communicatively connected to a second mobile electronic device and is configured to send the map to the second mobile electronic device, the second mobile electronic device receiving a task location set by the user on the map; and
the motion module is communicatively connected to the locating module and the second mobile electronic device and is configured to receive the task location from the second mobile electronic device, plan a path according to the task location and the identified position and contour of the first mobile electronic device, and move along the path.
2. The first mobile electronic device according to claim 1, wherein the image of the task area further contains at least one first obstacle,
the image processor is further configured to identify the characteristic information of the at least one first obstacle in the image of the task area;
the locating module is further configured to compare the characteristic information of the at least one first obstacle with the coordinate origin and identify the position and contour of the at least one obstacle; and
the motion module is further configured to plan a path according to the task location, the located position of the first mobile electronic device and the identified position and contour of the at least one first obstacle, and to move along the path.
3. The first mobile electronic device according to claim 1, wherein, while the motion module moves, the at least one camera captures the image of the task area containing the first mobile electronic device in real time; and
the image processor and the locating module are further used to dynamically determine the characteristic information, position and contour of the first mobile electronic device.
4. The first mobile electronic device according to claim 1, further comprising
an encoder and an inertial measurement module communicatively connected to the processor, configured to assist the at least one camera in obtaining the position and posture of the first mobile electronic device.
5. The first mobile electronic device according to claim 1, wherein the at least one camera is mounted vertically on the ceiling of the task area.
6. The first mobile electronic device according to any one of claims 1-5, further comprising a charging pile, wherein the charging pile includes the image processor and the locating module.
7. The first mobile electronic device according to any one of claims 1-5, further comprising a sensor that sends information about a second obstacle around the first mobile electronic device to the motion module, the motion module being further configured to adjust the heading of the first mobile electronic device to avoid the second obstacle.
8. The first mobile electronic device according to claim 7, wherein the sensor includes an ultrasonic sensor and/or a laser sensor.
CN201720702958.4U 2017-06-16 2017-06-16 A mobile electronic device Expired - Fee Related CN206833252U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201720702958.4U CN206833252U (en) 2017-06-16 2017-06-16 A mobile electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201720702958.4U CN206833252U (en) 2017-06-16 2017-06-16 A mobile electronic device

Publications (1)

Publication Number Publication Date
CN206833252U true CN206833252U (en) 2018-01-02

Family

ID=60776329

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201720702958.4U Expired - Fee Related CN206833252U (en) 2017-06-16 2017-06-16 A mobile electronic device

Country Status (1)

Country Link
CN (1) CN206833252U (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108459595A (en) * 2017-06-16 2018-08-28 炬大科技有限公司 A kind of method in mobile electronic device and the mobile electronic device
WO2018228258A1 (en) * 2017-06-16 2018-12-20 炬大科技有限公司 Mobile electronic device and method therein
WO2019001237A1 (en) * 2017-06-30 2019-01-03 炬大科技有限公司 Mobile electronic device, and method in mobile electronic device
CN110299030A (en) * 2019-06-28 2019-10-01 汉王科技股份有限公司 Handheld terminal, aircraft and its airspace measurement method, control method

Similar Documents

Publication Publication Date Title
CN207488823U (en) A kind of mobile electronic device
CN108459596A (en) A kind of method in mobile electronic device and the mobile electronic device
CN207115193U (en) A kind of mobile electronic device for being used to handle the task of mission area
US7865267B2 (en) Environment recognizing device, environment recognizing method, route planning device, route planning method and robot
CN100369487C (en) Object detection device, object detection server, and object detection method
TWI827649B (en) Apparatuses, systems and methods for vslam scale estimation
US9020301B2 (en) Method and system for three dimensional mapping of an environment
CN109074083A (en) Control method for movement, mobile robot and computer storage medium
CN108459597B (en) Mobile electronic device and method for processing tasks in task area
CN108073167A (en) A kind of positioning and air navigation aid based on depth camera and laser radar
KR20180070932A (en) A movable object and a method for controlling the same
CN108888187A (en) A kind of sweeping robot based on depth camera
CN207067803U (en) A kind of mobile electronic device for being used to handle the task of mission area
WO2018228256A1 (en) System and method for determining indoor task target location by image recognition mode
WO2008091714A1 (en) Method and apparatus for localizing and mapping the position of a set of points on a digital model
CN206833252U (en) A kind of mobile electronic device
WO2018228258A1 (en) Mobile electronic device and method therein
AU2019438843A1 (en) Recharging Control Method of Desktop Robot
CN207051738U (en) A kind of mobile electronic device
Wang et al. Three-dimensional reconstruction based on visual SLAM of mobile robot in search and rescue disaster scenarios
CN110260866A (en) A kind of robot localization and barrier-avoiding method of view-based access control model sensor
CN108780319A (en) Oftware updating method, system, mobile robot and server
CN108459594A (en) A kind of method in mobile electronic device and the mobile electronic device
CN115900710A (en) Dynamic environment navigation method based on visual information
CN111630346B (en) Improved positioning of mobile devices based on images and radio words

Legal Events

Date Code Title Description
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180102

Termination date: 20200616