CN108459594A - Mobile electronic device and method in a mobile electronic device - Google Patents
Mobile electronic device and method in a mobile electronic device
- Publication number
- CN108459594A (application CN201710437071.1A)
- Authority
- CN
- China
- Prior art keywords
- information
- processor
- target
- electronic equipment
- movement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0217—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with energy consumption, time reduction or distance reduction criteria
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0255—Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
Abstract
A first mobile electronic device includes a camera, a wireless signal transceiver, a processor, and a motion module. The camera collects image information and the depth distance information of the image information. The wireless signal transceiver, communicatively coupled to the camera, provides the image information and the depth distance information to a second mobile electronic device and receives, from the second mobile electronic device, target-selection information and following-distance information, wherein the second mobile electronic device selects the target based on the received image information and depth distance information. The processor, communicatively coupled to the wireless signal transceiver and the camera, calculates motion information based on the target for the first mobile electronic device from the selection information, the following-distance information, the image information, and the depth distance information. The motion module, communicatively coupled to the processor, follows the target according to the motion information.
Description
Technical field
The present invention relates to the field of electronic devices, and in particular to the field of intelligent robot systems.
Background technology
Traditional mobile robots and other electronic devices scan a two-dimensional or three-dimensional map of the surrounding space using tracking sensors, infrared, ultrasound, or similar means, then move either by autonomous positioning or by random walks with collision-triggered turns, while performing other preset functions. They are operated by commands the user issues through a remote controller, base-station remote control, or the like.
Because their mapping and localization technologies are immature or inaccurate, traditional mobile robots and electronic devices cannot fully judge the complicated state of the floor and the surrounding space during operation and easily lose track of their position and orientation. When the floor is uneven, has steps or height differences, or is littered with debris, the robot may get stuck, lose its coordinates, or fail to return to its charger. Some models have no positioning capability at all and can only change direction through the physics of collision, which may damage household items or the robot itself, or even cause personal injury, and disturbs the user. Because the robot's intelligence is insufficient to truly judge the floor and the spatial layout, it repeats routes and re-detects the environment while moving, wasting both battery power and time on idle work.
Summary of the invention
The mobile electronic device system described in the embodiments of the present invention, such as a robot system, is intended to work in cooperation with the user. The robot system determines a task area by following a target, such as the user; after the robot reaches the starting point of the task area, it moves within the task area according to a prescribed pattern. This solves the problem that the robot cannot judge the floor state, its own position, or the best movement route. The human eye replaces the robot's tracking sensors, and human planning replaces the robot's path-planning algorithm; in turn, the robot's repetitive labor replaces human labor. This saves the research-and-development cost of an intelligent robot and the equipment cost invested in sweeping and dust-collecting mechanisms. By combining the strengths of the human and of the robot, the various weaknesses of existing sweeping robots are compensated for, while the user is freed from repetitive manual labor.
The human-computer interaction of the robot system described in the embodiments requires no map building. It improves the robot's working efficiency while reducing the user's workload; human intelligence compensates for the technical limitations of the robot itself.
A first mobile electronic device according to one embodiment includes a camera, a wireless signal transceiver, a processor, and a motion module. The camera is configured to collect image information and the depth distance information of the image information. The wireless signal transceiver, communicatively coupled to the camera, is configured to provide the image information and the depth distance information to a second mobile electronic device and to receive, from the second mobile electronic device, target-selection information and following-distance information, wherein the second mobile electronic device selects the target based on the received image information and depth distance information. The processor, communicatively coupled to the wireless signal transceiver, is configured to calculate motion information based on the target for the first mobile electronic device from the selection information, the following-distance information, the image information, and the depth distance information. The motion module, communicatively coupled to the processor, is configured to follow the target according to the motion information.
Alternatively or additionally, the wireless signal transceiver is further configured to receive task information from the second mobile electronic device, the task information being used to guide the first mobile electronic device to the starting point of a task area; the processor is further configured to set, according to the task information, motion information for the first mobile electronic device that includes the position information of the starting point; and the motion module is further configured to follow the target to the task-area starting point according to the motion information.
Alternatively or additionally, the first mobile electronic device further includes a memory communicatively coupled to the processor for storing the starting-point information and the path information for reaching the starting point; the motion module is further configured to reach the task-area starting point according to the starting-point information and path information stored in the memory.
Alternatively or additionally, the wireless signal transceiver is further configured to receive task information from the second mobile electronic device, the task information being used to plan the task area. The processor is further configured to receive, from the second mobile electronic device, indication information indicating that the target will guide the motion module to move in one of the following ways, and the motion module moves in the corresponding manner according to the indication information: when the target moves once around the edge of the task area, the motion module is configured to move within the boundary defined by that edge to complete the task; when the target moves along a diagonal of the task area, the motion module is configured to move within the rectangle corresponding to that diagonal to complete the task; or, when the processor cannot identify the target's path, the motion module is configured to sweep a sector region whose radius extends to the farthest point the processor identifies.
Alternatively or additionally, the first mobile electronic device further includes a memory communicatively coupled to the processor for storing the indication information and the corresponding pattern information; the motion module is further configured to move within the task area according to the indication information and pattern information stored in the memory.
Alternatively or additionally, the first mobile electronic device further includes a charging pile, wherein the charging pile includes the processor.
Alternatively or additionally, the first mobile electronic device may further include a sensor that sends obstacle information around the first mobile electronic device to the processor, and the processor is further configured to adjust the movement direction of the first mobile electronic device to avoid obstacles.
Alternatively or additionally, the sensor includes an ultrasonic sensor and/or a laser sensor.
Another embodiment provides a method in a first mobile electronic device. The first mobile electronic device includes a camera, a wireless signal transceiver, a processor, and a motion module. The method includes: collecting, by the camera, image information and the depth distance information of the image information; providing, by the wireless signal transceiver communicatively coupled to the camera, the image information and the depth distance information to a second mobile electronic device, and receiving, from the second mobile electronic device, target-selection information and following-distance information, wherein the second mobile electronic device selects the target based on the received image information and depth distance information; calculating, by the processor communicatively coupled to the wireless signal transceiver, motion information based on the target for the first mobile electronic device from the selection information, the following-distance information, the image information, and the depth distance information; and following the target, by the motion module communicatively coupled to the processor, according to the motion information.
With the scheme of the embodiments, the traditional robot's tedious work of SLAM scanning and modeling of the whole home and of cleaning the whole home area can be avoided.
Description of the drawings
A more complete understanding of the present invention is obtained by referring to the detailed description in connection with the accompanying drawings, in which like reference numerals refer to like parts.
Fig. 1 shows a schematic diagram of a system in which a first mobile electronic device and a second mobile electronic device according to an embodiment of the invention are located.
Fig. 2 shows a block diagram of the processor in the first mobile electronic device according to an embodiment of the invention.
Fig. 3 shows a schematic diagram of calculating angle (orientation) information using a triangle relation according to an embodiment of the invention.
Fig. 4 shows a flowchart of a method in the first mobile electronic device according to an embodiment of the invention.
Detailed description
Embodiment one
Fig. 1 shows a schematic diagram of a system in which a first mobile electronic device 100 and a second mobile electronic device 120 according to an embodiment of the invention are located.
Referring to Fig. 1, the first mobile electronic device 100 includes, but is not limited to, a sweeping robot, an industrial automation robot, a service robot, a rescue and disaster-relief robot, an underwater robot, a space robot, an unmanned aerial vehicle, a self-driving vehicle, and the like.
The second mobile electronic device 120 includes, but is not limited to, a mobile phone, a tablet computer, a laptop, a remote controller, and the like. The mobile electronic device optionally includes an operation interface. In an optional embodiment, the mobile electronic device is a mobile phone and the operation interface is a mobile phone APP.
The signal transmission forms between the first mobile electronic device 100 and the second mobile electronic device 120 include, but are not limited to, Bluetooth, WiFi, ZigBee, infrared, ultrasound, UWB, and the like. In this embodiment, WiFi is taken as an example for description.
As shown in Fig. 1, in one embodiment, the first mobile electronic device 100 includes a camera 102, a wireless signal transceiver 104, a processor 106, and a motion module 108. The camera 102 is configured to collect image information and the depth distance information of the image information. The camera 102 may be, for example, a depth (RGB-D) camera. The RGB-D camera collects not only the color pixel information of the captured image but also the depth distance of each pixel in the image.
The camera 102 may be turned on automatically when the first mobile electronic device 100 is powered on, or may be turned on by, for example, a user of the second mobile electronic device 120 using the second mobile electronic device 120, e.g., a mobile phone APP, to open the RGB-D camera 102 of the first mobile electronic device 100. A ranging module inside the camera 102 quickly measures the depth distance of each pixel in the image shown in the lens, and passes the image information and the depth distance information to the processor 106. The ranging module may be, for example, a laser ranging module, an infrared ranging module, or the like. The processor 106 further comprises an image processor and a data processor, which will be described in detail below in connection with Fig. 2.
The wireless signal transceiver 104 is communicatively coupled to the camera 102 and is configured to provide the image information and the depth distance information to the second mobile electronic device 120.
After receiving the image information and depth distance information from the first mobile electronic device 100, the second mobile electronic device 120, such as a mobile phone, synchronously displays the content captured by the RGB-D camera 102 on the phone screen. The user of the second mobile electronic device 120 then selects, on the phone screen, the target to be followed, and the real-time distance of that target is shown on the screen according to the depth distance information. For example, the user selects himself or herself, as shown in the image information, as the target, that is, the object to be followed. The user can control the camera 102 through the APP to aim it at the user, and frame-select his or her own silhouette shown in the APP to complete the selection. The APP then displays the distance information of the frame-selected target in real time; for example, according to the depth distance information of the RGB-D camera, the target is 1.2 meters from the first mobile electronic device, and this 1.2 meters is shown in the APP in real time. In addition, the user can also set the following-distance information through the mobile phone APP. For example, the distance at which the first mobile electronic device 100 follows the target is 1 meter; that is, optionally, the distance threshold at which the first mobile electronic device 100 follows the target is 1 meter. In this embodiment, the target is the user for purposes of illustration. Those skilled in the art will understand that the target may also be another vehicle, a bicycle, a ship, or any other visible and movable object.
The first mobile electronic device 100 receives the target-selection information and the following-distance information from the second mobile electronic device 120, where, as discussed above, the second mobile electronic device 120 selects the target based on the received image information.
The processor 106 is communicatively coupled to the wireless signal transceiver 104 and the camera 102, and is configured to calculate motion information based on the target for the first mobile electronic device 100 from the target-selection information, the following-distance information, the image information, and the depth distance information.
Fig. 2 shows a block diagram of the processor 106 in the first mobile electronic device according to an embodiment of the invention. As shown in Fig. 2, the processor 106 includes an image processor 2060 and a data processor 2062. Optionally, the processor 106 further includes a path planning module 2064, an obstacle avoidance module 2066, and a positioning module 2068. The image processor 2060 is communicatively connected to the camera 102 and is configured to extract the target's feature information from the image information and the depth distance information, lock onto the target according to the feature information, and send the image information, the depth distance information, and the feature information to the data processor 2062. The data processor 2062, communicatively connected to the image processor 2060, is configured to calculate the motion information based on the target from the locked target's feature information, the image information, and the depth distance information. Image feature extraction may be performed with an algorithm based on the Scale-Invariant Feature Transform (SIFT) or on Speeded-Up Robust Features (SURF).
Specifically, the image processor 2060 in the processor 106 processes and analyzes the frame-selected target content, locking onto the image features of the followed target. The image features may be obtained by the image processor 2060 calculating the mean depth distance of the pixels of the selected target; for example, the image processor 2060 calculates the mean depth distance of the pixels of the selected user while ignoring the depth distances of the background in the image. The image processor 2060 then passes the feature information and the depth distance information of the target to the data processor 2062. The data processor 2062 uses a triangle relation to calculate the bearing of the followed target, that is, the angle information of the target relative to the first mobile electronic device 100. The data processor 2062 then provides movement information to the path planning module 2064 by comparing, in real time, the depth distance of the target image with the following-distance threshold, together with the calculated bearing angle of the target. The derivation of the angle information and the movement information is further described below in connection with Fig. 3.
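The patent gives no code for the mean-depth step; purely as an illustrative sketch under the description above (all names are invented here, and the mask is assumed to come from the user's frame selection), averaging the depth of target pixels while ignoring the background might look like:

```python
def target_mean_depth(depth_map, mask, invalid=0.0):
    """Mean depth over pixels flagged as target; background and dropout pixels are ignored."""
    total, count = 0.0, 0
    for row_depth, row_mask in zip(depth_map, mask):
        for d, is_target in zip(row_depth, row_mask):
            if is_target and d != invalid:  # skip background (mask 0) and invalid readings
                total += d
                count += 1
    return total / count if count else None
```

For instance, with a 2×2 depth map `[[1.2, 5.0], [1.2, 0.0]]` and mask `[[1, 0], [1, 1]]`, only the two valid target pixels contribute, giving 1.2 meters.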
Fig. 3 shows a schematic diagram of calculating angle (orientation) information using a triangle relation according to an embodiment of the invention. Specifically, point C indicates the position of the camera, point A the position of the target at the previous sampling instant, and point B the position of the target at the following sampling instant, after it has moved. By measuring the distance from C to A, the distance from C to B, and the distance from A to B, the angle α between line segment AC and line segment BC, that is, the direction of the target's motion, can be obtained. Optionally, when the camera at point C rotates, a phase compensation is applied to the angle α to obtain the actual angular offset α′ between points B and A. The data processor 2062 then takes the calculated angular offset α′, the actual distance between the first mobile electronic device 100 and the target extracted from the image information, e.g., 1.2 meters, and the depth-distance threshold of 1 meter. Based on this information, the data processor 2062 calculates the motion information based on the target for the first mobile electronic device 100. For example, since the actual distance of 1.2 meters exceeds the 1-meter threshold, the first mobile electronic device 100 should move closer to follow the target.
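The triangle relation of Fig. 3 amounts to the law of cosines applied at camera point C. As a hedged illustration (the function name and degree units are assumptions, not from the patent), the angle α between sightlines CA and CB follows from the three measured distances:

```python
import math

def bearing_angle(ca: float, cb: float, ab: float) -> float:
    """Angle at camera C between segments CA and CB, in degrees (law of cosines)."""
    cos_alpha = (ca ** 2 + cb ** 2 - ab ** 2) / (2 * ca * cb)
    # clamp against floating-point drift before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_alpha))))
```

With CA = 3, CB = 4, AB = 5, the angle at C is 90°; any camera-rotation compensation yielding α′ would be applied on top of this value.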
Returning now to Fig. 1 and Fig. 2, the motion module 108 is communicatively coupled to the processor 106 and is configured to follow the target according to the motion information. For example, the data processor 2062 in the processor 106 supplies the motion information of the first mobile electronic device 100 to the path planning module 2064, and the path planning module 2064 instructs the motion module 108 to increase its speed so as to reduce the distance to the target, for example, by 0.2 meters.
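The threshold comparison behind this speed adjustment can be sketched as follows (illustrative only; the patent does not specify a controller, and the function name and deadband value are invented):

```python
def follow_command(measured_dist: float, threshold: float, deadband: float = 0.05):
    """Compare measured target distance against the following threshold.

    Returns (action, gap): approach when too far, retreat when too close,
    hold inside a small deadband around the threshold.
    """
    gap = measured_dist - threshold
    if abs(gap) <= deadband:
        return ("hold", 0.0)
    return ("approach", gap) if gap > 0 else ("retreat", -gap)
```

With a measured distance of 1.2 meters and a 1-meter threshold, this yields an "approach" command closing a 0.2-meter gap, matching the example above.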
During the subsequent movement of the first mobile electronic device 100, the image processor 2060 continuously uses the camera 102 to find and lock onto the frame-selected target, adjusting the device's own speed and direction so as to follow the target in real time.
Optionally, the first mobile electronic device 100 further includes an encoder 114 for recording, in real time, the indoor position information of the first mobile electronic device, for example its position relative to the charging pile 140 at the starting point, so that after the following task ends, it automatically returns to the charging pile 140 and stands by. In addition, the encoder 114 serves as an odometer, computing the track the robot has traveled from the recorded rotation of the robot's wheels. Furthermore, the positioning module 2068 in the first mobile electronic device 100 performs local positioning; that is, the first mobile electronic device 100 determines at every moment the relative position of itself and the second mobile electronic device 120, so that after signal loss, obstacle avoidance, or the like, it can return to the originally set relative position.
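Wheel-encoder odometry of the kind attributed to encoder 114 is conventionally computed by differential-drive dead reckoning. A minimal sketch under that assumption (a two-wheeled base and all names are inventions of this illustration, not the patent's design):

```python
import math

def odometry_step(x, y, theta, d_left, d_right, wheel_base):
    """Advance the pose (x, y, heading theta) by one encoder sample.

    d_left / d_right are the distances each wheel traveled this sample;
    wheel_base is the distance between the wheels.
    """
    d_center = (d_left + d_right) / 2.0          # distance of the robot's midpoint
    d_theta = (d_right - d_left) / wheel_base    # heading change from wheel difference
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```

Summing such steps gives the traveled track, which is how the device could retrace its path back to the charging pile.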
Optionally, the first mobile electronic device 100 further includes a sensor 112, which may be, for example, an ultrasonic sensor or a laser sensor. In addition, the processor 106 further includes the obstacle avoidance module 2066. In operation, the obstacle avoidance module 2066 computes obstacle information from the data of the ultrasonic sensor and the laser sensor and passes it to the path planning module 2064. Combining it with the distance to the followed target, the path planning module 2064 computes the optimal following path, and the motion module 108 then controls the movement of the first mobile electronic device. During the robot's subsequent movement, the image processor 2060 continuously uses the camera to find and lock onto the frame-selected content, following the target in real time.
The first mobile electronic device may follow the target using one of a variety of tracking algorithms, including but not limited to the Kernelized Correlation Filter (KCF), the Mean Shift (MS) algorithm, optical flow (OF), the Kalman Filter (KF), and the Particle Filter (PF).
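As one hedged illustration of the simplest option in that list — a scalar Kalman filter smoothing noisy depth readings of the target; this is not the patent's implementation, and all names and noise values are invented:

```python
def kalman_1d(z_seq, q=1e-3, r=0.1, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter over a sequence of measurements z_seq.

    q: process-noise variance, r: measurement-noise variance,
    x0/p0: initial state estimate and its variance.
    """
    x, p = x0, p0
    estimates = []
    for z in z_seq:
        p += q                 # predict: uncertainty grows by process noise
        k = p / (p + r)        # Kalman gain
        x += k * (z - x)       # update with the measurement residual
        p *= (1.0 - k)
        estimates.append(x)
    return estimates
```

Fed a run of noisy distance readings around a true value, the estimate converges toward that value, which is the smoothing role such a filter would play in a following loop.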
Alternatively or additionally, the first mobile electronic device 100 further includes a charging pile 140, wherein the charging pile 140 includes the processor 106; that is, the functions of the processor 106 may be integrated into the charging pile.
The signal transmission forms between the first mobile electronic device 100 and the charging pile 140 include, but are not limited to, Bluetooth, WiFi, ZigBee, infrared, ultrasound, UWB, and the like; an optional signal transmission form is Bluetooth.
Embodiment two
Embodiment two discloses an example of path guidance in which human-computer interaction guides the first mobile electronic device 100 to the starting point of a task area.
First, the wireless signal transceiver 104 receives task information from the second mobile electronic device 120. The task information is used to guide the first mobile electronic device 100 to the starting point of the task area. The processor 106 then sets, according to the task information, motion information for the first mobile electronic device 100 that includes the position information of the starting point. The motion module 108 follows the target to the task-area starting point according to the motion information from the processor 106.
Optionally, the first mobile electronic device 100 further includes a memory 110 communicatively coupled to the processor 106 and the motion module 108, for storing the starting-point information and the path information along which the target was followed to the starting point. The motion module 108 is further configured to reach the task-area starting point according to the starting-point information and path information stored in the memory 110.
For example, take the case where the first mobile electronic device 100 is a sweeping robot, the second mobile electronic device 120 is a mobile phone, and the task is a cleaning task. Those skilled in the art will understand that the first mobile electronic device 100 is not limited to a sweeping robot, the second mobile electronic device 120 is not limited to a mobile phone, and the task is not limited to the cleaning task mentioned below; it may also be a walking task or the like. When the path-guidance task is carried out, the user may inform the sweeping robot that path-guidance learning is about to begin by means such as gesture wake-up, voice wake-up, APP wake-up, or body-button wake-up. Once the sweeping robot is ready, the user guides it toward the different task areas, such as the living room, the kitchen, bedroom 1, bedroom 2, and so on, and performs region-division guidance. When the user guides the sweeping robot to the starting point of the first task area — the starting point is generally the living-room entrance, the kitchen entrance, a bedroom entrance, or the like — the path planning module 2064 of the sweeping robot records the position information of this task-area starting point and the path for reaching it. This path selection, guided by the target, i.e., by the user, avoids the traditional sweeping robot's need to perform SLAM scanning and modeling of the whole home and to clean the whole home area.
At the starting point, the robot identifies the task-area starting point, such as a bedroom doorway, through the RGB-D camera carried on its body, and confirms it with the user. The user may confirm by gesture, voice, body button, or APP. In addition, the task-area starting point may also be set directly by the user through the APP via the human-computer interaction interface.
Embodiment three
Embodiment three discloses an example of region-planning guidance in which human-computer interaction guides the first mobile electronic device 100.
First, the wireless signal transceiver 104 receives task information from the second mobile electronic device 120; the task information is used to plan a task area. The processor 106 receives, from the second mobile electronic device 120, instruction information indicating in which of the following ways the target guides the motion module 108, and the motion module 108 moves in the corresponding mode according to the instruction information. When the target, for example the user, moves once around the edge of the task area, the motion module 108 is configured to move with that one-loop edge of the task area as the boundary to complete the task. Optionally, when the target, for example the user, moves along a diagonal of the task area, the motion module 108 is configured to move with the rectangle corresponding to the diagonal as the boundary to complete the task. Or, optionally, when the processor 106 cannot identify the path of the target, the motion module 108 is configured to perform a sector-region movement, with the farthest point identified by the processor 106 as the radius, to complete the task.
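The three guidance modes above can be sketched as a small mode dispatcher. This is a hypothetical illustration only; the mode names, the (x, y) point format, and the axis-aligned interpretation of the diagonal's rectangle are assumptions, not part of the disclosure.

```python
import math

def plan_region(mode, points):
    """Derive a task-area description from the points traced by the target.

    mode: "edge" (target walked the area edge once), "diagonal" (target
    walked a diagonal), or anything else when the path was not identified.
    points: list of (x, y) positions observed while guiding the robot.
    """
    if mode == "edge":
        # The traced loop itself bounds the region to clean.
        return ("polygon", points)
    if mode == "diagonal":
        # Take the rectangle spanned by the diagonal's two endpoints.
        (x0, y0), (x1, y1) = points[0], points[-1]
        return ("rectangle", (min(x0, x1), min(y0, y1),
                              max(x0, x1), max(y0, y1)))
    # Fallback: sweep a sector whose radius is the farthest identified point.
    ox, oy = points[0]
    radius = max(math.hypot(x - ox, y - oy) for x, y in points)
    return ("sector", (ox, oy, radius))
```

The fallback branch mirrors the text: when no path is recognized, only the farthest observed point is trusted, and it becomes the sweep radius.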
Optionally, the first mobile electronic device 100 further includes a memory 110 communicatively coupled to the processor 106 and the motion module 108, for storing the instruction information and the corresponding mode information. The motion module 108 is further configured to move within the task area according to the instruction information and the corresponding mode information stored in the memory 110.
For example, take the case where the first mobile electronic device 100 is a sweeping robot, the second mobile electronic device 120 is a mobile phone, and the task is a cleaning task. Those skilled in the art will understand that the first mobile electronic device 100 is not limited to a sweeping robot, the second mobile electronic device 120 is not limited to a mobile phone, and the task is not limited to the cleaning task mentioned below. First, the phone user guides the robot into a task area, performs planning guidance for that task area, and names the region through the App; the concrete manner of task-area planning guidance will be described later. After route guidance and region-planning guidance for this task area are completed, the user may continue to lead the robot to the next task area, again guiding the path and the region, and naming it. The user may establish multiple task areas in this way. After all task areas have been guided, later cleaning tasks of the robot can, according to the user's cleaning order, autonomously reach a task-area starting point and start working, without the user guiding it again.
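This save-then-replay workflow can be sketched as follows. The class and method names are hypothetical; the disclosure only requires that each named task area's starting point and the path to it are stored and later replayed in the user's chosen cleaning order.

```python
class TaskAreaRegistry:
    """Hypothetical store for guided task areas (name -> start point + path)."""

    def __init__(self):
        self.areas = {}

    def save_area(self, name, start_point, path_to_start):
        # Recorded once during guidance; reused for every later cleaning run.
        self.areas[name] = (start_point, list(path_to_start))

    def cleaning_plan(self, order):
        """Start points to visit, in the user's chosen cleaning order."""
        return [(name, self.areas[name][0]) for name in order if name in self.areas]

registry = TaskAreaRegistry()
registry.save_area("kitchen", (2.0, 1.0), [(0, 0), (1, 0), (2, 1)])
registry.save_area("bedroom 1", (5.0, 3.0), [(0, 0), (3, 2), (5, 3)])
plan = registry.cleaning_plan(["bedroom 1", "kitchen"])
```

Unknown names are simply skipped, so a stale cleaning order cannot send the robot to an unguided area.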
When performing task-area planning guidance, the user may walk different paths with the robot to plan the task area. For example: walking once around the edge of the task area to clearly delimit the region to be cleaned; walking straight along a diagonal, from the task-area start point toward its farthest end, so that the robot automatically determines the largest rectangle that diagonal can span as the task area; or, when the path walked by the user cannot be identified well by the robot while the task area is being established, the robot will clean a sector region with the identified farthest point the user reached as the radius. After the task division is completed, the user may save the task through the above human-machine interface for reuse. For example, the user may select, on the APP of the second mobile electronic device 120, in which mode the first mobile electronic device should complete the task. For example, at least three modes may be displayed on the APP: an edge-loop closed mode, a diagonal mode, and a radial mode. The first mobile electronic device 100 then performs the corresponding cleaning mode according to the mode selected by the user.
In addition, the user may set the cleaning frequency separately for each task area, and may also choose one or more task areas each time a cleaning task is set. The robot may plan on its own, in any manner known in the art, how to travel to the cleaning region and clean it; alternatively, the user may plan multiple different destinations according to the real-time indoor layout, so that human intervention makes the robot move along an optimal path.
Alternatively or additionally, the cleaning region may be of any shape and extent determined based on the position of the mobile electronic device. For example, the cleaning region may be a circular region centered on the position of the mobile electronic device, with a radius of, for example, 0.1-10 meters, such as 0.5-5 meters, preferably 1 meter; or it may be a rectangular region with various side lengths. The user may set the shape and extent of the cleaning region arbitrarily through the mobile electronic device, and different cleaning-region shapes and extents may be set for different mobile electronic device positions.
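As a sketch of the circular case, the chosen radius can be clamped to the 0.1-10 m range given above; the function name and the dictionary format of the region description are hypothetical.

```python
def circular_region(center, radius_m):
    """Circular cleaning region centered on the device position.

    The radius is clamped to the 0.1-10 m range stated in the description;
    1 m is the stated preferred value.
    """
    r = min(max(radius_m, 0.1), 10.0)
    return {"shape": "circle", "center": center, "radius": r}
```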
In one embodiment, the cleaning region is determined based on the position of the second mobile electronic device.
In another embodiment, the cleaning region is determined based on the positions of multiple second mobile electronic devices. In this case the intelligent charging pile 140 may generate a cleaning task path according to all the position information and send it to the robot, and the robot cleans along this path.
In yet another embodiment, the cleaning region is determined based on the continuous movement track of the position of the second mobile electronic device. In this case the second mobile electronic device continuously transmits a wireless signal carrying its position, and the robot may, following a rule of delaying or keeping a certain distance, form a continuous motion track from the received wireless signals, cleaning the ground along the way to a predetermined width as it moves. The distance may be, for example, 0.1-5 meters, such as 0.5-2 meters, preferably 1 meter. For example, the robot may move in an S-shaped pattern at a certain forward-left/forward-right angle. This process relies on no map; a person may plan the route or intervene in advance to avoid fixed or movable obstacles, such as furniture on the path that might be bumped into. When the second mobile electronic device stops moving but still transmits the wireless signal, the robot may change its motion track, centered on the wireless signal source, to repeatedly clean the current region, for example circling the signal source at a certain radius.
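One step of this track-following behavior might be sketched as below. The 1 m follow distance comes from the description; the function name and the returned action format are illustrative assumptions.

```python
import math

FOLLOW_DISTANCE_M = 1.0  # description suggests 0.1-5 m, preferably 1 m

def next_move(robot_xy, source_xy, source_moving):
    """Decide one motion step while following the wireless signal source."""
    dx = source_xy[0] - robot_xy[0]
    dy = source_xy[1] - robot_xy[1]
    dist = math.hypot(dx, dy)
    if not source_moving:
        # Source stopped but still transmits: orbit it and re-clean the spot.
        return {"action": "circle", "radius": FOLLOW_DISTANCE_M}
    if dist <= FOLLOW_DISTANCE_M:
        return {"action": "hold"}  # already within the follow distance
    # Advance only the surplus, so the robot trails at ~FOLLOW_DISTANCE_M.
    return {"action": "advance",
            "distance": dist - FOLLOW_DISTANCE_M,
            "heading": math.atan2(dy, dx)}
```

The delayed-advance rule is what produces a continuous trailing track from intermittent position reports, with no map required.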
In one embodiment, the sweeping robot may further include sensors and a motion-control module. The sensors include, but are not limited to, ultrasonic sensors and laser sensors. In one embodiment, the sensors in the sweeping robot send information on the obstacles around the robot to the motion-control module, which adjusts the robot's movement direction to avoid the obstacles.
Embodiment four
Fig. 4 shows a flowchart of a method in a first mobile electronic device according to an embodiment of the invention. The first mobile electronic device 100 includes a camera 102, a wireless signal transceiver 104, a processor 106, and a motion module 108. The method 400 includes: at block 410, acquiring image information and depth distance information of the image information by the camera 102; at block 420, providing the image information and the depth distance information to a second mobile electronic device through the wireless signal transceiver 104 communicatively coupled to the camera 102, and at block 430, receiving target-selection information and following-distance information from the second mobile electronic device, wherein the second mobile electronic device selects the target based on the received image information and depth distance information; at block 440, calculating, by the processor 106 communicatively coupled to the wireless signal transceiver 104 and the camera 102, movement information for the first mobile electronic device based on the selection information, the following-distance information, the image information, and the depth distance information; and at block 450, following the target according to the movement information by the motion module 108 communicatively coupled to the processor 106.
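Blocks 440-450 can be sketched as a mapping from the selected target's pixel position and measured depth to a bearing and a forward distance. The horizontal field of view, the function name, and the parameter layout are assumptions for illustration; the disclosure does not specify them.

```python
def movement_info(target_px, image_width_px, depth_m, follow_m, hfov_deg=60.0):
    """Compute (bearing_deg, forward_m) toward the selected target.

    Bearing comes from the target's horizontal pixel offset scaled by an
    assumed camera field of view; forward motion stops once the measured
    depth equals the requested following distance.
    """
    offset = (target_px - image_width_px / 2) / (image_width_px / 2)  # -1..1
    bearing_deg = offset * (hfov_deg / 2)
    forward_m = max(depth_m - follow_m, 0.0)  # never overshoot the target
    return bearing_deg, forward_m
```

The following-distance term is what distinguishes this from plain go-to-goal motion: the robot trails the target rather than driving into it.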
Optionally, the method 400 further includes (not shown): receiving, by the wireless signal transceiver 104, task information from the second mobile electronic device, the task information being used to guide the first mobile electronic device to the starting point of a task area; setting, by the processor 106 according to the task information, movement information for the first mobile electronic device that includes the position information of the starting point; and following the target, by the motion module 108 according to the movement information, to reach the task-area starting point.
Optionally, the first mobile electronic device further includes a memory 110 communicatively coupled to the processor 106 and the motion module 108, and the method 400 further includes (not shown): storing, by the memory 110, the starting-point information and the path information for reaching the starting point; and reaching the task-area starting point, by the motion module 108, according to the starting-point information and the path information stored in the memory 110.
Optionally, the method 400 further includes (not shown): receiving, by the wireless signal transceiver 104, task information from the second mobile electronic device, the task information being used to plan a task area; and receiving, by the processor 106, instruction information from the second mobile electronic device indicating in which of the following ways the target guides the motion module, the motion module 108 moving in the corresponding manner according to the instruction information: when the target moves once around the edge of the task area, the motion module 108 is configured to move with that one-loop edge of the task area as the boundary to complete the task; when the target moves along a diagonal of the task area, the motion module 108 is configured to move with the rectangle corresponding to the diagonal as the boundary to complete the task; or, when the processor 106 cannot identify the path of the target, the motion module 108 is configured to perform a sector-region movement, with the farthest point identified by the processor 106 as the radius, to complete the task.
Optionally, the first mobile electronic device further includes a memory 110 communicatively coupled to the processor and the motion module. The method 400 further includes (not shown): storing, by the memory 110, the instruction information and the corresponding mode information; and moving within the task area, by the motion module, according to the instruction information and the corresponding mode information stored in the memory 110.
Optionally, the processor further includes an image processor 2060 and a data processor 2062, wherein the image processor 2060 is communicatively connected to the camera 102 and the data processor 2062 is communicatively connected to the image processor 2060. The method 400 further includes (not shown): extracting feature information of the target from the image information and the depth distance information; locking the target according to the feature information; sending the image information, the depth distance information, and the feature information to the data processor 2062; and calculating, from the feature information of the locked target, the image information, and the depth distance information, the target-based movement information.
Optionally, the first mobile electronic device further includes a charging pile 140, wherein the charging pile 140 includes the processor 106.
Optionally, the first mobile electronic device may further include a sensor, and the method 400 further includes (not shown): sending, by the sensor, information on the obstacles around the first mobile electronic device to the processor; and adjusting, by the processor, the movement direction of the first mobile electronic device to avoid the obstacles.
Optionally, the sensor of the first mobile electronic device includes an ultrasonic sensor and/or a laser sensor.
In the foregoing description, the invention has been described with reference to specific exemplary embodiments; it will be appreciated, however, that various modifications and variations may be made without departing from the scope of the invention described herein. The description and drawings are to be regarded in an illustrative rather than a restrictive manner, and all such modifications are intended to be included within the scope of the present invention. Accordingly, the scope of the invention should be determined by the general embodiments described herein and their legal equivalents, rather than solely by the specific embodiments described above. For example, the steps described in any method or process embodiment may be performed in any order and are not limited to the explicit order presented in a particular embodiment. In addition, the components and/or elements described in any device embodiment may be assembled in various arrangements or otherwise operatively configured to produce substantially the same results as the present invention, and are therefore not limited to the specific configurations described in the particular embodiments.
Benefits, other advantages, and solutions to problems have been described above with respect to specific embodiments; however, no benefit, advantage, or solution to a problem, and no element that may cause any particular benefit, advantage, or solution to occur or become more pronounced, is to be construed as a critical, required, or essential feature or component.
As used herein, the terms "include", "comprise", or any variation thereof are intended to reference a non-exclusive inclusion, such that a process, method, article, composition, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, composition, or apparatus. Except where specifically stated otherwise, other combinations and/or modifications of the above structures, arrangements, applications, proportions, elements, materials, or components used in the practice of the invention may be varied or otherwise particularly adapted to specific environments, manufacturing specifications, design parameters, or other operating requirements without departing from its general principles.
Although the invention has been described herein with reference to certain preferred embodiments, those skilled in the art will readily understand that other applications may be substituted for those described herein without departing from the spirit and scope of the invention. Accordingly, the invention is limited only by the following claims.
Claims (18)
1. A first mobile electronic device, comprising a camera, a wireless signal transceiver, a processor, and a motion module, wherein:
the camera is configured to acquire image information and depth distance information of the image information;
the wireless signal transceiver is communicatively coupled to the camera and configured to provide the image information and the depth distance information to a second mobile electronic device, and to receive target-selection information and following-distance information from the second mobile electronic device, wherein the second mobile electronic device selects the target based on the received image information;
the processor is communicatively coupled to the wireless signal transceiver and the camera and configured to calculate, for the first mobile electronic device, target-based movement information from the selection information, the following-distance information, the image information, and the depth distance information; and
the motion module is communicatively coupled to the processor and configured to follow the target according to the movement information.
2. The first mobile electronic device according to claim 1, wherein the wireless signal transceiver is further configured to receive task information from the second mobile electronic device, the task information being used to guide the first mobile electronic device to the starting point of a task area;
the processor is further configured to set, for the first mobile electronic device according to the task information, movement information including the position information of the starting point; and
the motion module is further configured to follow the target, according to the movement information, to reach the task-area starting point.
3. The first mobile electronic device according to claim 2, further comprising a memory,
the memory being communicatively coupled to the processor and the motion module, for storing the starting-point information and the path information for reaching the starting point;
wherein the motion module is further configured to reach the task-area starting point according to the starting-point information and the path information stored in the memory.
4. The first mobile electronic device according to claim 1, wherein the wireless signal transceiver is further configured to receive task information from the second mobile electronic device, the task information being used to plan a task area, and wherein
the processor is further configured to receive, from the second mobile electronic device, instruction information indicating in which of the following ways the target guides the motion module, the motion module moving in the corresponding mode according to the instruction information:
when the target moves once around the edge of the task area, the motion module is configured to move with that one-loop edge of the task area as the boundary to complete the task;
when the target moves along a diagonal of the task area, the motion module is configured to move with the rectangle corresponding to the diagonal as the boundary to complete the task; or
when the processor cannot identify the path of the target, the motion module is configured to perform a sector-region movement, with the farthest point identified by the processor as the radius, to complete the task.
5. The first mobile electronic device according to claim 4, further comprising a memory,
the memory being communicatively coupled to the processor and the motion module, for storing the instruction information and the corresponding mode information;
wherein the motion module is further configured to move within the task area according to the instruction information and the corresponding mode information stored in the memory.
6. The first mobile electronic device according to claim 1, wherein the processor further comprises an image processor and a data processor, wherein
the image processor is communicatively connected to the camera and configured to
extract feature information of the target from the image information and the depth distance information,
lock the target according to the feature information, and
send the image information, the depth distance information, and the feature information to the data processor; and
the data processor is communicatively connected to the image processor and configured to calculate the target-based movement information from the feature information of the locked target, the image information, and the depth distance information.
7. The first mobile electronic device according to any one of claims 1-6, further comprising a charging pile, wherein the charging pile includes the processor.
8. The first mobile electronic device according to any one of claims 1-7, further comprising a sensor, wherein the sensor sends information on the obstacles around the first mobile electronic device to the processor, and the processor is further configured to adjust the movement direction of the first mobile electronic device to avoid the obstacles.
9. The first mobile electronic device according to claim 8, wherein the sensor includes an ultrasonic sensor and/or a laser sensor.
10. A method in a first mobile electronic device, the first mobile electronic device comprising a camera, a wireless signal transceiver, a processor, and a motion module, the method comprising:
acquiring image information and depth distance information of the image information by the camera;
providing the image information and the depth distance information to a second mobile electronic device through the wireless signal transceiver communicatively coupled to the camera;
receiving target-selection information and following-distance information from the second mobile electronic device, wherein the second mobile electronic device selects the target based on the received image information and depth distance information;
calculating, by the processor communicatively coupled to the wireless signal transceiver and the camera, target-based movement information for the first mobile electronic device from the selection information, the following-distance information, the image information, and the depth distance information; and
following the target, according to the movement information, by the motion module communicatively coupled to the processor.
11. The method according to claim 10, further comprising:
receiving, by the wireless signal transceiver, task information from the second mobile electronic device, the task information being used to guide the first mobile electronic device to the starting point of a task area;
setting, by the processor according to the task information, movement information for the first mobile electronic device including the position information of the starting point; and
following the target, by the motion module according to the movement information, to reach the task-area starting point.
12. The method according to claim 11, wherein the first mobile electronic device further comprises a memory communicatively coupled to the processor and the motion module, the method further comprising:
storing, by the memory, the starting-point information and the path information for reaching the starting point; and
reaching the task-area starting point, by the motion module, according to the starting-point information and the path information stored in the memory.
13. The method according to claim 10, further comprising:
receiving, by the wireless signal transceiver, task information from the second mobile electronic device, the task information being used to plan a task area; and
receiving, by the processor, instruction information from the second mobile electronic device indicating in which of the following ways the target guides the motion module, the motion module moving in the corresponding manner according to the instruction information:
when the target moves once around the edge of the task area, the motion module is configured to move with that one-loop edge of the task area as the boundary to complete the task;
when the target moves along a diagonal of the task area, the motion module is configured to move with the rectangle corresponding to the diagonal as the boundary to complete the task; or
when the processor cannot identify the path of the target, the motion module is configured to perform a sector-region movement, with the farthest point identified by the processor as the radius, to complete the task.
14. The method according to claim 13, wherein the first mobile electronic device further comprises a memory communicatively coupled to the processor and the motion module, the method further comprising:
storing, by the memory, the instruction information and the corresponding mode information; and
moving within the task area, by the motion module, according to the instruction information and the corresponding mode information stored in the memory.
15. The method according to claim 10, wherein the processor further comprises an image processor and a data processor, the image processor being communicatively connected to the camera and the data processor being communicatively connected to the image processor, the method comprising:
extracting feature information of the target from the image information and the depth distance information;
locking the target according to the feature information;
sending the image information, the depth distance information, and the feature information to the data processor; and
calculating, from the feature information of the locked target, the image information, and the depth distance information, the target-based movement information.
16. The method according to any one of claims 10-15, wherein the first mobile electronic device further comprises a charging pile, and the charging pile includes the processor.
17. The method according to any one of claims 10-16, wherein the first mobile electronic device further comprises a sensor, the method further comprising:
sending, by the sensor, information on the obstacles around the first mobile electronic device to the processor; and
adjusting, by the processor, the movement direction of the first mobile electronic device to avoid the obstacles.
18. The method according to claim 17, wherein the sensor of the first mobile electronic device includes an ultrasonic sensor and/or a laser sensor.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710437071.1A CN108459594A (en) | 2017-06-12 | 2017-06-12 | A kind of method in mobile electronic device and the mobile electronic device |
PCT/CN2018/090140 WO2018228254A1 (en) | 2017-06-12 | 2018-06-06 | Mobile electronic device and method for use in mobile electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710437071.1A CN108459594A (en) | 2017-06-12 | 2017-06-12 | A kind of method in mobile electronic device and the mobile electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108459594A true CN108459594A (en) | 2018-08-28 |
Family
ID=63220952
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710437071.1A Pending CN108459594A (en) | 2017-06-12 | 2017-06-12 | A kind of method in mobile electronic device and the mobile electronic device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108459594A (en) |
WO (1) | WO2018228254A1 (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102866706A (en) * | 2012-09-13 | 2013-01-09 | 深圳市银星智能科技股份有限公司 | Cleaning robot adopting smart phone navigation and navigation cleaning method thereof |
CN105911999A (en) * | 2016-06-21 | 2016-08-31 | 上海酷哇机器人有限公司 | Mobile luggage case with automatic following and obstacle avoiding functions and using method thereof |
CN106094875A (en) * | 2016-06-27 | 2016-11-09 | 南京邮电大学 | A kind of target follow-up control method of mobile robot |
CN106647766A (en) * | 2017-01-13 | 2017-05-10 | 广东工业大学 | Robot cruise method and system based on complex environment UWB-vision interaction |
CN106774315A (en) * | 2016-12-12 | 2017-05-31 | 深圳市智美达科技股份有限公司 | Autonomous navigation method of robot and device |
CN207051738U (en) * | 2017-06-12 | 2018-02-27 | 炬大科技有限公司 | A kind of mobile electronic device |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2407992C (en) * | 2000-05-01 | 2010-07-20 | Irobot Corporation | Method and system for remote control of mobile robot |
JP2007317112A (en) * | 2006-05-29 | 2007-12-06 | Funai Electric Co Ltd | Self-propelled device and self-propelled cleaner |
CN105352508A (en) * | 2015-10-22 | 2016-02-24 | 深圳创想未来机器人有限公司 | Method and device of robot positioning and navigation |
CN105955251A (en) * | 2016-03-11 | 2016-09-21 | 北京克路德人工智能科技有限公司 | Vision following control method of robot and robot |
CN106709937A (en) * | 2016-12-21 | 2017-05-24 | 四川以太原力科技有限公司 | Method for controlling floor mopping robot |
- 2017-06-12: CN application CN201710437071.1A filed (published as CN108459594A); status: Pending
- 2018-06-06: PCT application PCT/CN2018/090140 filed (published as WO2018228254A1); status: Application Filing
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109709953A (en) * | 2018-12-21 | 2019-05-03 | 北京智行者科技有限公司 | Vehicle follower method in road cleaning operation |
CN109709954A (en) * | 2018-12-21 | 2019-05-03 | 北京智行者科技有限公司 | Vehicle follower method in road cleaning operation |
CN110362092A (en) * | 2019-08-05 | 2019-10-22 | 广东交通职业技术学院 | It is a kind of based on mobile phone wireless control robot follow kinescope method and system |
CN111820822A (en) * | 2020-07-30 | 2020-10-27 | 睿住科技有限公司 | Sweeping robot, illuminating method thereof and computer readable storage medium |
CN111820822B (en) * | 2020-07-30 | 2022-03-08 | 广东睿住智能科技有限公司 | Sweeping robot, illuminating method thereof and computer readable storage medium |
TWI779600B (en) * | 2021-05-11 | 2022-10-01 | 東元電機股份有限公司 | Charging vehicle for following portable electronic device |
Also Published As
Publication number | Publication date |
---|---|
WO2018228254A1 (en) | 2018-12-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108459594A (en) | Mobile electronic device and method therein | |
CN207051738U (en) | Mobile electronic device | |
CN109890573B (en) | Control method and device for mobile robot, mobile robot and storage medium | |
US11400595B2 (en) | Robotic platform with area cleaning mode | |
US10328573B2 (en) | Robotic platform with teach-repeat mode | |
US20200047337A1 (en) | Robotic platform with event based mode change | |
US20200047343A1 (en) | Remote planning and locally adaptive service mapping | |
Thompson et al. | A probabilistic model of human motion and navigation intent for mobile robot path planning | |
US20180364045A1 (en) | Robotic platform with mapping facility | |
US20180361585A1 (en) | Robotic platform with multi-function service module | |
CN108983781A (en) | Environment detection method in an unmanned-vehicle target acquisition system | |
CN106569489A (en) | Floor sweeping robot having visual navigation function and navigation method thereof | |
CN108594825A (en) | Depth-camera-based sweeping robot control method and system | |
US20180361584A1 (en) | Robotic platform with long-term learning | |
CN111700546A (en) | Cleaning method of mobile robot and mobile robot | |
US20180361581A1 (en) | Robotic platform with following mode | |
JP6758005B2 (en) | Mobile robot | |
JP2007149088A (en) | Self-position recognition method and apparatus for a mobile robot | |
CN108459596A (en) | Mobile electronic device and method therein | |
KR20180080499A (en) | Robot for airport and method thereof | |
US20210213619A1 (en) | Robot and control method therefor | |
KR20190113692A (en) | Method of tracing user position using crowd robot, tag device, and robot implementting thererof | |
CN113837059B (en) | Patrol vehicle for prompting pedestrians to wear mask in time and control method thereof | |
US20230117848A1 (en) | Method, system and device for analyzing pedestrian motion patterns | |
WO2019203878A1 (en) | Apparatus and methods of a service robotic platform |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20180828 |