EP3652600A1 - Autonomous Robotic System - Google Patents

Autonomous Robotic System

Info

Publication number
EP3652600A1
EP3652600A1 (application EP18831842.2A)
Authority
EP
European Patent Office
Prior art keywords
electronic device
processor
obstacle
moving electronic
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP18831842.2A
Other languages
German (de)
English (en)
Other versions
EP3652600A4 (fr)
Inventor
Leonid KOVTUN
Maximillian KOVTUN
Leonid RYZHENKO
Taras YERMAKOV
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Travelmate Robotics Inc
Original Assignee
Travelmate Robotics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US15/725,656 external-priority patent/US20180360177A1/en
Application filed by Travelmate Robotics Inc filed Critical Travelmate Robotics Inc
Publication of EP3652600A1 publication Critical patent/EP3652600A1/fr
Publication of EP3652600A4 publication Critical patent/EP3652600A4/fr
Withdrawn (current legal status)

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • A HUMAN NECESSITIES
    • A45 HAND OR TRAVELLING ARTICLES
    • A45C PURSES; LUGGAGE; HAND CARRIED BAGS
    • A45C15/00 Purses, bags, luggage or other receptacles covered by groups A45C1/00 - A45C11/00, combined with other objects or articles
    • A HUMAN NECESSITIES
    • A45 HAND OR TRAVELLING ARTICLES
    • A45C PURSES; LUGGAGE; HAND CARRIED BAGS
    • A45C5/00 Rigid or semi-rigid luggage
    • A45C5/03 Suitcases
    • A HUMAN NECESSITIES
    • A45 HAND OR TRAVELLING ARTICLES
    • A45C PURSES; LUGGAGE; HAND CARRIED BAGS
    • A45C5/00 Rigid or semi-rigid luggage
    • A45C5/14 Rigid or semi-rigid luggage with built-in rolling means
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/028 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106 Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones

Definitions

  • the present invention is generally directed to robotics, and pertains particularly to methods and apparatus for locating a moving target and following it while detecting and avoiding obstacles on its movement path.
  • Previous solutions for an autonomous robot system that locates and follows its user include tracking a target with a camera-vision system. Such a solution worked only when a single person was present in the area the camera covered, because a system that relies solely on visual images has difficulty differentiating the targeted user from a crowd of similarly dressed or similar-looking people. Additionally, tracking via camera or video requires large computing power and may raise security issues. Other solutions for following a target include sound tracking, heat sensors, RFID and Bluetooth technology. Tracking via sound is impractical because it requires constantly emitting a sound that the system program recognizes. Tracking via heat sensors becomes unreliable when the system is in an environment where multiple heat sources, e.g., more than one person, or animals, are within range.
  • RFID and remote control technology only works when the target is directly visible to the device.
  • the currently available solutions that use Bluetooth technology face three issues. First, a person's body can weaken and scatter a Bluetooth signal. Second, there is a very large amount of signal reflection from the Bluetooth device itself, and the signal depends heavily on the position of the source, such as a phone transmitting the Bluetooth signal. Third, whenever the Bluetooth device changes position, the signal changes all of its parameters, making it difficult to determine the speed of the system and the moving target, and the distance between them.
  • a system for identifying and following a moving electronic device includes an antenna for receiving transmitting signals; a plurality of sensors for distance measurement; a processor; and a memory in communication with the processor.
  • the memory storing instructions that, when executed by the processor, cause the processor to: determine a speed and a direction of the moving electronic device; adjust a movement path of the system based on the determined speed and direction of the moving electronic device; determine a distance between the moving electronic device and the system; command the system to follow the moving electronic device within a predetermined range of the distance; identify an obstacle in the movement path of the system; command the system to stop for a predetermined time period when the obstacle is identified; determine whether the obstacle is still in the movement path of the system after the predetermined time period; adjust the movement path of the system when determining the obstacle is still in the movement path of the system; and command the system to continue to follow the moving electronic device within the predetermined range of the distance when determining the obstacle is no longer in the movement path of the system.
  • a method for identifying and following a moving electronic device by a system includes: determining, by a processor, a speed and a direction of the moving electronic device; adjusting, by the processor, a movement path of the system based on the determined speed and direction of the moving electronic device; determining, by the processor, a distance between the moving electronic device and the system; commanding, by the processor, the system to follow the moving electronic device within a predetermined range of the distance; identifying, by the processor, an obstacle in a movement path of the system; commanding, by the processor, the system to stop for a predetermined time period when the obstacle is identified; determining, by the processor, whether the obstacle is still in the movement path of the system after the predetermined time period; adjusting, by the processor, the movement path of the system when determining the obstacle is still in the movement path; and commanding, by the processor, the system to continue to follow the moving electronic device within the predetermined range of the distance when detecting the obstacle is no longer in the movement path of the system.
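The claimed sequence above can be condensed into a single decision step. The function below is a simplified illustration, not the patented implementation; the action names and the follow range are invented for the example.

```python
def follow_decision(distance, obstacle_now, obstacle_after_wait,
                    follow_range=(0.5, 1.5)):
    """One pass through the claimed follow-and-avoid sequence (sketch).

    distance: metres between the system and the paired device;
    obstacle_now / obstacle_after_wait: whether an obstacle blocks the
    movement path before and after the predetermined pause.
    """
    if obstacle_now:
        if obstacle_after_wait:
            return "adjust_movement_path"   # obstacle persists: re-plan around it
        # obstacle gone after the pause: resume following below
    lo, hi = follow_range
    if distance > hi:
        return "speed_up"                   # fall back into the predetermined range
    if distance < lo:
        return "slow_down"                  # avoid overtaking the target
    return "hold_speed"                     # already within the predetermined range
```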
  • the processor Bluetooth-pairs with the moving electronic device and follows only the moving electronic device after the Bluetooth pairing.
  • the system includes a camera to perform object recognition to identify the obstacle and transmit an object recognition signal.
  • the command of the system to stop is based on a predetermined threshold of distance between the system and the obstacle.
  • the system includes an engine controller to control the movements of the system.
  • the command of the system to follow the moving electronic device within a predetermined range of the distance is based on a speed and a direction of the system.
  • the processor commands the system to increase its speed when the system is being physically pulled at a predetermined angle with respect to the ground.
  • the processor commands the system to rotate a plurality of omni wheels of the system by 180 degrees when the moving electronic device is at a predetermined threshold angle with respect to the system.
  • the system includes a joystick to control the movement of the system.
  • the system includes one or more of a suitcase, a bag, a cargo, a stroller, a carriage, and a container.
  • the system may further include a camera, a joystick and/or an engine controller.
  • Figure 1 is a front view of an autonomous robot system in accordance with some embodiments of the present invention
  • Figure 2 is a side view of an autonomous robot system in accordance with some embodiments of the present invention
  • Figure 3 is a back view of an autonomous robot system in accordance with some embodiments of the present invention.
  • Figure 4 illustrates a process flow for an autonomous robot system locating a target in accordance with some embodiments of the present invention
  • Figure 5 illustrates a process flow for an autonomous robot system moving towards a stationary target according to some embodiments of the present invention
  • Figure 6 illustrates a process flow for an autonomous robot system reacting to a discovered obstacle in accordance with some embodiments of the present invention
  • Figure 7 illustrates a process flow for an autonomous robot system following a moving target according to some embodiments of the present invention
  • Figure 8 illustrates a process flow for an autonomous robot system assisting the user according to some embodiments of the present invention.
  • An autonomous robot system, for example, a suitcase, a bag, a cargo, a stroller, a carriage, a container or similar items with wheels ("system"), locates a target, for example, an electronic device such as a smart phone, a laptop, or a notepad being carried by a user, and follows the target while it detects and avoids obstacles that are in its moving path.
  • the system wirelessly connects with the target, e.g., a handheld electronic device such as a smart phone, and exclusively "pair-up" with the target in order to follow the target as it moves.
  • the system navigates through large crowds recognizing and avoiding objects that are in its way while target path tracking.
  • the system is able to move through crowds and obstacles without requiring any extra peripherals.
  • the autonomous robot system includes omni wheels that allow for multi-directional movement, both vertical and horizontal, and better stability. While following the moving target, the system moves at a speed that is within a predetermined threshold of the target's moving speed.
  • the system patrols its environment using cameras or recorders.
  • the camera or recorder can be controlled remotely.
  • the system includes a location recognition application, e.g., a global positioning system ("GPS") chip, to orient and track its location.
  • the GPS chip is removable.
  • the system may include two additional GPS chips.
  • the system uses artificial intelligence (Al) and machine learning to optimize its movements.
  • the system may include integrated adaptive Al that recognizes its environment, e.g., on a flying aircraft, and adjusts its movement accordingly.
  • the system may include virtual reality (VR) and camera integration, which may be used to reconstruct images of the system's moving path.
  • the system may also include directional indicators, e.g., speakers for guiding visually impaired user.
  • FIG. 1 is a front view of an autonomous robot system in accordance with some embodiments of the disclosed invention, 100.
  • the basic components of the system 100 may include: directional antennas 102, 104, 106, 108; distance measuring sensors 110, 112; a processor and memory 114; and wheels 116, 118.
  • the system may include a User's Transmitter Detecting module, which includes specially designed directional antennas 102, 104, 106, 108, and Bluetooth Low Energy modules, which include algorithms for data processing.
  • the directional antennas 102, 104, 106, 108 detect its target by searching for the target's wireless signal transmitter, e.g., smartphone, smart watch, or electronic wrist bracelet.
  • the strength differences of the signals received by the directional antennas 102, 104, 106, 108 are utilized to determine the distance and angle of the target with respect to the system, e.g., the sectoral and differential method.
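The application does not publish the details of the sectoral and differential method, but the idea of comparing signal strengths across directional antennas can be sketched as a weighted circular mean. The antenna mounting bearings and the RSSI handling below are assumptions for illustration only.

```python
import math

# Hypothetical mounting bearings (degrees) of the four directional antennas.
ANTENNA_BEARINGS = [0.0, 90.0, 180.0, 270.0]

def estimate_bearing(rssi_dbm):
    """Estimate the target's bearing from the RSSI seen by each antenna.

    Each antenna's reading is treated as a weight on its mounting
    direction; the weighted circular mean gives the bearing estimate.
    """
    # Shift so the weakest reading weighs ~0 (RSSI is negative dBm).
    base = min(rssi_dbm)
    weights = [r - base for r in rssi_dbm]
    x = sum(w * math.cos(math.radians(b)) for w, b in zip(weights, ANTENNA_BEARINGS))
    y = sum(w * math.sin(math.radians(b)) for w, b in zip(weights, ANTENNA_BEARINGS))
    return math.degrees(math.atan2(y, x)) % 360.0
```

With the front antenna strongest, the estimate sits near 0°; with the right-hand antenna strongest, near 90°.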
  • the system may also use distance measuring sensors to detect its target.
  • Choices of the distance sensor include ultrasonic distance measuring sensor and/or laser distance measuring sensors. Additionally, the distance measuring sensors may also be used to detect obstacles, e.g., people, fixtures and buildings that are in the moving path of the system.
  • the system may locate its target when the target is in its vision range by its visual identification module.
  • the visual identification module may include a camera 120.
  • the system may include a module of visual identification and fixation of targeted human using human image recognition.
  • the module may include at least one camera 120 located at the center of the vehicle's top cover. The module outputs a radian degree from 0° to 170° and becomes activated when a human is within its vision range, set at a predetermined distance, e.g., 25 cm from the ground, and a predetermined angle, e.g., a 45° angle.
  • the module also includes an algorithm that processes the image of the targeted human to determine whether the human in range is the prospected target.
  • the camera 120 is equipped to function as a removable 360 degree virtual reality camera.
  • Figure 2 is a side view of an autonomous robot system in accordance with some embodiments of the disclosed invention, 200.
  • the system may include multiple ultrasonic distance measuring sensors 110, 112 located on the front of the system, and multiple laser distance measuring sensors 202, 204 on the top of the system.
  • the system may also include a biometric lock system 206, e.g., activated by fingerprint, facial or iris recognition.
  • the system may also include a mechanism for manual locks.
  • FIG. 3 is a back view of an autonomous robot system in accordance with some embodiments of the disclosed invention 300.
  • the system includes a notification module, such as a light indicator and/or a sound indicator; for example, the system may include an addressable LED RGB stripe 208.
  • the notification module may also include a speaker 302.
  • the LED RGB stripe 208 and the speaker 302 are configured to provide a variety of light patterns and sound effects.
  • the notification module need not be autonomous; it may be configured to be activated in various situations, such as when the system is activated from shut off; when an obstacle is detected in the system's moving path; when an obstacle is impossible to bypass or unavoidable, e.g., when a step is detected; upon connection breakage; upon entry into a turn; upon rotation around the system's axis; and upon an unexpected removal, e.g., when someone attempts to steal the system.
  • Power sources for the system include a battery, a solar panel and other means for providing long-lasting power; one example is a removable battery 304, which may be charged wirelessly.
  • the system also includes a decision making module, where a "decision" is the result of a sequential process by the system's "working components" (pipelines).
  • the decision making process may include receiving data pertaining to the system engine, e.g., from the odometer, and setting the primary moving speed and angle of the system.
  • the stages of the decision making may include identifying a target, e.g., a handheld electronic device, or a targeted person.
  • the system communicates with the electronic device, or utilizes facial recognition data, to obtain the target's location information, including angle and distance.
  • the electronic device for example, a smartphone, smart watch, wrist bracelet etc., is presumably the moving target.
  • the system also calculates the target's speed, corrects its angle of rotation based on the target's position, and sets its moving direction. If the system detects that it is too close to or too far from the target, e.g., it has completely lost connection with the user, the system may stop moving and send a notification to the electronic device which it is following.
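As a rough illustration of that speed and heading computation, the helper below works from two sampled positions of the target relative to the system. The coordinate convention and the names are invented for the sketch and are not the patent's internal data model.

```python
import math

def target_motion(prev_pos, curr_pos, dt):
    """Estimate the target's speed (m/s) and the heading the system
    should rotate toward (degrees), from two relative (x, y) samples
    taken dt seconds apart.
    """
    dx = curr_pos[0] - prev_pos[0]
    dy = curr_pos[1] - prev_pos[1]
    speed = math.hypot(dx, dy) / dt        # distance covered per second
    # Angle from the system toward the target's latest position.
    heading = math.degrees(math.atan2(curr_pos[1], curr_pos[0]))
    return speed, heading
```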
  • the system searches through a list of pre-qualified target/device to establish "pairing." For example, the system may search through a particular person's car, smart phone, smart watch and/or tablet, which may all be "pre-qualified” as a "target” that the system may follow. Once the initial pairing of the targeting device and the system is successful, the paired device is considered a trusted device, as well as the target. From this point on, the system will not pair with any other target unless it receives further command. The system and target connection ends when the system or the target device is turned off.
  • the system establishes the exclusive targeting by verifying the identification code exchange between the system and a server and the Bluetooth protocol of the target during initial connection. After the first activation of the system and establishing connection with the target, the system proceeds to a calibration process, chosen optionally by the user in the mobile application. Different types of signal transmitter are likely to have different receiving and transmitting antennas with different characteristics. To be compatible with all types of wireless signal transmitters, an initial calibration for each transmitter with respect to the system is required to level out the effect of different types of signal transmitters, for better accuracy in determining the distance and the angle of the system with respect to the target.
  • FIG. 4 illustrates a process flow for an autonomous robot system locating a target in accordance with some embodiments of the disclosed invention.
  • the target which the system detects and follows is presumably a wireless electronic device, e.g., smartphone, which is equipped with Bluetooth protocol to be paired with the system.
  • the pairing process can be activated by a user, e.g., via a mobile application that was previously downloaded on the wireless electronic device.
  • the system automatically searches for a device to establish "pairing." To prevent the system from following a wrong target, e.g., pairing with an undesirable smartphone, the targeted wireless electronic device is registered and verified.
  • When the system detects a wireless electronic device and determines that the device is a possible target in block 402, the system begins the verification process by determining whether the device has already been registered. Each registered target/device has a unique serial number; only the verified registered target/device can control and monitor the system.
  • a device verification process registers the serial number in a server, e.g., a remote user account saved in the cloud. As shown in block 410, the system first determines whether the serial number of the device is located on a server, then verifies whether the device associated with the serial number has already been registered in block 412.
  • the system then requires the user to confirm during initial connection in block 404, by seeking permission to allow the system to register the device for verification purposes in block 406. If the user grants permission, the system will register the device on the server and run the verification process in block 408. Examples of registration methods include using the email address and/or the phone number that is associated with the smartphone. User permissions include, for example, Bluetooth usage, access to GEO data, etc.
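The registration check in blocks 402-412 amounts to a lookup against server-side records. The record structure below stands in for the cloud user account described above and is an assumption made for the sketch.

```python
def verify_device(serial, server_records):
    """Return True if the device may control the system (sketch of blocks 402-412).

    server_records: mapping of serial number -> record dict with a
    'registered' flag, standing in for the remote user account.
    """
    record = server_records.get(serial)
    if record is None:
        return False                          # serial not found on the server (block 410)
    return record.get("registered", False)    # registration check (block 412)
```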
  • the mobile application dashboard is the main application control panel, with a set of functions to control and monitor the device. The mobile application may also include an option to link a particular system to the user's account to enhance protection against unauthorized access and theft.
  • the system detects the target location and follows the user while observing a set distance between itself and the user and maintaining optimum speed. As an option, indicating alarms, e.g., lights or sound, may come on when the system loses its connection with its target.
  • FIG. 5 illustrates a process flow for an autonomous robot system moving towards a stationary target, ("Find Me Process"), according to some embodiments of the disclosed invention.
  • the Find Me Process 500 begins after the system successfully verifies the target. While the target stands still at its location, the system autonomously travels towards the target until the distance between the system and the target is within a predetermined threshold.
  • the user activates the Find Me process using the mobile application in block 502.
  • the system moves at a predetermined constant speed.
  • the predetermined speed is set by the system user, and it can be changed via mobile application.
  • the system only uses data that it received from the directional antennas (e.g., directional antennas 102, 104, 106, 108 in Figure 1) for navigation.
  • the system receives data pertaining to the system engine from the odometer in block 506.
  • the system also receives data pertaining to the position of target in block 505.
  • the system uses both data sets to determine the angle and distance of the target with respect to the system in block 508.
  • the system compares its distance to the target with a predetermined threshold value in block 510 to determine whether to send a command to the system engine controller. If the distance to the target is greater than the predetermined threshold value, the system determines the angle and distance to the target in block 512, and sets a movement path which it will take to the target. On the path of traveling towards the target, the system may detect one or more obstacle(s) in block 514.
  • the system retrieves data pertaining to the position of the obstacle for adjusting its movement path in block 516 from a separate operational process in block 600, and sends a command to the system engine controller to adjust its movement accordingly. If the system does not detect any obstacle on the path of traveling towards the target, the system determines whether the target has moved since it first paired up with the system in block 520, by retrieving data pertaining to the position of the target and comparing the data with the previous target data in block 505. If the target has moved, the system receives target motion data in block 522, which includes the angle and distance to the target. The system analyzes the target motion data and determines the engine data in block 524. The command is sent to the system engine in block 526 to set the system's next movement by adjusting the system wheel(s)' rotation angle and rotation speed.
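The geometry in blocks 508-512 reduces to computing the target's angle and distance from its relative position and comparing the distance with the stop threshold. The (x, y) convention below is an assumption; the real system fuses antenna and odometer data rather than receiving coordinates directly.

```python
import math

def find_me_step(target_rel, stop_threshold=1.0):
    """One Find Me iteration: from the target's relative (x, y) position,
    return (angle_deg, distance, keep_moving). Geometry-only sketch of
    blocks 508-512; the threshold value is illustrative.
    """
    x, y = target_rel
    distance = math.hypot(x, y)                     # straight-line distance to target
    angle = math.degrees(math.atan2(y, x))          # bearing of the target
    keep_moving = distance > stop_threshold         # stop once within threshold
    return angle, distance, keep_moving
```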
  • the mode may be automatically turned off, and the operational motion managing component, responsible for the operating modes, enters another mode, for example, the "Sleep" mode, which stops the system from moving.
  • the system, while following the electronic device, comes to a stop and waits for a period of time for the obstacle to disappear, e.g., to be removed.
  • the wait time period is predetermined based on the specific environment where the system operates. If an insurmountable obstacle (a pit or a dead end) is detected, an alert, e.g., an alarm or a visual indication such as an LED light or notice, is generated and sent to the mobile application installed on the user's smartphone.
  • the system travels towards the target until its distance from the target is less than a threshold value.
  • the threshold value is generally an optimal distance, which the system maintains when following the target in motion and may be set by the user.
  • Figure 6 illustrates a process flow 600 for the autonomous robot system reacting to a discovered obstacle in accordance with some embodiments of the disclosed invention.
  • If the system discovers an obstacle in its movement path in block 602, the system stops and waits for a period of time in block 604. Oftentimes, the obstacle is a moving object or a person, which will move away within a short period of time.
  • the length of the wait time in block 604 is predetermined or user-selectable based on the specific environment where the system is presumed to be placed. For example, an airport is likely to have more temporary obstacles that "move away" quickly, e.g., people, than permanent obstacles such as a roadblock. Therefore, a user who is in an airport may select a shorter wait time as opposed to a user who is on the sidewalk of a street.
  • the system determines whether the obstacle has been removed from its movement path after the "pause" and continues its movement on the path towards the target in block 608. If the obstacle is still present, the system generates commands to the engine controller that adjust its movement by changing the turn angles and/or the speed of the wheels in block 610. Sometimes, an obstacle may be insurmountable, such as a wall, that even adjusting the wheels does not allow the system to move past in block 612. An insurmountable obstacle may also be an obstacle that cannot be bypassed by maneuvers available to the system's geometry, for example, when the system is on a path which requires it to go up or down stairs. In this case, the system stops and generates a notification to alert the user of the obstacle, e.g., via an alarm or a notice sent to the user's handheld device in block 614.
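The branching in blocks 604-614 can be summarized in one decision function; the action labels are illustrative, not part of the application.

```python
def obstacle_reaction(still_present, bypassable):
    """Pick the reaction after the pause in block 604 (simplified sketch).

    still_present: the obstacle remains after the wait.
    bypassable: maneuvers available to the system's geometry can get
    around it (False for, e.g., stairs or a wall).
    """
    if not still_present:
        return "resume_path"            # obstacle moved away (block 608)
    if bypassable:
        return "adjust_wheels"          # change turn angle/speed (block 610)
    return "stop_and_alert_user"        # insurmountable obstacle (block 614)
```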
  • Figure 7 illustrates a process flow for an autonomous robot system following a moving target, ("Follow Me Process"), according to some embodiments of the disclosed invention.
  • the Follow Me Process in block 700 begins after the system successfully verifies the target. While the target is moving, the system determines the initial distance between itself and the target, and accelerates towards the target until it is within an optimal distance from the target. The system maintains its speed according to the moving target to stay within the optimal distance. The system analyzes the data collected from all internal components, such as antennas, sensors and/or cameras, and external sources, including the data collected from the user's mobile application, to determine both the moving speed and/or angle of the target.
  • the Follow Me process utilizes Al and autonomous movement technology to determine the direction and speed of its movements based on the moving speed and/or angle of the target.
  • the Follow Me Process may be activated by the user using a mobile application in block 702.
  • the Follow Me Process begins by receiving odometer data in block 706 from the system engines.
  • the system collects data from the antenna(s) in block 701, the distance measuring sensors in block 703, and the camera in block 705 to determine the moving target's angle and distance with respect to its position in block 710.
  • the system utilizes the antenna data in block 701 to determine its own movement, specifically its angle and distance with respect to the moving target, and monitors its own driving speed.
  • the antenna(s) detect the direction of motion and the antenna(s) data is sent to the system where it determines the required angle of rotation of the wheels in its future movement in block 710.
  • the distance measuring sensors in block 703 detect the distance between the moving target and the system itself.
  • the sensor(s) data is sent to the system processor, where it determines the required driving speed of the wheels in its future movement in block 710. If the system is within a predetermined threshold distance, the wheels rotate at a constant speed to maintain the optimal distance. The wheels accelerate or slow down when the distance is too large or too small with respect to the threshold value, respectively.
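That constant-speed band with corrections outside it might look like the following proportional sketch; every constant is invented for the example.

```python
def wheel_speed(distance, optimal=1.0, band=0.2, cruise=1.0, gain=0.8):
    """Wheel speed (m/s) keeping the system near the optimal follow distance.

    Within +/- band of the optimal distance the wheels hold a constant
    cruise speed; outside the band they speed up (target pulling away)
    or slow down (target too close), proportionally to the error.
    """
    error = distance - optimal
    if abs(error) <= band:
        return cruise                       # hold constant speed in the band
    return max(0.0, cruise + gain * error)  # never command a negative speed
```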
  • the threshold value is the optimal distance between the system and moving target, which the system maintains while it follows the target.
  • the camera in block 705 identifies the target, and obtains information regarding the distance and angle with respect to the target.
  • the camera data is also sent to the system processor to be used in determining the next motion based on the distance and angle with respect to the target in block 710.
  • the system determines the approximate speed and the angle of the moving target in block 710, and sets a movement path in block 712, which it will take in order to follow the target based on the results of the analysis of data pertaining to the target in block 708, e.g., the target distance, target angle, and the odometer data in block 706.
  • the system may detect one or more obstacle(s) in block 714.
  • the system retrieves data pertaining to the position of the obstacle for adjusting its movement path in block 716 from a separate operational process, for example the process of Figure 6, and sends a command to the system engine controller to adjust the system's next movement accordingly; for example, a camera may identify an obstacle by object recognition.
  • the system retrieves target motion data pertaining to the position of the moving target, which includes the angle and distance of the target, in block 718, and determines the engine data by analyzing the target motion data in block 720.
  • the command to the system engine controller is sent in block 722 to set the system's next movement by adjusting the system wheel(s)' rotation angle and rotation speed.
  • the Follow Me process ends when the user terminates the target-following mode, or when the target stops traveling and the system reaches a predetermined distance from the target, e.g., the distance between the system and the target is less than 1 meter.
  • the system engine controller may directly control the motor drivers by generating a pulse width modulation ("PWM") control signal of the required duty cycle for the motor driver.
  • PWM pulse width modulation
  • the system engine controller calculates the required wheel speeds and rotation angle based on the speed and the angle between the system and its target.
  • the engine controller may determine to rotate the wheels backwards so that the system turns around immediately to follow the electronic device, based on a pre-determined threshold value of the angle between the system and its target, e.g., when the angle between the system and its target is 180 degrees.
  • the system process may also include a manual vehicle motion mode, which enables a user to control the movement of the system using a joystick in the mobile application.
  • the user may activate the joystick mode using the mobile application and operate the joystick in multiple sensitivity modes, e.g., Low, Mid, High.
  • the mobile application sends (x, y) coordinates in the range [0, 100] to the system processor; upon receiving the coordinates, the system calculates the wheel(s)' rotation angle and speed and sends commands to the system engines according to the calculation to control the movement of the wheel(s).
  • Figure 8 illustrates a process flow for an autonomous robot system assisting the user ("Assist Me Process"), according to some embodiments of the disclosed invention.
  • when the system reaches its target, it may be picked up or handled by the user directly; for example, the user may physically pull the system by a handle instead of letting the system follow the user while walking.
  • while the system is being physically pulled by the user, the system's engine may automatically increase its horsepower so that the user does not need to pull the entire weight of the system, which assists the user with moving a system that may be too heavy to maneuver.
  • the wheels rotate at an angle according to the direction in which the system is being pulled by the user, while moving at a speed determined by an algorithm based on the system's angle of inclination with respect to its movement path and the system's own weight.
  • the assistant mode may be activated by a user using the mobile application in block 802.
  • the system is equipped with a gyroscope in block 808 and an internal scale in block 809, and monitors the gyroscope data in block 804 to detect the angle of inclination in block 806.
  • the system determines whether the angle of inclination is outside a predetermined threshold value, e.g., whether the system is tilted towards the ground at an angle of 45 degrees, in block 810. If the angle between the system and the ground is within the predetermined threshold, the system determines a movement speed that corresponds to the angle and does not require the user to apply much pulling force, in block 812.
  • the system determines the engine data in block 814 and sends commands to the system engine controller in block 816 to set the system's next movement by adjusting the system wheel(s)' rotation angle and rotation speed.
  • the system includes a peripheral platform.
  • the user mobile application that controls the system may also include the user's registration, optional device verification, user permissions and control functions.
  • the target device verification may include registering and validating the device on a remote server and/or in the cloud.
  • the autonomous robot system may be fully integrated with other software applications to provide additional functions.
  • the system may be integrated with an application that provides travel suggestions, airport information, and airport gate information.
  • the autonomous robot system function may be continuously improved through machine learning.
  • the autonomous robot system automatically uploads its own movement data to the autonomous robot system application to perfect the system as operating time increases.
  • the self-learning feature may be disabled as an option.
  • the autonomous robot system may carry more items, e.g., another suitcase, on top of it while traveling autonomously in horizontal mode.
  • the autonomous robot system may include a built-in scale that measures the weight of its contents.
  • the autonomous robot system may include a display that displays its total weight.
  • the autonomous robot system may include a unique handle that turns into a portable desk, which may be used for laptops, books, documents and other things.
  • the autonomous robot system may include an easily accessible separate compartment for storage.
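The Follow Me flow above (blocks 705 through 722) can be sketched as a single control step that maps the measured target distance and angle to a wheel command. All names, thresholds, and scaling factors below are illustrative assumptions, not the patented implementation; the 1-meter following distance and the 180-degree turnaround angle echo the examples given in the description.

```python
# Illustrative sketch of one Follow Me control step (all constants assumed).
FOLLOW_DISTANCE_M = 1.0       # example "optimal distance"; under 1 m ends the mode
TURNAROUND_ANGLE_DEG = 180.0  # example threshold for reversing the wheels
MAX_STEER_DEG = 45.0          # assumed mechanical steering limit

def follow_step(distance_m: float, angle_deg: float) -> dict:
    """Map the target's distance and bearing to a wheel command."""
    if distance_m < FOLLOW_DISTANCE_M:
        # target reached: stop and let the user handle the system directly
        return {"speed": 0.0, "steer_deg": 0.0, "reverse": False}
    # speed grows with the gap beyond the desired following distance
    speed = min(1.0, (distance_m - FOLLOW_DISTANCE_M) * 0.5)
    # at (or past) the turnaround angle, reverse the wheels to turn in place
    reverse = abs(angle_deg) >= TURNAROUND_ANGLE_DEG
    steer = max(-MAX_STEER_DEG, min(MAX_STEER_DEG, angle_deg))
    return {"speed": speed, "steer_deg": steer, "reverse": reverse}
```

In practice the camera and odometer data (blocks 705 and 706) would feed `distance_m` and `angle_deg`, and obstacle data from block 716 would override the command before it reaches the engine controller.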
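The description notes that the engine controller drives the motor drivers by generating a PWM signal of the required duty cycle. A minimal mapping from a normalized speed to a duty cycle might look as follows; the minimum duty cycle needed to overcome motor stiction is an assumed parameter, not a value from the disclosure.

```python
def speed_to_duty_cycle(speed: float, min_duty: float = 0.15, max_duty: float = 1.0) -> float:
    """Convert a normalized speed in [0, 1] to a PWM duty cycle.

    min_duty is an assumed floor below which the motor would stall;
    speeds above 1 are clamped to the maximum duty cycle.
    """
    if speed <= 0.0:
        return 0.0  # 0% duty cycle: motor off
    return min_duty + (max_duty - min_duty) * min(speed, 1.0)
```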
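For the manual joystick mode, the mobile application sends (x, y) coordinates in the range [0, 100], from which the processor derives a rotation angle and speed. One plausible mapping is shown below; the function name, the (50, 50) rest position, and the sensitivity scaling for the Low/Mid/High modes are assumptions for illustration.

```python
import math

def joystick_to_wheels(x: float, y: float, sensitivity: float = 1.0) -> tuple:
    """Map joystick coordinates in [0, 100] to (speed, steering angle in degrees).

    (50, 50) is the assumed rest position; sensitivity models Low/Mid/High modes.
    """
    nx = (x - 50.0) / 50.0  # normalize to [-1, 1]
    ny = (y - 50.0) / 50.0
    speed = min(1.0, math.hypot(nx, ny)) * sensitivity
    angle = math.degrees(math.atan2(nx, ny))  # 0 = straight ahead, +90 = hard right
    return speed, angle
```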
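The Assist Me flow (blocks 802 through 816) picks a movement speed from the gyroscope's inclination angle and the internal scale's weight reading so that the user applies little pulling force. The formula below is a hypothetical illustration using the 45-degree threshold from the example; the 20 kg reference weight and the maximum assist speed are assumptions.

```python
INCLINE_THRESHOLD_DEG = 45.0  # example threshold from block 810

def assist_speed(incline_deg: float, weight_kg: float, max_speed: float = 1.5) -> float:
    """Return an assist speed (m/s) for a case pulled at incline_deg.

    Outside the threshold the system does not assist (block 810's out-of-range
    branch); within it, a steeper pull and a heavier case yield more motor
    assistance so the user applies little force (block 812).
    """
    if incline_deg > INCLINE_THRESHOLD_DEG:
        return 0.0
    tilt_factor = max(incline_deg, 0.0) / INCLINE_THRESHOLD_DEG
    weight_factor = min(weight_kg / 20.0, 1.0)  # assumed 20 kg reference weight
    return max_speed * tilt_factor * weight_factor
```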

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Purses, Travelling Bags, Baskets, Or Suitcases (AREA)

Abstract

The invention relates to a system for identifying and following a moving electronic device, the system comprising an antenna for receiving broadcast signals, a plurality of sensors for distance measurement, a processor, and a memory in communication with the processor. The memory stores instructions that, when executed by the processor, cause the processor to: determine a speed and a direction of the moving electronic device; adjust a movement path of the system based on the determined speed and direction of the moving electronic device; determine a distance between the moving electronic device and the system; and control the system to follow the moving electronic device within a predefined range of the distance while identifying and avoiding an obstacle in the system's movement path.
EP18831842.2A 2017-07-10 2018-07-10 Système robotique autonome Withdrawn EP3652600A4 (fr)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201762530744P 2017-07-10 2017-07-10
US15/725,656 US20180360177A1 (en) 2017-06-15 2017-10-05 Robotic suitcase
PCT/US2017/057319 WO2018231270A1 (fr) 2017-06-15 2017-10-19 Valise robotique
US201862651023P 2018-03-30 2018-03-30
PCT/US2018/041525 WO2019014277A1 (fr) 2017-07-10 2018-07-10 Système robotique autonome

Publications (2)

Publication Number Publication Date
EP3652600A1 true EP3652600A1 (fr) 2020-05-20
EP3652600A4 EP3652600A4 (fr) 2021-08-04

Family

ID=65002331

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18831842.2A Withdrawn EP3652600A4 (fr) 2017-07-10 2018-07-10 Système robotique autonome

Country Status (4)

Country Link
EP (1) EP3652600A4 (fr)
JP (1) JP2020527266A (fr)
CN (1) CN111201497A (fr)
WO (1) WO2019014277A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111504316B (zh) * 2019-01-30 2023-03-31 北京优位智停科技有限公司 一种进行车辆领航的方法及地面移动装置
JP7225262B2 (ja) 2020-02-26 2023-02-20 バイドゥドットコム タイムズ テクノロジー (ベイジン) カンパニー リミテッド 自動運転車の障害物回避に関する軌跡計画
CN111324134B (zh) * 2020-03-06 2023-04-28 陕西雷神智能装备有限公司 基于预设顺序射频标签的无人车巡线方法、系统和无人车
CN111352422B (zh) * 2020-03-06 2023-04-28 陕西雷神智能装备有限公司 基于自学习射频标签的无人车巡线方法、系统和无人车
CN111273672B (zh) * 2020-03-06 2023-04-28 陕西雷神智能装备有限公司 基于已知坐标射频标签的无人车巡线方法、系统和无人车
CN111966023B (zh) * 2020-08-28 2024-04-30 王旭飞 一种智能跟随方法、装置及电子设备
CN112487869A (zh) * 2020-11-06 2021-03-12 深圳优地科技有限公司 机器人路口通行方法、装置和智能设备
US20220382282A1 (en) * 2021-05-25 2022-12-01 Ubtech North America Research And Development Center Corp Mobility aid robot navigating method and mobility aid robot using the same

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4830452B2 (ja) * 2005-11-04 2011-12-07 カシオ計算機株式会社 前傾角度検出装置及び車輪駆動式移動体装置
US8948913B2 (en) * 2009-10-26 2015-02-03 Electronics And Telecommunications Research Institute Method and apparatus for navigating robot
US20140107868A1 (en) * 2012-10-15 2014-04-17 Mirko DiGiacomcantonio Self-propelled luggage
JP2014092861A (ja) * 2012-11-01 2014-05-19 Symtec Hozumi:Kk 追従台車システム
US20140277841A1 (en) * 2013-03-15 2014-09-18 Elizabeth Klicpera Motorized Luggage or Luggage Platform with Wired or Wireless Guidance and Distance Control
JP5915690B2 (ja) * 2014-04-18 2016-05-11 株式会社豊田自動織機 搬送補助装置
JP5792361B1 (ja) * 2014-06-25 2015-10-07 シャープ株式会社 自律移動装置
KR102285083B1 (ko) * 2015-03-02 2021-08-04 케빈 오도넬 전동식 러기지
CN104991560B (zh) * 2015-07-12 2018-08-14 仲恺农业工程学院 自主移动式智能机器人
SG11201801327QA (en) * 2015-08-19 2018-03-28 Cyberdyne Inc Autonomous mobile body and on-site operation management system
KR102477855B1 (ko) * 2015-10-16 2022-12-15 레밍스 엘엘씨 로봇 골프 캐디
CN105955267A (zh) * 2016-05-11 2016-09-21 上海慧流云计算科技有限公司 一种移动控制方法及系统
ES2607223B1 (es) * 2016-06-08 2017-10-24 Pablo VIDAL ROJAS Maleta autónoma
CN106155065A (zh) * 2016-09-28 2016-11-23 上海仙知机器人科技有限公司 一种机器人跟随方法及用于机器人跟随的设备

Also Published As

Publication number Publication date
JP2020527266A (ja) 2020-09-03
CN111201497A (zh) 2020-05-26
EP3652600A4 (fr) 2021-08-04
WO2019014277A1 (fr) 2019-01-17

Similar Documents

Publication Publication Date Title
US11160340B2 (en) Autonomous robot system
EP3652600A1 (fr) Système robotique autonome
CN106444763B (zh) 基于视觉传感器智能自动跟随方法、系统及行李箱
KR102254881B1 (ko) 이동 로봇 및 이동 로봇의 추종 설정 방법
US20200000193A1 (en) Smart luggage system
CN104049633B (zh) 一种随动控制方法、随动装置及随动系统
JP4871160B2 (ja) ロボットおよびその制御方法
US20170368691A1 (en) Mobile Robot Navigation
KR101783890B1 (ko) 이동 로봇 시스템
US10023151B2 (en) Autonomous vehicle security
Krejsa et al. Infrared beacons based localization of mobile robot
EP3919238A2 (fr) Robot mobile et son procédé de commande
CN105807775A (zh) 具有自主跟随避障功能的移动机器人
CN105911999A (zh) 具有自主跟随避障功能的移动行李箱及其使用方法
US20180360177A1 (en) Robotic suitcase
CN114115296B (zh) 一种重点区域智能巡检与预警系统及方法
KR20180080499A (ko) 공항용 로봇 및 그의 동작 방법
CN106647815A (zh) 一种基于多传感器信息融合的智能跟随机器人及控制方法
US20220032874A1 (en) Vehicle Traveling Control Method and Vehicle Traveling Control Device
US11635759B2 (en) Method of moving robot in administrator mode and robot of implementing method
US20210208595A1 (en) User recognition-based stroller robot and method for controlling the same
US20220055654A1 (en) Methods and Apparatus for User Interactions with Autonomous Vehicles
WO2018101962A1 (fr) Récipient de stockage autonome
KR102433859B1 (ko) 적외선을 이용하여 사용자를 추종하는 방법 및 장치
JP7019125B2 (ja) 自律走行台車

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200115

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20210706

RIC1 Information provided on ipc code assigned before grant

Ipc: G05D 1/02 20200101AFI20210630BHEP

Ipc: G05D 1/08 20060101ALI20210630BHEP

Ipc: G05D 1/10 20060101ALI20210630BHEP

Ipc: G05D 1/12 20060101ALI20210630BHEP

Ipc: G05D 3/12 20060101ALI20210630BHEP

Ipc: G01C 21/20 20060101ALI20210630BHEP

Ipc: G01C 21/34 20060101ALI20210630BHEP

Ipc: A63F 13/803 20140101ALI20210630BHEP

Ipc: A45C 5/03 20060101ALI20210630BHEP

Ipc: A45C 5/14 20060101ALI20210630BHEP

Ipc: A45C 15/00 20060101ALI20210630BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20220201