WO2020022622A1 - Method for controlling an artificial intelligence mobile robot - Google Patents
Method for controlling an artificial intelligence mobile robot
- Publication number
- WO2020022622A1 (PCT/KR2019/005470)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mobile robot
- cleaning
- space
- unit
- map
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L9/00—Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
- A47L9/28—Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
Definitions
- the present invention relates to a mobile robot and a control method thereof, and more particularly, to a mobile robot and a control method capable of providing a customized artificial intelligence-based service.
- Robots have been developed for industrial use and have been a part of factory automation. Recently, the field of application of robots has expanded further: medical robots, aerospace robots, and the like have been developed, and home robots that can be used in ordinary homes have also been made. Among these robots, a robot capable of traveling on its own power is called a mobile robot.
- A representative example of a mobile robot used at home is a robot cleaner, which is a device that cleans a corresponding area by suctioning dust and foreign matter while traveling around a certain area by itself.
- The mobile robot is capable of moving by itself and is thus free to move, and it is provided with a plurality of sensors for detecting obstacles while driving, so that it can travel while avoiding them.
- Home appliances and Internet of Things (IoT) devices constituting a network may transmit data from one device to another, and information about one device may be checked from another device.
- Voice recognition technology is being applied to various devices, and research on methods of controlling a mobile robot using voice recognition technology is increasing.
- Prior art document 1 (Korean Patent Publication No. 10-2012-0114670, published on October 17, 2012) discloses a robot cleaner that has a speech recognition unit and recognizes a user's speech signal to execute a corresponding control command.
- In the prior art, however, voice input is unidirectional from the user to the robot cleaner, so it remains merely an additional means of control alongside pressing a button or operating a remote controller. The speech recognition function is therefore limited in that it can hardly offer the user more than simple control, and it cannot provide functions and services beyond serving as an additional control input means.
- An object of the present invention is to provide a mobile robot that offers various information and services to the user.
- An object of the present invention is to provide a mobile robot, and a method of controlling the same, that increase user convenience by communicating with home appliances and Internet of Things (IoT) devices and controlling the mobile robot in conjunction with them.
- An object of the present invention is to provide a mobile robot and its control method capable of providing various services by mapping sensing data and information on space.
- An object of the present invention is to provide a mobile robot and a control method thereof capable of providing various services by mapping information on voice and space.
- In order to achieve the above objects, a control method of a mobile robot according to an aspect of the present invention includes: generating a map of a driving zone including a plurality of regions; mapping a plurality of units, each including one or more sensors, to specific regions among the plurality of regions; receiving sensing data from the plurality of units; and storing the received sensing data in association with the corresponding mapped regions.
- By storing sensing data in association with the map in this way, the sensing data of other devices can be mapped to information about the space, and various services can be provided on that basis.
- The mobile robot may actively provide information and recommend services, functions, and the like, thereby increasing the user's trust in, preference for, and utilization of the product.
- By communicating with home appliances and Internet of Things (IoT) devices and controlling the mobile robot in conjunction with them, user convenience can be increased.
- the mobile robot may provide various services by mapping sensing data and information on space.
- the mobile robot may provide various services by mapping the information about the voice and the space.
- the attributes of the plurality of regions within the driving zone may be recognized, and the result of the region attribute recognition may be conveniently used, thereby improving user convenience.
- an evolving user experience may be provided.
- FIG. 1 is a block diagram of a home appliance network system according to an embodiment of the present invention.
- FIG. 2 is a perspective view showing a mobile robot according to an embodiment of the present invention.
- FIG. 3 is a plan view of the mobile robot of FIG. 2.
- FIG. 4 is a side view of the mobile robot of FIG. 2.
- FIG. 5 is a block diagram showing a control relationship between major components of a mobile robot according to an embodiment of the present invention.
- FIG. 6 is a view referred to for describing learning using product data according to an embodiment of the present invention.
- FIG. 7 is an example of a simplified internal block diagram of a server according to an embodiment of the present invention.
- FIGS. 8 and 9 are views referred to for describing spatial recognition according to an embodiment of the present invention.
- FIG. 10 is an example of an Internet of Things (IoT) device.
- FIGS. 11A and 11B illustrate examples of using the IoT device of FIG. 10.
- FIG. 12 is a flowchart illustrating a control method of a mobile robot according to an embodiment of the present invention.
- FIGS. 13 to 17B are views referred to for describing a method for controlling a mobile robot according to an embodiment of the present invention.
- The suffixes "module" and "unit" for the components used in the following description are given merely for ease of preparing this specification and do not in themselves have a particular meaning or role. Therefore, "module" and "unit" may be used interchangeably.
- The mobile robot 100 refers to a robot that can move by itself using wheels or the like, and may be a home helper robot, a robot cleaner, or the like.
- a robot cleaner having a cleaning function among mobile robots will be described with reference to the drawings, but the present invention is not limited thereto.
- FIG. 1 is a block diagram of a home appliance network system according to an embodiment of the present invention.
- A home appliance network system may include home appliances that include a communication module and thus may communicate with other devices or the server 70, or connect to a network.
- The home appliances may include an air conditioning appliance 10 having a communication module, a cleaner 20, a refrigerator 31, a washing machine 32, and the like.
- The air conditioning appliance 10 may include at least one of an air conditioner 11, air cleaners 12 and 13, a humidifier 14, and a hood 15.
- the cleaner 20 may be a vacuum cleaner 21, a robot cleaner 22, or the like.
- The communication module included in the home appliances 10, 20, 31, and 32 may be a Wi-Fi communication module, but the present invention is not limited to this communication method.
- the home appliances 10, 20, 31, and 32 may include other types of communication modules or may include a plurality of communication modules.
- For example, the home appliances 10, 20, 31, and 32 may include an NFC module, a Zigbee communication module, a Bluetooth™ communication module, and the like.
- the home appliances 10, 20, 31, and 32 may be connected to a predetermined server 70 through a Wi-Fi communication module, and may support smart functions such as remote monitoring and remote control.
- the home appliance network system may include a mobile terminal 50 such as a smart phone and a tablet PC.
- the user may check information on the home appliances 10, 20, 31, and 32 in the home appliance network system or control the home appliances 10, 20, 31, and 32 through the portable terminal 50.
- the home appliance network system may include a plurality of Internet of Things (IoT) devices (not shown).
- the home appliance network system may include home appliances 10, 20, 31, and 32, portable terminal 50, and Internet of Things (IoT) devices.
- the home appliance network system is not limited to a communication scheme constituting a network.
- The home appliances 10, 20, 31, and 32, the portable terminal 50, and the Internet of Things (IoT) devices may be communicatively connected through a wired/wireless router 60.
- Devices in the home appliance network system may form a mesh topology in which they communicate with one another individually.
- The home appliances 10, 20, 31, and 32 in the home appliance network system may communicate with the server 70 or the mobile terminal 50 via the wired/wireless router 60.
- the home appliances 10, 20, 31, and 32 in the home appliance network system may communicate with the server 70 or the portable terminal 50 by Ethernet.
- FIG. 2 is a perspective view illustrating a mobile robot according to an embodiment of the present invention
- FIG. 3 is a plan view of the mobile robot of FIG. 2
- FIG. 4 is a side view of the mobile robot of FIG. 2.
- the mobile robot 100 may drive a certain area by itself.
- the mobile robot 100 may perform a function of cleaning the floor. Cleaning of the floor here includes suctioning dust (including foreign matter) from the floor or mopping the floor.
- the mobile robot 100 includes a main body 110.
- the main body 110 includes a cabinet forming an appearance.
- the mobile robot 100 may include a suction unit 130 and a dust container 140 provided in the main body 110.
- the mobile robot 100 includes an image acquisition unit 120 that detects information related to an environment around the mobile robot.
- the mobile robot 100 includes a driving unit 160 for moving the main body.
- the mobile robot 100 includes a control unit 181 for controlling the mobile robot 100.
- the controller 181 is provided in the main body 110.
- the driving unit 160 includes a wheel unit 111 for traveling of the mobile robot 100.
- the wheel unit 111 is provided in the main body 110.
- The mobile robot 100 may be moved forward, backward, left, and right, or rotated, by the wheel unit 111.
- As the controller controls the driving of the wheel unit 111, the mobile robot 100 may autonomously travel over the floor.
- the wheel unit 111 includes a main wheel 111a and a sub wheel 111b.
- the main wheels 111a are provided at both sides of the main body 110, and are configured to be rotatable in one direction or the other direction according to the control signal of the controller. Each main wheel 111a may be configured to be driven independently of each other. For example, each main wheel 111a may be driven by different motors.
- the sub wheel 111b supports the main body 110 together with the main wheel 111a, and is configured to assist driving of the mobile robot 100 by the main wheel 111a.
- the sub wheel 111b may also be provided in the suction unit 130 described later.
- the suction unit 130 may be disposed to protrude from the front side F of the main body 110.
- the suction unit 130 is provided to suck air containing dust.
- the suction unit 130 may have a form protruding from the front of the main body 110 to both left and right sides.
- the front end of the suction unit 130 may be disposed in a position spaced forward from one side of the main body 110.
- the left and right both ends of the suction unit 130 may be disposed at positions spaced apart from the main body 110 to the left and right sides.
- The main body 110 is formed in a circular shape, and as both rear ends of the suction unit 130 protrude from the main body 110 to the left and right sides, empty spaces, that is, gaps, may be formed between the main body 110 and the suction unit 130.
- The empty space is a space between the left and right ends of the main body 110 and the left and right ends of the suction unit 130, and has a shape recessed into the mobile robot 100.
- the suction unit 130 may be detachably coupled to the main body 110.
- the mop module (not shown) may be detachably coupled to the main body 110 in place of the separated suction unit 130.
- the image acquisition unit 120 may be disposed in the main body 110.
- the image acquisition unit 120 may be disposed in front of the main body 110.
- the image acquisition unit 120 may be disposed to overlap the suction unit 130 in the vertical direction of the main body 110.
- the image acquisition unit 120 may be disposed above the suction unit 130.
- the image acquisition unit 120 may detect an obstacle around the mobile robot 100.
- the image acquisition unit 120 may detect an obstacle or a feature in front of the suction unit 130 located in the front of the mobile robot 100 so as not to collide with the obstacle.
- the image acquisition unit 120 may further perform other sensing functions to be described later in addition to the sensing function.
- the main body 110 may be provided with a dust container accommodating part (not shown).
- A dust container 140, which separates and collects dust from the suctioned air, is detachably coupled to the dust container accommodating part.
- The dust container accommodating part may be formed at the rear side R of the main body 110. A part of the dust container 140 is accommodated in the dust container accommodating part, and the other part of the dust container 140 may protrude toward the rear side R of the main body 110.
- The dust container 140 has an inlet (not shown) through which air containing dust is introduced and an outlet (not shown) through which the air separated from the dust is discharged.
- When the dust container 140 is mounted in the dust container accommodating part, the inlet and the outlet of the dust container 140 communicate with a first opening (not shown) and a second opening (not shown) formed in the inner wall of the dust container accommodating part.
- a suction flow path for guiding air from the suction port of the suction unit 130 to the first opening is provided.
- An exhaust flow path is provided for guiding air from the second opening to an exhaust port (not shown) that opens toward the outside.
- Air containing dust introduced through the suction unit 130 flows into the dust container 140 through the intake passage inside the main body 110, and the air and the dust are separated from each other while passing through the filter or cyclone of the dust container 140. The dust is collected in the dust container 140, and the air is discharged from the dust container 140, passes through the exhaust flow path inside the main body 110, and is finally discharged to the outside through the exhaust port.
- FIG. 5 is a block diagram showing a control relationship between major components of a mobile robot according to an embodiment of the present invention.
- the mobile robot 100 includes a main body 110 and an image acquisition unit 120 that acquires an image around the main body 110.
- the mobile robot 100 includes a driving unit 160 for moving the main body 110.
- the driving unit 160 includes at least one wheel unit 111 for moving the main body 110.
- the driving unit 160 includes a driving motor (not shown) connected to the wheel unit 111 to rotate the wheel unit 111.
- the image acquisition unit 120 photographs a driving zone and may include a camera module.
- the camera module may include a digital camera.
- the digital camera includes at least one optical lens and an image sensor (eg, a CMOS image sensor) including a plurality of photodiodes (eg, pixels) formed by the light passing through the optical lens.
- The camera module may also include a digital signal processor (DSP) that forms an image based on the signals output from the photodiodes.
- the digital signal processor may generate not only a still image but also a moving image composed of frames composed of the still image.
- Multiple cameras may be installed for each part for photographing efficiency.
- The image captured by the camera may be used to recognize the kinds of matter, such as dust, hair, and flooring, present in the corresponding space, to determine whether to clean, or to check the cleaning time.
- the camera may photograph a situation of an obstacle or a cleaning area existing on the front of the moving direction of the mobile robot 100.
- The image acquisition unit 120 may acquire a plurality of images by continuously photographing the surroundings of the main body 110, and the acquired plurality of images may be stored in the storage unit 105.
- The mobile robot 100 may increase the accuracy of spatial recognition, location recognition, and obstacle recognition by using the plurality of images, or by selecting one or more effective images from the plurality of images.
- the mobile robot 100 may include a sensor unit 170 including sensors for sensing various data related to the operation and state of the mobile robot.
- the sensor unit 170 may include an obstacle detecting sensor detecting a front obstacle.
- the sensor unit 170 may further include a cliff detection sensor for detecting the presence of a cliff on the floor in the driving zone, and a lower camera sensor for obtaining an image of the floor.
- the obstacle detecting sensor may include an infrared sensor, an ultrasonic sensor, an RF sensor, a geomagnetic sensor, a position sensitive device (PSD) sensor, and the like.
- The position and type of the sensors included in the obstacle detecting sensor may vary depending on the type of mobile robot, and the obstacle detecting sensor may include a wider variety of sensors.
- The sensor unit 170 may further include a motion detection sensor that detects the motion of the mobile robot 100 according to the driving of the main body 110 and outputs motion information.
- a gyro sensor, a wheel sensor, an acceleration sensor, or the like may be used as the motion detection sensor.
- The gyro sensor detects the rotation direction and the rotation angle when the mobile robot 100 moves in a driving mode.
- the gyro sensor detects the angular velocity of the mobile robot 100 and outputs a voltage value proportional to the angular velocity.
- the controller 150 calculates the rotation direction and the rotation angle by using the voltage value output from the gyro sensor.
- the wheel sensor is connected to the wheel unit 111 to sense the number of revolutions of the wheel.
- the wheel sensor may be a rotary encoder.
- the acceleration sensor detects a change in the speed of the mobile robot 100, for example, a change in the mobile robot 100 due to start, stop, direction change, collision with an object, and the like.
- the acceleration sensor may be built in the controller 150 to detect a speed change of the mobile robot 100.
- The controller 150 may calculate a position change of the mobile robot 100 based on the motion information output from the motion detection sensor. The position calculated in this way is a relative position, corresponding to the absolute position obtained using image information.
- Through such relative position recognition, the mobile robot can improve the performance of position recognition based on image information and obstacle information.
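- As a rough illustration of this kind of relative position calculation, the sketch below dead-reckons the robot's pose from wheel-encoder ticks and a gyro heading. The tick count and wheel radius are hypothetical placeholders, not values from this disclosure.

```python
import math

# Hypothetical parameters; real values depend on the robot's hardware.
TICKS_PER_REV = 360          # encoder ticks per wheel revolution
WHEEL_RADIUS_M = 0.035       # wheel radius in meters

def update_pose(x, y, left_ticks, right_ticks, gyro_heading_rad):
    """Dead-reckon a new (x, y, heading) from encoder ticks and a gyro heading.

    Traveled distance is the average of both wheels' arc lengths; the heading
    comes from the gyro (integrated angular velocity), which is typically more
    reliable than differencing the two wheels.
    """
    meters_per_tick = 2.0 * math.pi * WHEEL_RADIUS_M / TICKS_PER_REV
    distance = (left_ticks + right_ticks) / 2.0 * meters_per_tick
    x += distance * math.cos(gyro_heading_rad)
    y += distance * math.sin(gyro_heading_rad)
    return x, y, gyro_heading_rad
```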
- the mobile robot 100 may include a power supply unit (not shown) for supplying power to the mobile robot by having a rechargeable battery.
- the power supply unit supplies driving power and operation power to each component of the mobile robot 100, and when the remaining power is insufficient, power may be supplied and charged from a charging stand (not shown).
- the mobile robot 100 may further include a battery detector (not shown) that detects a charging state of the battery and transmits a detection result to the controller 150.
- the battery is connected to the battery detector so that the battery remaining amount and the charging state are transmitted to the controller 150.
- the battery remaining amount may be displayed on the display 182 of the output unit 180.
- the mobile robot 100 includes an input unit 125 for inputting on / off or various commands.
- the input unit 125 may include a button, a dial, a touch screen, and the like.
- the input unit 125 may include a microphone for receiving a user's voice command. Through the input unit 125, various control commands necessary for the overall operation of the mobile robot 100 may be input.
- the mobile robot 100 may include an output unit 180 to display reservation information, a battery state, an operation mode, an operation state, an error state, etc. as an image or output a sound.
- the output unit 180 may include a sound output unit 181 for outputting an audio signal.
- The sound output unit 181 may output notification sounds, such as a warning sound or sounds indicating the operation mode, operation state, or error state, under the control of the controller 150.
- The sound output unit 181 may convert an electrical signal from the controller 150 into an audio signal and output it, and may include a speaker or the like for this purpose.
- the output unit 180 may further include a display 182 that displays reservation information, a battery state, an operation mode, an operation state, an error state, and the like as an image.
- the mobile robot 100 includes a controller 150 for processing and determining various information such as recognizing a current location, and a storage 105 for storing various data.
- the mobile robot 100 may further include a communication unit 190 for transmitting and receiving data with an external terminal.
- The external terminal includes an application for controlling the mobile robot 100; it displays a map of the driving zone to be cleaned by the mobile robot 100 upon execution of the application, and can designate a specific area on the map to be cleaned.
- Examples of the external terminal may include a remote controller, a PDA, a laptop, a smartphone, a tablet, and the like, having an application for setting a map.
- The external terminal may communicate with the mobile robot 100 to display the current location of the mobile robot together with the map, and information about a plurality of areas may be displayed. In addition, the external terminal updates and displays the position of the mobile robot as it travels.
- The controller 150 controls the image acquisition unit 120, the input unit 125, the driving unit 160, the suction unit 130, and the like constituting the mobile robot 100, thereby controlling the overall operation of the mobile robot 100.
- the controller 150 may process a voice input signal of the user received through the microphone of the input unit 125 and perform a voice recognition process.
- the mobile robot 100 may include a voice recognition module that performs voice recognition inside or outside the controller 150.
- simple voice recognition may be performed by the mobile robot 100 itself, and high-level voice recognition such as natural language processing may be performed by the server 70.
- the storage unit 105 records various types of information necessary for the control of the mobile robot 100 and may include a volatile or nonvolatile recording medium.
- The recording medium stores data that can be read by a microprocessor, and may include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, magnetic tape, a floppy disk, an optical data storage device, and the like.
- the storage unit 105 may store a map for the driving zone.
- the map may be input by an external terminal, a server, or the like, which may exchange information with the mobile robot 100 through wired or wireless communication, or may be generated by the mobile robot 100 by learning itself.
- the map may indicate the location of the rooms in the driving zone.
- the current position of the mobile robot 100 may be displayed on the map, and the current position of the mobile robot 100 on the map may be updated during the driving process.
- the external terminal stores the same map as the map stored in the storage unit 105.
- the storage unit 105 may store cleaning history information. Such cleaning history information may be generated every time cleaning is performed.
- The map of the driving zone stored in the storage unit 105 may be a navigation map used for driving during cleaning, a Simultaneous Localization and Mapping (SLAM) map used for location recognition, a learning map that stores information such as obstacles encountered and is used for learning-based cleaning, a global location map used for global location recognition, or an obstacle recognition map in which information about recognized obstacles is recorded.
- Maps may be stored and managed in the storage unit 105 separately by use as described above, but maps need not be clearly classified by use.
- a plurality of pieces of information may be stored in one map to be used for at least two purposes.
- the controller 150 may include a driving control module 151, a map generation module 152, a position recognition module 153, and an obstacle recognition module 154.
- the driving control module 151 controls the driving of the mobile robot 100, and controls the driving of the driving unit 160 according to the driving setting.
- The driving control module 151 may determine the driving path of the mobile robot 100 based on the operation of the driving unit 160. For example, the driving control module 151 may determine the current or past moving speed of the mobile robot 100, the distance traveled, and the like based on the rotational speed of the wheel unit 111, and may update the position of the mobile robot 100 on the map based on the driving information thus identified.
- the map generation module 152 may generate a map of the driving zone.
- the map generation module 152 may generate a map by processing an image acquired through the image acquisition unit 120. That is, a cleaning map corresponding to the cleaning area can be created.
- the map generation module 152 may recognize the global position by processing the image acquired through the image acquisition unit 120 at each position in association with the map.
- the position recognition module 153 estimates and recognizes a current position.
- The position recognition module 153 uses the image information from the image acquisition unit 120, in connection with the map generation module 152, to estimate and recognize the current position even when the position of the mobile robot 100 changes suddenly.
- The position recognition module 153 may also recognize the attributes of the current location; that is, it may recognize the space.
- The mobile robot 100 may recognize its position during continuous driving through the position recognition module 153, and may also learn the map and estimate the current position through the map generation module 152 and the obstacle recognition module 154, without the position recognition module 153.
- the image acquisition unit 120 acquires images around the mobile robot 100.
- an image acquired by the image acquisition unit 120 is defined as an 'acquisition image'.
- the acquired image includes various features such as lightings on the ceiling, edges, corners, blobs, and ridges.
- the map generation module 152 detects a feature from each of the acquired images, and calculates a descriptor based on each feature point.
- Based on the descriptor information obtained from the acquired image at each position, the map generation module 152 classifies at least one descriptor of each acquired image into a plurality of groups according to a predetermined sub-classification rule, and converts the descriptors included in the same group into lower representative descriptors according to a predetermined sub-representation rule.
- As another example, all descriptors collected from the acquired images in a predetermined area may be classified into a plurality of groups according to the predetermined sub-classification rule, and the descriptors included in the same group may each be converted into lower representative descriptors according to the predetermined sub-representation rule.
- the map generation module 152 can obtain the feature distribution of each location through this process.
- Each positional feature distribution can be represented by a histogram or an n-dimensional vector.
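- One common way to realize such a feature distribution is sketched below: a normalized histogram counting how many of a position's descriptors fall into each representative-descriptor group. The nearest-neighbor assignment is an assumption made for illustration; the patent does not specify the sub-classification rule.

```python
import numpy as np

def feature_distribution(descriptors, representatives):
    """Histogram of descriptors over lower representative descriptors.

    descriptors: (M, D) array of descriptors from one position's images.
    representatives: (K, D) array of group representatives (e.g., cluster
    centers produced by the sub-classification rule).
    Returns a normalized K-bin histogram, i.e., the feature distribution.
    """
    # Distance from every descriptor to every representative: (M, K)
    dists = np.linalg.norm(descriptors[:, None, :] - representatives[None, :, :],
                           axis=2)
    nearest = dists.argmin(axis=1)                  # group index per descriptor
    hist = np.bincount(nearest, minlength=len(representatives)).astype(float)
    return hist / max(hist.sum(), 1.0)              # normalize to a distribution
```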
- the map generation module 152 may estimate an unknown current position based on a descriptor calculated from each feature point without passing through a predetermined sub classification rule and a predetermined sub representative rule.
- When the current position of the mobile robot 100 is unknown due to a position jump or the like, the current position may be estimated based on previously stored data such as the descriptors or the lower representative descriptors.
- the mobile robot 100 obtains an acquired image through the image acquisition unit 120 at an unknown current position. Through the image, various features such as lightings on the ceiling, edges, corners, blobs, and ridges are identified.
- the position recognition module 153 detects features from the acquired image and calculates a descriptor.
- Based on at least one piece of descriptor information obtained from the image acquired at the unknown current position, the position recognition module 153 converts it, according to a predetermined lower conversion rule, into information (a lower recognition feature distribution) comparable with the position information to be compared (for example, the feature distribution of each position).
- Each positional feature distribution may be compared with the recognition feature distribution to calculate a similarity for each position. A similarity (probability) is calculated for each candidate position, and the position for which the greatest probability is calculated may be determined as the current position.
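- A minimal sketch of this comparison step follows; cosine similarity is used here purely for illustration, as the patent does not fix the similarity measure.

```python
import numpy as np

def estimate_position(recognition_hist, stored_hists):
    """Pick the stored position whose feature distribution is most similar.

    recognition_hist: feature distribution computed at the unknown position.
    stored_hists: dict mapping position id -> stored feature distribution.
    Returns (best_position_id, similarity score).
    """
    def cosine(a, b):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b) / denom if denom else 0.0

    scores = {pos: cosine(recognition_hist, h) for pos, h in stored_hists.items()}
    best = max(scores, key=scores.get)       # position with greatest probability
    return best, scores[best]
```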
- the controller 150 may distinguish a driving zone and generate a map composed of a plurality of regions, or recognize a current position of the main body 110 based on a pre-stored map.
- The controller 150 may transmit the generated map to an external terminal, a server, or the like through the communication unit 190. Likewise, when a map is received from an external terminal, a server, or the like, the controller 150 may store it in the storage unit 105.
- the map may be divided into a plurality of cleaning areas, and include a connection path connecting the plurality of areas, and may include information about obstacles in the area.
- the controller 150 determines whether the position on the map matches the current position of the mobile robot.
- the cleaning command may be input from a remote controller, an input unit, or an external terminal.
- If the position on the map does not match the current position of the mobile robot, the controller 150 recognizes and recovers the current position of the mobile robot 100, and then controls the driving unit 160 to move to the designated area based on the recovered position.
- The position recognition module 153 may analyze the acquired image input from the image acquisition unit 120 and estimate the current position based on the map.
- the obstacle recognition module 154 or the map generation module 152 may also recognize the current position in the same manner.
- the driving control module 151 calculates a driving route from the current position to the designated region and controls the driving unit 160 to move to the designated region.
- the driving control module 151 may divide the entire driving zone into a plurality of areas according to the received cleaning pattern information, and set at least one area as a designated area.
- the driving control module 151 may calculate the driving route according to the received cleaning pattern information, travel along the driving route, and perform cleaning.
- the controller 150 may store the cleaning record in the storage unit 105 when cleaning of the set designated area is completed.
- The controller 150 may transmit the operation state or cleaning state of the mobile robot 100 to the external terminal and the server at a predetermined cycle through the communication unit 190.
- the external terminal displays the location of the mobile robot along with the map on the screen of the running application based on the received data, and outputs information on the cleaning state.
- The mobile robot 100 moves in one direction until an obstacle or a wall surface is detected, and when the obstacle recognition module 154 recognizes the obstacle, a driving pattern, such as moving straight or rotating, may be determined according to the recognized attributes of the obstacle.
- the controller 150 may control to perform the avoidance driving in a different pattern based on the recognized property of the obstacle.
- the controller 150 may control to avoid driving in different patterns according to the properties of obstacles such as non-hazardous obstacles (general obstacles), dangerous obstacles, and movable obstacles.
- For a dangerous obstacle, the controller 150 may control the robot to bypass it while securing a longer safety distance.
- Depending on the recognized attributes, the controller 150 may control the robot to perform the avoidance driving corresponding to a general obstacle or the avoidance driving corresponding to a dangerous obstacle, and to travel accordingly.
- the mobile robot 100 may perform obstacle recognition and avoidance based on machine learning.
- The controller 150 may include an obstacle recognition module 154 that recognizes, in an input image, obstacles previously learned by machine learning, and a driving control module 151 that controls the driving of the driving unit 160 based on the attributes of the recognized obstacle.
- Although FIG. 5 illustrates an example in which the plurality of modules 151, 152, 153, and 154 are separately provided in the controller 150, the present invention is not limited thereto.
- the position recognition module 153 and the obstacle recognition module 154 may be integrated into one recognizer and constitute one recognition module 155.
- the recognizer may be trained using a learning technique such as machine learning, and the learned recognizer may recognize attributes of an area, an object, and the like by classifying data input thereafter.
- the map generation module 152, the position recognition module 153, and the obstacle recognition module 154 may be configured as one integrated module.
- Hereinafter, an embodiment in which the position recognition module 153 and the obstacle recognition module 154 are integrated as one recognizer and configured as one recognition module 155 is described; however, the description applies in the same manner even when the position recognition module 153 and the obstacle recognition module 154 are each provided separately.
- the mobile robot 100 may include a recognition module 155 in which attributes of objects and spaces are learned by machine learning.
- Machine learning means that a computer learns from data so that it can solve a problem on its own, without a person directly instructing it with the logic.
- Deep learning is a machine learning technique based on artificial neural networks (ANN).
- the artificial neural network may be implemented in software or in the form of hardware such as a chip.
- the recognition module 155 may include an artificial neural network (ANN) in the form of software or hardware in which properties of an object, such as an object of a space or an obstacle, are learned.
- The recognition module 155 may include a deep neural network (DNN) trained by deep learning, such as a convolutional neural network (CNN), a recurrent neural network (RNN), or a deep belief network (DBN).
- the recognition module 155 may determine an attribute of a space and an object included in the input image data based on weights among nodes included in the deep neural network DNN.
- the driving control module 151 may control the driving of the driving unit 160 based on the recognized space and the properties of the obstacle.
- the recognition module 155 may recognize attributes of spaces and obstacles included in the selected specific viewpoint image based on data previously learned by machine learning.
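- For concreteness, the sketch below shows a minimal CNN of the kind the recognition module 155 could embed, mapping an image to a room-attribute label with a confidence score. The network shape and the label set are assumptions for illustration; the patent does not disclose the actual architecture.

```python
import torch
import torch.nn as nn

# Hypothetical attribute classes; the patent does not fix a label set.
CLASSES = ["bedroom", "living_room", "kitchen", "unknown"]

class RoomAttributeCNN(nn.Module):
    """Tiny CNN classifier standing in for the learned DNN of module 155."""
    def __init__(self, num_classes=len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):                    # x: (B, 3, 64, 64) image batch
        return self.classifier(self.features(x).flatten(1))

model = RoomAttributeCNN().eval()
with torch.no_grad():
    logits = model(torch.randn(1, 3, 64, 64))        # one dummy 64x64 image
    confidence, idx = torch.softmax(logits, dim=1).max(dim=1)
    print(CLASSES[idx.item()], float(confidence))    # attribute + confidence
```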
- The storage unit 105 may store input data for determining the attributes of spaces and objects, and data for training the deep neural network (DNN).
- The storage unit 105 may store the original images obtained by the image acquisition unit 120 and extracted images in which predetermined regions are extracted.
- the storage 105 may store weights and biases forming the deep neural network (DNN).
- weights and biases constituting the deep neural network structure may be stored in an embedded memory of the recognition module 155.
- The recognition module 155 may perform a learning process using a predetermined image as training data whenever the image acquisition unit 120 acquires an image or extracts a partial region of an image, or may perform the learning process after a predetermined number or more of images have been acquired.
- the mobile robot 100 may receive data related to machine learning from the predetermined server through the communication unit 190.
- the mobile robot 100 may update the recognition module 155 based on data related to machine learning received from the predetermined server.
- FIG. 6 is a view referred to for describing learning using product data according to an embodiment of the present invention.
- product data obtained by the operation of a predetermined device such as a mobile robot 100 may be transmitted to the server 70.
- The mobile robot 100 may transmit space-related, object-related, and usage-related data to the server 70.
- Here, the space- and object-related data may be data related to the recognition of spaces and objects recognized by the mobile robot 100, or image data about spaces and objects obtained by the image acquisition unit 120.
- The usage-related data is data obtained according to the use of a predetermined product, for example, the mobile robot 100, and may include usage history data, sensing data obtained from the sensor unit 170, and the like.
- The controller 150 of the mobile robot 100, more specifically the recognition module 155, may be equipped with a deep neural network (DNN) structure such as a convolutional neural network (CNN).
- the learned deep neural network structure DNN may receive input data for recognition, recognize a property of an object and a space included in the input data, and output the result.
- The learned deep neural network structure may receive input data for recognition, and may analyze and learn the usage-related data of the mobile robot 100 to recognize usage patterns, usage environments, and the like.
- the space, object, and usage related data may be transmitted to the server 70 through the communication unit 190.
- The server 70 may train a deep neural network (DNN) structure using training data and thereby generate a configuration of learned weights.
- the server 70 may transmit the updated deep neural network (DNN) structure data to the mobile robot 100 to be updated.
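- On the robot side, applying such an update could look like the sketch below, which assumes the server serializes the learned weights with torch.save; the transport format is an assumption, not part of the disclosure.

```python
import io
import torch

def apply_server_update(model, dnn_bytes: bytes):
    """Load updated DNN weights received from the server into the local model.

    dnn_bytes: a serialized state_dict, assumed to have been produced with
    torch.save(model.state_dict(), buffer) on the server side.
    """
    state_dict = torch.load(io.BytesIO(dnn_bytes), map_location="cpu")
    model.load_state_dict(state_dict)
    model.eval()      # the robot only runs recognition; training is server-side
    return model
```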
- home appliance products such as mobile robot 100 may become smarter and provide an evolving user experience (UX) as they are used.
- FIG. 7 is an example of a simplified internal block diagram of a server according to an embodiment of the present invention.
- the server 70 may include a communication unit 720, a storage unit 730, a learning module 740, and a processor 710.
- the processor 710 may control the overall operation of the server 70.
- The server 70 may be a server operated by the manufacturer of a home appliance such as the mobile robot 100, a server operated by a service provider, or a kind of cloud server.
- The communication unit 720 may receive various data such as status information, operation information, and manipulation information from a portable terminal, a home appliance such as the mobile robot 100, a gateway, or the like.
- the communication unit 720 may transmit data corresponding to the received various information to a portable terminal, a home appliance such as the mobile robot 100, a gateway, or the like.
- the communication unit 720 may include one or more communication modules, such as an internet module and a mobile communication module.
- the storage unit 730 may store the received information and may include data for generating result information corresponding thereto.
- the storage unit 730 may store data used for machine learning, result data, and the like.
- the learning module 740 may serve as a learner of a home appliance such as the mobile robot 100.
- The learning module 740 may include an artificial neural network, for example, a deep neural network (DNN) such as a convolutional neural network (CNN), a recurrent neural network (RNN), or a deep belief network (DBN), and may train the deep neural network.
- The processor 710 may control the artificial neural network structure of a home appliance, such as the mobile robot 100, to be updated to the learned artificial neural network structure after learning, according to a setting.
- the learning module 740 may receive input data for recognition, recognize a property of an object and a space included in the input data, and output the result.
- the communication unit 720 may transmit the recognition result to the mobile robot 100.
- the learning module 740 may analyze and learn usage-related data of the mobile robot 100 to recognize a usage pattern, a usage environment, and the like, and output the result.
- the communication unit 720 may transmit the recognition result to the mobile robot 100.
- home appliance products such as the mobile robot 100 may receive a recognition result from the server 70 and operate by using the received recognition result.
- As the server 70 becomes smarter by learning from the product data, an evolving user experience (UX) can be provided as the home appliance products are used.
- the mobile robot 100 and the server 70 may also use external information.
- The mobile robot 100 and the server 70 can provide an excellent user experience by comprehensively using internal information, such as spatial information, object information, and usage patterns obtained from a specific home appliance product like the mobile robot 100, together with external information obtained from other products or from other service servers linked to the server 70.
- For example, washing may be performed by the washing machine 32 such that it finishes at the time the user arrives home.
- the server 70 may perform voice recognition by receiving a voice input signal spoken by a user.
- the server 70 may include a speech recognition module, and the speech recognition module may include an artificial neural network trained to perform speech recognition on input data and output a speech recognition result.
- the server 70 may include a voice recognition server for voice recognition.
- The voice recognition server may also include a plurality of servers that divide and share predetermined steps of the voice recognition process.
- For example, the speech recognition server may include an automatic speech recognition (ASR) server that receives speech data and converts the received speech data into text data, and a natural language processing (NLP) server that receives the text data from the automatic speech recognition server and analyzes the received text data to determine the voice command.
- The speech recognition server may further include a text-to-speech (TTS) server that converts the text-format speech recognition result output from the natural language processing server into speech data and transmits the speech data to another server or a home appliance.
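- The division of labor among the ASR, NLP, and TTS servers can be sketched as the pipeline below. All three functions are stubs standing in for remote services; the patent does not specify their interfaces.

```python
def asr(speech_data: bytes) -> str:
    """ASR server role: convert received speech data into text (stubbed)."""
    return "clean the living room"           # placeholder transcription

def nlp(text: str) -> dict:
    """NLP server role: analyze text data to determine the voice command (stubbed)."""
    action, _, area = text.partition(" the ")
    return {"action": action, "area": area}

def tts(text: str) -> bytes:
    """TTS server role: convert a text result back into speech data (stubbed)."""
    return text.encode("utf-8")              # placeholder audio bytes

command = nlp(asr(b"<speech frames>"))
response_audio = tts(f"Starting to {command['action']} the {command['area']}.")
```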
- the mobile robot 100 and / or the server 70 may perform voice recognition, so that a user voice may be used as an input for controlling the mobile robot 100.
- the mobile robot 100 may provide a variety of active control functions to the user by actively providing information or outputting a voice recommending a function or service.
- the mobile robot 100 and / or the server 70 may provide various services by mapping the sensing data of the other device disposed in the driving zone and information on the space.
- controller 150 and / or the learning module 740 may utilize a map generated by the mobile robot 100 while driving the driving zone.
- FIGS. 8 and 9 are views referred to for describing spatial recognition according to an embodiment of the present invention.
- FIG. 8 illustrates an example of region classification and map generation by the mobile robot according to an embodiment of the present invention.
- FIG. 9 shows an example of attribute recognition of a region.
- The mobile robot 100 may generate a map while traveling around the driving zone X1 by wall following.
- The map generation module 152 generates a map as shown in FIG. 8(c) by dividing the driving zone X1 into a plurality of regions A1' to A9'.
- the generated map is stored in the storage unit 105 and transmitted to an external terminal and a server through the communication unit 190.
- The map generation module 152 divides the driving zone X1 into small regions and large regions and generates the map accordingly.
- the terminal executes the application and displays the received map on the screen.
- the plurality of divided areas A1 to A9 are displayed differently.
- The plurality of regions A1 to A9 may each be displayed in a different color or with a different name.
- The mobile robot and the terminal store the same map, but a simplified user map, as shown in FIG. 8(c), is displayed on the terminal so that the user can easily recognize the regions.
- Driving and cleaning are performed based on the map shown in FIG. 8(b). Obstacles may also be displayed in the user map of FIG. 8(c).
- the map illustrated in (b) of FIG. 8 may be a SLAM map or a navigation map based on the SLAM map.
- The mobile robot 100 determines its current position based on the stored map, performs the designated cleaning when the current position matches the position on the map, and, when they do not match, recognizes and recovers the current position before cleaning. Therefore, wherever the mobile robot 100 is located among the plurality of regions A1 to A9, it can determine its current position, move to the designated region, and perform cleaning.
- the remote controller or the terminal may select at least one of the plurality of areas A1 to A9 and input a cleaning command to the mobile robot 100.
- Through the remote controller or the terminal, a part of any one region may be set as the cleaning area, or the cleaning area may be set by touching or dragging across a plurality of regions without distinguishing them.
- One of the regions may be set as a priority region, or a cleaning order may be set; after cleaning the priority region first, the robot may move to a nearby region and clean it.
- When a cleaning order is set for a plurality of designated regions, the mobile robot 100 moves and cleans in the designated order. When no separate order is specified for the plurality of cleaning regions, the mobile robot 100 may move to the region closest to its current position and perform cleaning.
- the mobile robot 100 may recognize the attributes of the plurality of areas A1 to A9 included in the travel area X1.
- the navigation map 900 may include a plurality of local maps LM1, LM2, LM3, LM4, LM5,...
- the navigation map may be divided into a plurality of regions, and each region may include one or more local maps.
- The local maps are set so as not to overlap one another.
- the local maps LM1, LM2, LM3, LM4, and LM5 may be set to any size as a unit map.
- the local maps LM1, LM2, LM3, LM4, and LM5 may be set in a square shape having a size of N by N based on the wall.
- the mobile robot 100 may acquire area information utilizing continuous image information and map information during movement.
- the mobile robot 100 may move while cleaning the home, and the image acquisition unit 120 may capture a plurality of images by photographing during the movement.
- The controller 150 may control the image acquisition unit 120 not to perform further photographing in a region corresponding to a local map for which N images photographed in different directions have already been obtained. Accordingly, once the amount of data required for attribute recognition is secured, no further images are acquired, preventing unnecessary computation, processing, and recognition.
- the controller 150 may control the image acquisition unit 120 to acquire an image in four or eight directions for each local map and not acquire additional images when all the images are acquired.
- Within a local map, the exact position of the mobile robot 100 has no significant influence; the photographing direction is what matters.
- The camera included in the image acquisition unit 120 photographs an area of a predetermined range according to its angle of view. Therefore, images captured in different directions from positions within a predetermined range, even if not from exactly the same position, can cover a range similar to that of rotating 360 degrees at a specific position and photographing, and analyzing them makes it possible to accurately recognize the attributes of the space.
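- A rough sketch of this per-local-map, per-direction bookkeeping follows. The cell size and the number of direction buckets are illustrative assumptions.

```python
import math
from collections import defaultdict

N_M = 2.0            # local map edge length in meters (hypothetical)
DIRECTIONS = 8       # images per local map (the text allows 4 or 8)

captured = defaultdict(set)   # local map cell -> direction buckets photographed

def local_map_cell(x_m, y_m):
    """Map a robot position to the index of its N-by-N local map cell."""
    return (int(x_m // N_M), int(y_m // N_M))

def direction_bucket(heading_rad):
    """Quantize the robot heading into one of DIRECTIONS buckets."""
    step = 2.0 * math.pi / DIRECTIONS
    return int((heading_rad % (2.0 * math.pi)) // step)

def should_photograph(x_m, y_m, heading_rad):
    """Photograph only if this direction is still missing for this local map."""
    cell, bucket = local_map_cell(x_m, y_m), direction_bucket(heading_rad)
    if bucket in captured[cell] or len(captured[cell]) >= DIRECTIONS:
        return False          # enough data secured; skip further photographing
    captured[cell].add(bucket)
    return True
```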
- The controller 150 may determine whether the region can be recognized as the plurality of images are acquired.
- the controller 150 may recognize an area corresponding to the local map based on the N images acquired from the local map.
- The controller 150 may recognize an object existing in the region corresponding to the local map based on the extracted images.
- the controller 150 may recognize an attribute of a predetermined region based on data previously learned by machine learning in the entire image.
- The controller 150 may recognize an object based on data previously learned by machine learning in at least a partial region of the image.
- the controller 150 may derive a final conclusion by integrating the object recognition results and the area recognition results of each local map and construct a hierarchical map including a plurality of semantic maps.
- the controller 150 may gradually generate a semantic map.
- The boundaries of the semantic maps may overlap one another.
- For example, the controller 150 may compose one semantic map from the first local map LM1, the second local map LM2, and the third local map LM3; another from the second local map LM2, the third local map LM3, and the fourth local map LM4; and another from the third local map LM3, the fourth local map LM4, and the fifth local map LM5.
- The controller 150 may determine the final attribute of the region corresponding to a semantic map based on at least one of the frequency of the recognition results of the constituent local maps, their confidence values, and the average of the confidence values.
- For example, in the semantic map composed of the first local map LM1, the second local map LM2, and the third local map LM3, the controller 150 may determine the attribute of the corresponding region to be a bedroom based on at least one of the frequency of the recognition results of LM1, LM2, and LM3, their confidence values, and the average of the confidence values.
- Likewise, in the semantic map composed of the third local map LM3, the fourth local map LM4, and the fifth local map LM5, the controller 150 may determine the attribute of the corresponding region to be a living room based on at least one of the frequency of the recognition results of LM3, LM4, and LM5, their confidence values, and the average of the confidence values.
- When the confidence values do not exceed a predetermined threshold, as in the semantic map composed of the second local map LM2, the third local map LM3, and the fourth local map LM4, the attribute may be treated as Unknown.
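- A minimal sketch of this fusion rule is given below: the most frequent attribute wins, ties are broken by average confidence, and a low average confidence yields Unknown. The threshold value is an illustrative assumption.

```python
CONFIDENCE_THRESHOLD = 0.6      # hypothetical cut-off

def semantic_map_attribute(local_results):
    """Fuse per-local-map results (attribute, confidence) into one attribute."""
    by_attr = {}
    for attribute, confidence in local_results:
        by_attr.setdefault(attribute, []).append(confidence)

    # Most frequent attribute; average confidence breaks ties.
    best = max(by_attr, key=lambda a: (len(by_attr[a]),
                                       sum(by_attr[a]) / len(by_attr[a])))
    avg_conf = sum(by_attr[best]) / len(by_attr[best])
    return best if avg_conf > CONFIDENCE_THRESHOLD else "Unknown"

# e.g., three local maps recognized as bedroom, bedroom, and library:
print(semantic_map_attribute([("bedroom", 0.8), ("bedroom", 0.7), ("library", 0.4)]))
```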
- FIG. 10 is an example of an IoT device, and illustrates a multipurpose sensor capable of performing various functions by including sensors therein.
- the multipurpose sensor 45 may include a first unit 601 and a second unit 602 detachably coupled to the first unit 601.
- the second unit 602 may be attached to a home appliance, a wall, a door, or the like.
- the second unit 602 may be attached to a home appliance, a wall, a door, or the like by an adhesive or a double-sided tape.
- the first unit 601 may be separated from the second unit 602 while the second unit 602 is fixed at a specific position. In addition, a portion of the first unit 601 may be inserted into the second unit 602.
- Various components such as various modules and batteries may be accommodated in the internal space of the multipurpose sensor 45.
- the first unit 601 may include a communication module (not shown) for communication.
- the communication module may be a Zigbee communication module, a Bluetooth communication module, or a Wi-Fi communication module.
- the multipurpose sensor 45 may communicate with the mobile robot 100, the server 70, the mobile terminal 50, and the like through a communication module.
- the first unit 601 may further include a sensing module (not shown), and the sensing module may include one or more sensors.
- the sensing module may include an acceleration sensor. Movement of the product or door to which the multipurpose sensor 45 is attached may be detected by the acceleration sensor.
- the sensing module may include a temperature and humidity sensor for sensing temperature and humidity. It is also possible that the sensing module includes only one of a temperature sensor and a humidity sensor or both sensors.
- the sensing module may include a proximity sensor.
- the proximity sensor may be, for example, an infrared sensor, and may detect the approach of an object to the multipurpose sensor 45, or a change in the distance between the product or door on which the multipurpose sensor 45 is installed and a surrounding structure.
- the multipurpose sensor 45 may be attached to a home appliance that does not have a communication module. Accordingly, home appliances without a communication module can also use various smart functions.
- the multipurpose sensor 45 may be used to give additional functionality to a particular home appliance.
- the multipurpose sensor 45 may be used to detect a state around the multipurpose sensor 45 regardless of the operation of the home appliance.
- the multipurpose sensor 45 may be attached to a home appliance, or may be attached to an indoor wall or a window or a door.
- FIGS. 11A and 11B illustrate an example of using the IoT device of FIG. 10, in which the multipurpose sensor 45 is attached to a door to detect the opening and closing of the door.
- FIG. 11A illustrates a state in which the door is open, and FIG. 11B illustrates a state in which the door is closed.
- the multipurpose sensor 45 may be attached to a door 850, such as a window or a hinged door, to perform a door-opening detection function.
- the reflector 851 may be attached to a position adjacent to the multipurpose sensor 45 attached to the door.
- the proximity sensor of the multipurpose sensor 45 may include a light emitting part and a light receiving part; the reflector 851 may reflect light emitted from the light emitting part, and the light receiving part may detect the light reflected from the reflector 851.
- the multipurpose sensor 45 may determine whether the door is open based on information detected by the proximity sensor.
- the multipurpose sensor 45 may transmit door opening notification information to the mobile robot 100, the server 70, the portable terminal 50, and the like.
- the multipurpose sensor 45 may transmit door closing notification information to the mobile robot 100, the server 70, the mobile terminal 50, and the like.
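- The reflection-based detection logic described above might look like the following sketch (illustrative names and threshold; the actual sensor firmware is not disclosed):

```python
REFLECT_THRESHOLD = 0.5  # assumed normalized light level separating the two states

def door_is_open(received_light_level):
    """A strong reflection means the reflector 851 still faces the sensor,
    i.e. the door is closed; a weak reflection means the door is open."""
    return received_light_level < REFLECT_THRESHOLD

def on_sample(light_level, last_state, notify):
    """Notify the mobile robot 100, server 70, or terminal 50 only when the
    door state changes between samples."""
    state = door_is_open(light_level)
    if state != last_state:
        notify("door_open" if state else "door_closed")
    return state

# door closed (strong reflection), then opened (reflection lost):
state = on_sample(0.9, last_state=False, notify=print)  # no change, prints nothing
state = on_sample(0.1, last_state=state, notify=print)  # prints "door_open"
```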
- the data sensed by the multipurpose sensor 45 may be transferred to the mobile robot 100, the server 70, the portable terminal 50, and the like, and stored and managed there.
- the mobile robot 100, the server 70, the mobile terminal 50, and the like may store this data in association with sensing data received from other devices.
- the mobile robot 100, the server 70, the mobile terminal 50, and the like may learn using data received from the multipurpose sensor 45.
- FIG. 12 is a flowchart illustrating a control method of a mobile robot according to an embodiment of the present invention.
- the mobile robot 100 may generate a map for a driving area including a plurality of regions (S1210).
- the mobile robot 100 may travel a driving zone and generate a map for the driving zone.
- the mobile robot 100 may distinguish a plurality of regions included in the map. More preferably, the mobile robot 100 may recognize the attributes of the plurality of regions.
- the mobile robot 100 may map a plurality of units including one or more sensors to a specific area among the plurality of areas.
- the controller 150 may map a predetermined area and a unit disposed in the predetermined area.
- the unit may be a sensor that senses one or more pieces of information, or may be an IoT device or a home appliance that includes one or more sensors.
- the unit may be the multipurpose sensor 45 illustrated in FIG. 10.
- the multipurpose sensor 45 may include a temperature sensor, a humidity sensor, and the like to sense various data.
- the unit may be a home appliance such as the air conditioner 11 and the air cleaners 12 and 13 illustrated in FIG. 1.
- Home appliances are often equipped with multiple sensors, depending on features and specifications.
- the air conditioner 11 and the air cleaners 12 and 13 may include sensors related to air quality such as dust sensors.
- the air conditioner 11 may include a temperature sensor, a humidity sensor, and the like.
- the mapping of units to spaces (S1220) may be performed manually. For example, after the user loads the map into the mobile terminal 50, the user may directly input unit information into a plurality of areas included in the map to map the units and spaces.
- the mobile robot 100 or the server 70 may automatically map units and spaces.
- the controller 150 and / or the learning module 740 may identify a unit disposed in a predetermined region based on the image acquired through the image acquisition unit 120.
- the controller 150 and / or the learning module 740 may map the identified unit to the predetermined area.
- when the signal strength of a signal transmitted by the identified unit and received through the communication unit 190 in the region is greater than or equal to a reference value, the controller 150 and/or the learning module 740 may map the identified unit to the predetermined area.
- the controller 150 and / or the learning module 740 may map a unit having a signal strength greater than or equal to a reference value received through the communication unit 190 at a specific location to an area including the specific location.
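- A minimal sketch of this signal-strength criterion, assuming a reference value of -60 dBm (the text only requires "greater than or equal to a reference value"):

```python
RSSI_REFERENCE_DBM = -60  # assumed reference value; set per sensor in practice

def map_units_to_area(area_id, scans):
    """scans: (unit_id, rssi_dbm) pairs measured through the communication
    unit while the robot is inside the area. A unit is mapped to the area
    only when its signal strength meets the reference value, which filters
    out units whose signal merely leaks through a wall."""
    return {
        unit_id: area_id
        for unit_id, rssi in scans
        if rssi >= RSSI_REFERENCE_DBM
    }

# e.g. while driving area "B":
print(map_units_to_area("B", [("dehumidifier_16", -48), ("air_cleaner_13", -75)]))
```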
- the mobile robot 100 may receive sensing data from the plurality of units through the communication unit 190 in operation S1230. Sensing data of the units may be transferred to the mobile robot 100 via the server 70. In some cases, the server 70 may also receive sensing data from a plurality of units.
- the mobile robot 100 may store the received sensing data in association with an area mapped to the corresponding sensor (S1240).
- the server 70 may store the received sensing data in association with an area mapped to the corresponding sensor (S1240).
- the controller 150 may control the sensing data to be stored in the storage unit 105 by mapping the sensing data to an area where the corresponding device is disposed.
- the sensing data can be used more accurately and conveniently.
- the user pattern in the driving zone may be determined based on the stored sensing data for each region (S1250).
- the user pattern may include a space usage pattern including usage time information for a predetermined space.
- the user pattern may include information on the usage time of a specific unit in a specific space, movement patterns in a specific space, and the electronic devices and furniture arranged in a specific space.
- the controller 150 and / or the learning module 740 may determine the space usage pattern in the driving zone based on the stored sensing data for each region (S1250).
- the controller 150 and/or the learning module 740 may detect user patterns, such as the user's time at home, bedtime, outing times, and vacation periods, based on the various sensing data of units distributed across the plurality of areas.
- the space usage pattern may include usage time information for each region. That is, it is possible to determine, for each area, the times at which one or more users are present.
- the controller 150 and/or the learning module 740 may store and manage sensing data such as dust, temperature, and humidity for each region. Accordingly, the cleanliness, degree of pollution, and pollution occurrence time of each region can be determined.
- the controller 150 and/or the learning module 740 may analyze the correlation between the user's space usage pattern and the space cleanliness.
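- The per-region storage and analysis just described could be organized as in the following sketch (hypothetical class, with a simple dust-jump heuristic standing in for the unspecified pollution-detection rule):

```python
from collections import defaultdict

class RegionSensingStore:
    """Keeps sensing records keyed by the region mapped to the reporting
    unit (S1240), so dust, temperature, and humidity can be analyzed per
    region afterwards."""

    def __init__(self, unit_to_region):
        self.unit_to_region = unit_to_region  # result of the mapping step (S1220)
        self.records = defaultdict(list)      # region -> [(timestamp, kind, value)]

    def store(self, unit_id, timestamp, kind, value):
        self.records[self.unit_to_region[unit_id]].append((timestamp, kind, value))

    def contamination_times(self, region, dust_jump=10.0):
        """Timestamps where the dust reading rose by more than dust_jump over
        the previous sample: a stand-in for the per-region 'pollution
        occurrence time' described above."""
        dust = [(t, v) for t, kind, v in self.records[region] if kind == "dust"]
        return [cur_t for (_, prev_v), (cur_t, cur_v)
                in zip(dust, dust[1:]) if cur_v - prev_v > dust_jump]
```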
- the mobile robot 100 may transmit data about the space usage pattern to the mobile terminal 50 through the communication unit 190.
- the controller 150 and/or the learning module 740 may establish a cleaning plan based on the determined space usage pattern (S1260), and may control the established cleaning plan information to be provided to the user (S1270).
- This cleaning plan may be a new cleaning plan.
- the cleaning plan may be one in which the cleaning schedule set after the user purchased the mobile robot 100 is adjusted according to the surrounding environment and the life pattern of the user.
- the cleaning plan may include an optimal cleaning time for the travel zone.
- the controller 150 and / or the learning module 740 may calculate an optimal cleaning time for the driving zone based on the space usage pattern. For example, the controller 150 and / or the learning module 740 may calculate the time when the user is not at home as an optimal cleaning time.
- the controller 150 and/or the learning module 740 may calculate an optimal cleaning time for at least one of the driving zones based on the space usage pattern and the sensing data of the corresponding area.
- the controller 150 and/or the learning module 740 may determine the occurrence or increase of contamination based on dust amount data, and may calculate, as the optimal cleaning time, the time closest to the contamination occurrence or increase time among the times when the user is not at home.
- the controller 150 and / or the learning module 740 may calculate an optimal cleaning time for at least one of the plurality of areas. That is, the controller 150 and / or the learning module 740 may calculate the optimal cleaning time for each region.
- the controller 150 and/or the learning module 740 may determine, as the optimal cleaning time, the time of day or week during which a predetermined area is used least. For example, if room A is rarely used during the weekday daytime, the controller 150 and/or the learning module 740 may set the weekday daytime as the optimal cleaning time for room A.
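- A sketch combining the two criteria above for picking the optimal cleaning time (hypothetical function and hour-of-day representation):

```python
def optimal_cleaning_time(absent_hours, contamination_hours, usage_by_hour=None):
    """absent_hours: hours (0-23) when the user is out; contamination_hours:
    hours when a dust occurrence/increase was observed; usage_by_hour:
    optional {hour: usage count}. Implements the two criteria from the text."""
    if not absent_hours:
        return None
    if contamination_hours:
        # Closest absent hour to a contamination occurrence/increase time.
        return min(absent_hours,
                   key=lambda h: min(abs(h - c) for c in contamination_hours))
    if usage_by_hour:
        # Fall back to the least-used hour of the day.
        return min(absent_hours, key=lambda h: usage_by_hour.get(h, 0))
    return absent_hours[0]

# user out 9 a.m.-6 p.m., dust spike observed at 8 a.m. -> clean at 9 a.m.:
print(optimal_cleaning_time(list(range(9, 19)), [8]))
```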
- the cleaning plan may include a cleaning mode setting. That is, the cleaning plan may include at least one of an optimum cleaning time and a cleaning mode setting.
- the controller 150 and/or the learning module 740 may determine an optimal cleaning mode for the driving zone based on the space usage pattern. In this case, the controller 150 and/or the learning module 740 may determine an optimal cleaning mode for at least one of the plurality of areas based on the space usage pattern and the sensing data of the corresponding area.
- the mobile robot 100 may establish a cleaning schedule for at least one of the plurality of areas based on the space use pattern (S1260).
- the cleaning schedule may include a cleaning time for a predetermined area and a cleaning mode setting at the cleaning time.
- the mobile robot 100 may transmit data about the established cleaning schedule to the mobile terminal 50 through the communication unit 190.
- the cleaning plan information may be output through the output unit 180 of the mobile robot 100. According to an embodiment, the cleaning plan information may be transmitted to the mobile terminal 50, and the user may check the cleaning plan proposed by the mobile robot 100 or the server 70 by manipulating the mobile terminal 50.
- the mobile robot 100 may perform cleaning on the driving zone according to the established cleaning plan (S1280).
- the mobile robot 100 may actively establish a cleaning plan by itself and perform cleaning according to the established cleaning plan.
- the mobile robot 100 may perform cleaning according to a user command after user confirmation of an established cleaning plan.
- according to the present invention, the optimal cleaning schedule and pattern can be automatically set or changed based on the surrounding environment and user pattern information collected from sensors disposed at various places and from units including such sensors.
- the mobile robot 100 may continuously receive sensing data from the units to continuously monitor changes in the surrounding environment and the user pattern. If the specific information is changed, the controller 150 may reflect the change in the cleaning schedule and the pattern.
- the mobile robot 100 may predict a region requiring cleaning, determine a time when no one is present, and establish and execute a cleaning plan by itself.
- the mobile robot 100 may measure the increase in collected dust for each region.
- the mobile robot 100 may measure the amount of dust on its own or receive the amount of dust from a unit disposed in each area.
- the mobile robot 100 may measure the dust collection increase for each divided room area, and organize the cleaning areas and the cleaning schedule on a daily basis based on the predicted dust increase.
- the mobile robot 100 may determine the time when there are no people in the house by day through the microphone of the input unit 125.
- when the predicted time arrives in the predicted area, the mobile robot 100 may actively utter a cleaning-related guide message, such as "I'll start cleaning now. If it is too noisy, please tell me to clean next time," through the sound output unit 181.
- if the user responds accordingly, the mobile robot 100 may stop cleaning, classify that time as a prohibited time, and reschedule to the next predicted time. Accordingly, by identifying the times at which the user does and does not want cleaning, a cleaning schedule better suited to the user's pattern can be established.
- the cleaning plan can be actively established and cleaning can be performed.
- the mobile robot 100 may obtain voice data spoken by a plurality of users in a driving area including a plurality of areas.
- the mobile robot 100 may obtain voice data spoken by a plurality of users through a microphone while driving the driving zone.
- the mobile robot 100 may receive and acquire voice data spoken by a plurality of users from at least some of the units.
- the mobile robot 100 may classify the voice data by user, either by itself or by using the server 70, and may map the classified per-user voice data to one or more of the plurality of areas.
- the controller 150 may sort the classified per-user voice data by the area in which it was obtained, and map users to areas according to each user's speech frequency.
- for example, if user A speaks most frequently in room A, the controller 150 may map user A to room A.
- the controller 150 may recognize attributes of the plurality of areas. In this case, the controller 150 may map one or more users of the plurality of users to an area recognized as a dedicated space among the plurality of areas, and may not map a user to an area recognized as a common space.
- for example, a couple (two persons) or a child (one person) may be mapped to an area recognized as a dedicated space, such as a room, while no user may be mapped to an area recognized as a common space, such as a kitchen.
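- A sketch of this speech-frequency mapping (illustrative function; classifying voice samples into users is assumed to have happened upstream):

```python
from collections import Counter, defaultdict

def map_users_to_rooms(utterances, dedicated_rooms):
    """utterances: (user_id, room_id) pairs from per-user-classified voice data.
    Maps each user to the dedicated room where they speak most frequently;
    rooms recognized as common spaces (kitchen, etc.) get no user mapped."""
    per_user = defaultdict(Counter)
    for user, room in utterances:
        if room in dedicated_rooms:
            per_user[user][room] += 1
    return {user: rooms.most_common(1)[0][0] for user, rooms in per_user.items()}

# e.g. user A talks mostly in room A -> mapped to room A:
samples = [("A", "roomA"), ("A", "roomA"), ("A", "kitchen"), ("B", "roomB")]
print(map_users_to_rooms(samples, dedicated_rooms={"roomA", "roomB"}))
```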
- the mobile robot 100 may combine the space state recognition results and the sensing data received from the units to create a cleaning plan, and may provide an analysis report by area such as the following.
- Room B: 1 user, low dust, short user occupancy; fast cleaning using zigzag cleaning mode
- for example, the mobile robot 100 may propose and execute a cleaning plan for room A in which a meticulous cleaning mode is performed after the user leaves for work, and a smart turbo cleaning mode with variable suction power is performed before the user returns from work.
- the mobile robot 100 may perform cleaning according to the established cleaning plan, reflect user feedback when it is given, and continuously modify the cleaning plan when the user's space usage pattern changes.
- FIGS. 13 to 17B are views referred to for describing a method of controlling a mobile robot according to an embodiment of the present invention.
- the mobile robot 100 may generate a map and recognize the indoor space through a spatial recognition technique (position estimation and map generation) such as SLAM (Simultaneous Localization And Mapping) (S1210).
- units such as sensors capable of acquiring indoor environment data and home appliances including the sensors may be mapped to areas A, B, C, and D in which the units are disposed (S1220).
- the air cleaner 13 may have a PM 1.0 ultra-fine dust sensor
- the dehumidifier 16 may have a humidity sensor
- the air conditioner 11 may have a temperature sensor
- the multipurpose sensor 35 may have a temperature / humidity sensor.
- a home appliance capable of transmitting sensor information and an IoT sensor such as the smart ThinkQ sensor (multipurpose sensor 35) may correspond to the unit.
- the air cleaner 13 may be mapped to the space A, the dehumidifier 16 to the space B, the multipurpose sensor 35 to the space C, and the air conditioner 11 to the space D.
- the mobile robot 100 may recognize a sensor by image-based object detection and by sensor signal strength within a recognized space unit, and map the sensor to the corresponding space.
- the humidity sensor data of the dehumidifier 16 may be mapped to the space B.
- the mobile robot 100 may map the humidity / temperature sensor data of the multipurpose sensor 35 with the space C.
- the threshold may differ for each sensor, may be set higher than the signal intensity transmitted through a wall, and may be set differently for each space based on the recognized distance within the space.
- the mobile robot 100 may landmark the mapped sensor and home appliance information in association with the spatial map created by the SLAM.
- the mobile robot 100 may store the received sensing data in association with the mapped spatial information.
- when a sensor, or a home appliance including a sensor, moves or is absent, the mobile robot 100 and/or the server 70 may predict and infer the environment of the space based on the most recently acquired data or on other sensor data.
- for example, if the dehumidifier 16 disposed in the space B is moved, the environment of the space B may be predicted and inferred based on the environmental data currently acquired through other sensors and the data from the most recent specific period.
- the mobile robot 100 may correct the landmark, linked to the spatial map created by SLAM, to the new position of the moved home appliance, such as the dehumidifier 16.
- the mobile robot 100 may recognize a home appliance or an IoT sensor by image or by sensor signal strength, map the corresponding space and sensor, and mark the location of the sensor on the spatial map constructed by the mobile robot 100.
- the data obtained from the sensor may be periodically transmitted to the server 70 and the mobile robot 100 using Wi-Fi, Bluetooth (BT) communication, and the like.
- the multipurpose sensor 45 may be attached to the door to detect the opening and closing of the door.
- FIG. 14 illustrates an example of detecting the opening and closing of each door by attaching the multipurpose sensor 45 to various doors in the home.
- the multipurpose sensor 45 of the space C serves to detect the temperature and humidity of the space C, while the remaining multipurpose sensors 45 detect the opening and closing of the doors to which they are attached.
- a multipurpose sensor 45 attached to a window may likewise detect the opening and closing of the window.
- the user pattern may be determined based on the sensing data of the units (S1250).
- the user pattern may be a home usage pattern of one or more users, such as the times when one or more users are at home, bedtimes, outing times, and vacations.
- the user pattern may be determined based on the sensing data of the units and the sensing data of the mobile robot 100 itself.
- the present invention may fuse a plurality of sensing data to determine a user pattern.
- the user pattern may be detected from the multipurpose sensors 45 attached to doors and windows, the obstacle detection sensor and camera provided in the mobile robot, and other sensing data obtained from home appliances and IoT sensors capable of transmitting sensor information.
- the sensing data of the units may be periodically transmitted to the server 70 and the mobile robot 100, and the server 70 and / or the mobile robot 100 may determine a time used by the user for each space.
- FIG. 15A illustrates user use data for each space analyzed in units of 24 hours
- FIG. 15B illustrates user use data for each space analyzed in units of one week.
- the user's home usage pattern (life pattern, e.g. living / sleeping / outing / vacation, etc.) on a daily / weekly basis can be detected.
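- The daily and weekly views of FIGS. 15A and 15B can be thought of as occupancy histograms over the fused sensor events; a sketch under that assumption:

```python
from collections import defaultdict

def occupancy_histogram(events, bucket="hour"):
    """events: (space_id, weekday 0-6, hour 0-23) tuples derived from door
    sensors, the robot's camera, appliance use, etc. Returns per-space usage
    counts in 24-hour buckets (FIG. 15A style) or weekday buckets
    (FIG. 15B style)."""
    hist = defaultdict(lambda: defaultdict(int))
    for space, weekday, hour in events:
        key = hour if bucket == "hour" else weekday
        hist[space][key] += 1
    return {space: dict(buckets) for space, buckets in hist.items()}

# e.g. two sightings in space "C" on Monday morning:
print(occupancy_histogram([("C", 0, 8), ("C", 0, 8), ("A", 0, 19)]))
```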
- based on the sensing data of the units, it is possible to determine the spatial environment, such as space cleanliness (floor dust and airborne fine dust), humidity, and temperature.
- the mobile robot 100 and/or the server 70 may analyze the correlation between the space usage pattern, the space cleanliness, and the spatial environment, generate a trained inference model, and establish a cleaning schedule including a cleaning time and a cleaning mode according to the inference model (S1260). In addition, the mobile robot 100 may perform cleaning according to the established cleaning schedule (S1280).
- for example, if the space C is unoccupied between 9 a.m. and 7 p.m. and a lot of dust and hair accumulates on its floor by 8 a.m., the mobile robot 100 can clean the space C during that absence period.
- since the bathroom connected to the space A is frequently used between 7 a.m. and 8 a.m. and the user is absent between 9 a.m. and 7 p.m., the mobile robot 100 may check the humidity and floor condition of the space A and then run the "mop cleaning mode" for 10 minutes.
- the mobile robot 100 may drive in the "mop cleaning mode from the space D (dining room) to the living room". In this case, the mobile robot 100 may mop the floor of the place where the fine dust is high based on the sensing data of the multipurpose sensor 35.
- the mobile robot 100 can operate in conjunction with other devices.
- for example, when a hair dryer is used, its use and location may be sensed by sound and by communication, and after the hair dryer is finished, the mobile robot 100 may intensively clean the hair fallen on the floor.
- the user and the space may be mapped based on the frequency of speech.
- for example, if user information (age, sex, etc.) is added to the space usage pattern of the space B, the accuracy of the inference/prediction algorithm for the optimal cleaning time and optimal cleaning mode can be increased.
- the optimum cleaning schedule and cleaning mode can be set through the surrounding environment and user space use pattern information collected from the units.
- the mobile robot 100 and/or the server 70 may determine, based on the sensing data received from the units, environmental data such as dust amount, temperature, and humidity, as well as user patterns such as product use space, user occupancy time, bedtime, outing times, and long absences from home.
- the mobile robot 100 and/or the server 70 may generate an analysis report for each region based on the sensing data received from the units, and the analysis report may be provided to the user through the mobile terminal 50 or the like.
- the mobile robot 100 and / or the server 70 can establish a cleaning plan according to the analysis report.
- for this purpose, an IoT sensor such as the smart ThinkQ sensor 35 having a door/window sensing function may be required.
- the user's space usage pattern may also be grasped from IoT sensors such as a light that is turned on when the user is present and off when the user is absent.
- the mobile robot 100 and/or the server 70 may also grasp environmental information of the corresponding space, such as cleanliness (fine dust concentration and floor cleanliness), humidity, and temperature, and then develop a cleaning plan by predicting and inferring the optimal cleaning mode and optimal cleaning time based on that information.
- the mobile robot 100 may be automatically and efficiently driven according to the established cleaning plan.
- FIG. 16A illustrates the cleaning plan generated according to the analysis result for space A.
- the intensive cleaning mode 1610 is set as cleaning mode 1, and the cleaning start time corresponding to cleaning mode 1 may be set to 9 a.m. (1615) on weekdays from Monday to Friday.
- the smart turbo cleaning mode 1620 may be set as cleaning mode 2 in consideration of the characteristics of a space containing a carpet, and 6 p.m. (1625) on weekdays from Monday to Friday may be set as the cleaning start time corresponding to cleaning mode 2.
- the mobile robot 100 may perform intensive cleaning after the user leaves the space A, and perform smart turbo cleaning before the user returns to the space A.
- FIG. 16B illustrates the cleaning plan generated according to the analysis result for space B.
- the zigzag cleaning mode 1630 is set as cleaning mode 1 so that cleaning can be performed quickly, in consideration of the characteristics of the space B, which is used from 11 p.m. to 10 a.m. the next day and generates less dust.
- the cleaning start time corresponding to cleaning mode 1 may be set to 11 a.m. (1635) on weekdays from Monday to Friday.
- cleaning mode 2 may be set to "not used" (1640, 1645), so that only one cleaning per day is performed.
- the mobile robot 100 may perform zigzag cleaning quickly after the user leaves the space B.
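- A plan like those of FIGS. 16A and 16B can be represented as a small table of entries; the sketch below uses illustrative names and the example times from the figures:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ScheduleEntry:
    """One row of a cleaning plan in the spirit of FIGS. 16A and 16B.
    Fields are illustrative, not the patent's data model."""
    space: str
    mode: str                  # e.g. "intensive", "smart_turbo", "zigzag", "mop"
    start_hour: int            # 24-hour clock
    weekdays: Tuple[int, ...]  # 0=Mon ... 6=Sun

WEEKDAYS = (0, 1, 2, 3, 4)
schedule = [
    ScheduleEntry("A", "intensive", 9, WEEKDAYS),     # after the user leaves
    ScheduleEntry("A", "smart_turbo", 18, WEEKDAYS),  # before the user returns
    ScheduleEntry("B", "zigzag", 11, WEEKDAYS),       # quick clean, low-dust room
]
```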
- the following illustrates a modified analysis report of space A and space B.
- in the space A, there is a change in the user's occupancy time.
- in the space B, there are changes in contamination-related matters, such as dust generation, as well as a change in the user's occupancy time.
- according to the present invention, it is possible to continuously monitor changes in the surrounding environment and the user pattern, and when a change occurs, reflect it in the cleaning schedule, pattern, and mode.
- FIG. 17A illustrates the cleaning plan modified according to the analysis result including the change to space A.
- the intensive cleaning mode 1610 is set as cleaning mode 1, and the cleaning start time corresponding to cleaning mode 1 may be changed to 6 a.m. (1715) on weekdays from Monday to Friday.
- the intensive cleaning and smart turbo cleaning mode 1720 is set as cleaning mode 2, and the cleaning start time corresponding to cleaning mode 2 may be set to 6 p.m. on weekdays from Monday to Friday.
- the mobile robot 100 may perform intensive cleaning after the user leaves the space A, and perform intensive cleaning and smart turbo cleaning before the user returns to the space A.
- FIG. 17B illustrates the cleaning plan changed according to the analysis result including the change to space B.
- the edge cleaning mode 1730 is set as cleaning mode 1, and 10 a.m. (1735) on weekdays from Monday to Friday may be set as the cleaning start time corresponding to cleaning mode 1.
- cleaning mode 2 may be set to "used" (1740, 1745), so that two cleanings per day are performed; the intensive cleaning and smart turbo cleaning mode 1740 may be set as cleaning mode 2, and 5 p.m. (1745) on weekdays from Monday to Friday may be set as the corresponding cleaning start time.
- the mobile robot 100 may perform edge cleaning after the user leaves the space B, and perform intensive cleaning and smart turbo cleaning before the user returns to the space B.
- only the change-related information may be provided through the user application, and the user may manually input the cleaning time and the cleaning mode.
- since the mobile robot actively establishes and recommends a cleaning plan before the user requests it, the user's trust, preference, and product utilization can be increased.
- by communicating with home appliances and Internet of Things (IoT) devices and interlocking them with the control of the mobile robot, user convenience can be increased.
- the mobile robot according to the present invention is not limited to the configurations and methods of the embodiments described above; all or part of the embodiments may be selectively combined so that various modifications can be made.
- meanwhile, the control method of the mobile robot can be implemented as processor-readable code on a processor-readable recording medium.
- the processor-readable recording medium includes all kinds of recording devices that store data readable by a processor. Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and it may also be implemented in the form of a carrier wave, such as transmission over the Internet.
- the processor-readable recording medium can also be distributed over network coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.
Abstract
According to one aspect, the present invention relates to a method for controlling an artificial intelligence mobile robot, comprising the steps of: generating a map for a driving area including a plurality of zones; mapping a plurality of units, each including at least one sensor, to specific zones among the plurality of zones; receiving sensing data from the plurality of units; and linking the received sensing data to the zone mapped to the corresponding sensor and storing the sensing data. Accordingly, the present invention can provide various services by mapping information about a space and sensing data of another device.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862701856P | 2018-07-23 | 2018-07-23 | |
US62/701,856 | 2018-07-23 | ||
KR10-2018-0132870 | 2018-11-01 | ||
KR1020180132870A KR102612827B1 (ko) | 2018-07-23 | 2018-11-01 | 인공지능 이동 로봇의 제어 방법 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020022622A1 true WO2020022622A1 (fr) | 2020-01-30 |
Family
ID=69181837
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2019/005470 WO2020022622A1 (fr) | 2018-07-23 | 2019-05-08 | Procédé de commande d'un robot mobile à intelligence artificielle |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2020022622A1 (fr) |
- 2019-05-08: WO PCT/KR2019/005470 patent WO2020022622A1 (fr), active, Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20070102197A (ko) * | 2006-04-14 | 2007-10-18 | 주식회사 대우일렉트로닉스 | 지능형 로봇청소기 및 그 구동 방법 |
KR20130039578A (ko) * | 2011-10-12 | 2013-04-22 | 한국과학기술연구원 | 지능 로봇, 지능 로봇과 사용자의 상호작용을 위한 시스템 및 지능 로봇과 사용자의 상호작용을 위한 방법 |
KR20150024027A (ko) * | 2013-08-26 | 2015-03-06 | 주식회사 케이티 | 마루와 연결된 홈 게이트웨이의 청소 제어 방법 및 이를 위한 홈 게이트웨이 및 이를 이용한 이동 청소 로봇 |
US20150358777A1 (en) * | 2014-06-04 | 2015-12-10 | Qualcomm Incorporated | Generating a location profile of an internet of things device based on augmented location information associated with one or more nearby internet of things devices |
KR20170098874A (ko) * | 2014-12-16 | 2017-08-30 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | 사물 인터넷 장치의 3d 매핑 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117284721A (zh) * | 2023-11-23 | 2023-12-26 | 张家港市华申工业橡塑制品有限公司 | 用于橡塑传送带的智能除污方法和系统 |
CN117284721B (zh) * | 2023-11-23 | 2024-02-06 | 张家港市华申工业橡塑制品有限公司 | 用于橡塑传送带的智能除污方法和系统 |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19841694; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 19841694; Country of ref document: EP; Kind code of ref document: A1 |