WO2020045732A1 - Mobile robot control method - Google Patents

Mobile robot control method

Info

Publication number
WO2020045732A1
Authority
WO
WIPO (PCT)
Prior art keywords
voice
mobile robot
cleaning
user
feedback
Prior art date
Application number
PCT/KR2018/011837
Other languages
English (en)
Korean (ko)
Inventor
이성훈
조원철
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Publication of WO2020045732A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1674 Programme controls characterised by safety, monitoring, diagnostic
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L 9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L 9/28 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L 9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L 9/28 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L 9/2836 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means characterised by the parts which are controlled
    • A47L 9/2852 Elements for displacement of the vacuum cleaner or the accessories therefor, e.g. wheels, casters or nozzles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 11/00 Manipulators not otherwise provided for
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 11/00 Manipulators not otherwise provided for
    • B25J 11/008 Manipulators for service tasks
    • B25J 11/0085 Cleaning
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J 19/06 Safety devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J 19/06 Safety devices
    • B25J 19/061 Safety devices with audible signals
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1628 Programme controls characterised by the control loop
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J 9/1666 Avoiding collision or forbidden zones
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L 2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L 2201/04 Automatic control of the travelling movement; Automatic obstacle detection

Definitions

  • the present invention relates to a mobile robot and a control method thereof, and more particularly, to a mobile robot and a control method capable of actively providing information and services to a user.
  • Robots have been developed for industrial use and have formed part of factory automation. Recently, the field of application of robots has expanded further: medical robots, aerospace robots, and the like have been developed, and home robots usable in ordinary homes have also been made. Among these robots, a robot capable of traveling on its own power is called a mobile robot.
  • A representative example of a mobile robot used at home is a robot cleaner, which is a device that cleans a given area by suctioning dust or foreign matter while traveling around that area by itself.
  • Such a mobile robot is capable of moving on its own and is thus free to move, and is provided with a plurality of sensors for avoiding obstacles and the like while traveling, so that it can travel while avoiding obstacles.
  • Meanwhile, voice recognition technology is being applied to various devices, and research on methods of controlling a mobile robot using voice recognition technology is increasing.
  • For example, prior art document 1 (Korean Patent Publication No. 10-2012-0114670, published on October 17, 2012) discloses a robot cleaner that has a speech recognition unit and recognizes a user's speech signal to execute a corresponding control command.
  • In the prior art, however, voice input is unidirectional from the user to the robot cleaner, so voice recognition remains merely an additional control means alongside pressing a button or operating a remote controller. Accordingly, there is a problem in that the voice recognition function offers the user little more than simple control, and cannot provide functions and services beyond an additional control input means.
  • An object of the present invention is to go beyond using voice recognition as a mere control input means and to provide a mobile robot capable of interacting with a user through voice, and a control method thereof.
  • An object of the present invention is to provide a mobile robot that can offer various information and services to a user.
  • An object of the present invention is to provide a mobile robot and a control method thereof that operate in a user-friendly manner, improving user trust and preference.
  • In order to achieve the above or other objects, the mobile robot according to an aspect of the present invention may utter a voice guiding predetermined information or a service to a user, and may communicate and interact with the user through voice.
  • In order to achieve the above or other objects, the mobile robot according to an aspect of the present invention may increase the user's trust, preference, and product utilization by actively providing predetermined information and recommending predetermined services and functions before the user requests them.
  • In order to achieve the above or other objects, a method of operating a mobile robot according to an aspect of the present invention includes: performing a cleaning operation while moving; determining a special region based on existing data obtained during previous cleaning operations; and outputting a voice guidance message for the special region upon arrival at the determined special region, thereby actively providing voice guidance for a special region that warrants it before the user asks.
  • The special region may be a dangerous region in which an inoperable state occurred, or a low-efficiency region in which the traveling speed or cleaning rate was lower than an average value or the number of rotations was greater than a reference number.
  • the existing data obtained by the previous cleaning operation may include driving history data and cleaning history data.
  • The existing data obtained during previous cleaning operations may further include obstacle information registered in the special region; in this case, the voice guidance message may include a guide message for the obstacle present in the special region.
  • the voice guidance message may include a message for requesting confirmation of cleaning of the special area.
  • The voice guidance message may further include at least one of a description of the reason the region was determined to be a special region, and a guide to example commands that the user can input by voice.
  • A method of operating a mobile robot according to an aspect of the present invention may further include: receiving the user's voice feedback on the voice guidance message; identifying a voice command included in the received voice feedback; and performing an operation corresponding to the identified voice command.
  • Here, the identification of the voice command may be performed by the mobile robot itself, by the server, or in stages by the mobile robot and the server.
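  • To make the claimed flow concrete, here is a minimal Python sketch of the guidance loop described above (arrival at a special region, voice guidance, voice feedback, corresponding operation). The robot API names (move_to, speak, listen, identify_command, clean, bypass) are illustrative assumptions, not the patent's implementation.

    from dataclasses import dataclass

    @dataclass
    class SpecialRegion:
        region_id: int
        kind: str      # "dangerous" or "low_efficiency"
        reason: str    # e.g. "operation stopped here during a previous run"

    def guidance_loop(robot, special_regions):
        """Announce each special region on arrival and act on voice feedback."""
        for region in special_regions:
            robot.move_to(region.region_id)              # travel to the region (S830)
            robot.speak(                                 # voice guidance (S840)
                f"This region was marked because {region.reason}. "
                "If you want it cleaned, please answer 'Clean'.")
            feedback = robot.listen(timeout_s=10)        # user's voice feedback
            command = robot.identify_command(feedback)   # on-device or via server
            if command == "clean":
                robot.clean(region.region_id)
            else:
                robot.bypass(region.region_id)           # skip the special region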
  • In order to achieve the above or other objects, a method of operating a mobile robot according to an aspect of the present invention includes: receiving at least one voice command of a user; outputting a voice announcement message recommending at least one of the functions used fewer times than a reference number, based on a usage record; identifying a feedback voice command included in the user's voice feedback when the voice feedback is received; and performing a predetermined function based on the identified feedback voice command.
  • The usage record may include at least one of: the number of completed general cleanings, the number of returns to the charging station, the number of cleaning runs per area, the operating time, the average cleaning intensity per area, the amount of dust per area, the frequency and ratio of use of each cleaning mode per area, and the number of uses of each cleaning function and additional function.
  • In this case, the mobile robot may select and recommend at least one of the functions used fewer times than the reference number, according to a preset priority.
  • The preset priority may be descending order of the importance preset for each function, or ascending order of the number of uses.
  • The method of operating a mobile robot according to an aspect of the present invention may further include identifying a voice command of the user.
  • In this case, an operation corresponding to the identified voice command may be performed, and a function associated with the identified voice command may be selected and recommended from among the functions used fewer times than the reference number.
  • Meanwhile, the identification of the voice command and the feedback voice command may be performed by the mobile robot itself, by the server, or in stages by the mobile robot and the server.
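  • The recommendation rule just described can be sketched directly in Python; the dictionaries, function names, and counts below are illustrative assumptions.

    def recommend_functions(usage_counts, importance, reference_count=3, top_k=1):
        """Pick under-used functions to recommend by voice.

        usage_counts: {function_name: times_used}   (from the usage record)
        importance:   {function_name: preset priority, higher = more important}
        """
        underused = [f for f, n in usage_counts.items() if n < reference_count]
        # Preset priority: higher importance first, then fewest uses first
        underused.sort(key=lambda f: (-importance.get(f, 0), usage_counts[f]))
        return underused[:top_k]

    counts = {"intensive_cleaning": 0, "home_guard": 1, "virtual_wall": 5}
    weights = {"intensive_cleaning": 2, "home_guard": 3, "virtual_wall": 1}
    print(recommend_functions(counts, weights))   # -> ['home_guard']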
  • The mobile robot may speak to the user and may communicate and interact with the user through voice.
  • The mobile robot may actively provide information and recommend services, functions, and the like before the user requests them, thereby increasing the user's trust, preference, and product utilization.
  • In addition, the user can easily set up and use an associated function without additional effort.
  • Speech recognition may be performed by the mobile robot itself, by the server, or in stages by the mobile robot and the server, enabling effective speech recognition.
  • As a result, an evolving user experience may be provided.
  • FIG. 1 is a block diagram of a home appliance network system according to an embodiment of the present invention.
  • FIG. 2 is a perspective view showing a mobile robot according to an embodiment of the present invention.
  • FIG. 3 is a plan view of the mobile robot of FIG. 2.
  • FIG. 4 is a side view of the mobile robot of FIG. 2.
  • FIG. 5 is a block diagram showing a control relationship between major components of a mobile robot according to an embodiment of the present invention.
  • FIG. 6 is a view referred to for describing learning using product data according to an embodiment of the present invention.
  • FIG. 7 is an example of a simplified internal block diagram of a server according to an embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating a control method of a mobile robot according to an embodiment of the present invention.
  • FIG. 9 is a view referred to for describing the control method of the mobile robot according to an embodiment of the present invention.
  • FIG. 10 is a flowchart illustrating a control method of a mobile robot according to an embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating a control method of a mobile robot according to an embodiment of the present invention.
  • FIG. 12 is a view referred to for describing the method for controlling the mobile robot according to the embodiment of the present invention.
  • FIG. 13 is a flowchart illustrating a control method of a mobile robot according to an embodiment of the present invention.
  • FIG. 14 is a view referred to for describing the control method of the mobile robot according to the embodiment of the present invention.
  • The suffixes "module" and "unit" used for components in the following description are given merely for ease of drafting this specification and do not in themselves carry any special meaning or role. Therefore, "module" and "unit" may be used interchangeably.
  • the mobile robot 100 refers to a robot that can move itself by using a wheel or the like, and may be a home helper robot or a robot cleaner.
  • a robot cleaner having a cleaning function among mobile robots will be described with reference to the drawings, but the present invention is not limited thereto.
  • FIG. 1 is a block diagram of a home appliance network system according to an embodiment of the present invention.
  • Referring to FIG. 1, the home appliance network system may include home appliances that have a communication module and can communicate with other devices or the server 70, or connect to a network.
  • the home appliance may correspond to an air conditioner 10 having a communication module, a cleaner 20, a refrigerator 31, a washing machine 32, and the like.
  • the air conditioner 10 may include at least one of an air conditioner 11, an air cleaner 12 and 13, a humidifier 14, and a hood 15.
  • the cleaner 20 may be a vacuum cleaner 21, a robot cleaner 22, or the like.
  • the communication module included in the home appliances 10, 20, 31, and 32 may be a Wi-Fi communication module, and the present invention is not limited to the communication method.
  • the home appliances 10, 20, 31, and 32 may include other types of communication modules or may include a plurality of communication modules.
  • For example, the home appliances 10, 20, 31, and 32 may include an NFC module, a Zigbee communication module, a Bluetooth™ communication module, and the like.
  • the home appliances 10, 20, 31, and 32 may be connected to a predetermined server 70 through a Wi-Fi communication module, and may support smart functions such as remote monitoring and remote control.
  • the home appliance network system may include a mobile terminal 50 such as a smart phone and a tablet PC.
  • the user may check information on the home appliances 10, 20, 31, and 32 in the home appliance network system or control the home appliances 10, 20, 31, and 32 through the portable terminal 50.
  • the home appliance network system may include a plurality of Internet of Things (IoT) devices (not shown).
  • the home appliance network system may include home appliances 10, 20, 31, and 32, portable terminal 50, and Internet of Things (IoT) devices.
  • the home appliance network system is not limited to a communication scheme constituting a network.
  • the home appliances 10, 20, 31, and 32, the portable terminal 50, and the Internet of Things (IoT) devices may be connected through a wired / wireless router (not shown).
  • Internet of Things (IoT) devices in the home appliance network system may form a mesh topology in which the devices communicate with one another individually.
  • the home appliances 10, 20, 31, and 32 in the home appliance network system may communicate with the server 70 or the mobile terminal 50 via a wired / wireless router (not shown).
  • the home appliances 10, 20, 31, and 32 in the home appliance network system may communicate with the server 70 or the portable terminal 50 by Ethernet.
  • FIG. 2 is a perspective view illustrating a mobile robot according to an embodiment of the present invention
  • FIG. 3 is a plan view of the mobile robot of FIG. 2
  • FIG. 4 is a side view of the mobile robot of FIG. 2.
  • the mobile robot 100 may drive a certain area by itself.
  • the mobile robot 100 may perform a function of cleaning the floor. Cleaning of the floor here includes suctioning dust (including foreign matter) from the floor or mopping the floor.
  • the mobile robot 100 includes a main body 110.
  • the main body 110 includes a cabinet forming an appearance.
  • the mobile robot 100 may include a suction unit 130 and a dust container 140 provided in the main body 110.
  • the mobile robot 100 includes an image acquisition unit 120 that detects information related to an environment around the mobile robot.
  • the mobile robot 100 includes a driving unit 160 for moving the main body.
  • the mobile robot 100 includes a control unit 181 for controlling the mobile robot 100.
  • the controller 181 is provided in the main body 110.
  • the driving unit 160 includes a wheel unit 111 for traveling of the mobile robot 100.
  • the wheel unit 111 is provided in the main body 110.
  • The mobile robot 100 may be moved forward, backward, left, and right, or rotated, by the wheel unit 111.
  • As the controller controls the driving of the wheel unit 111, the mobile robot 100 can autonomously travel across the floor.
  • the wheel unit 111 includes a main wheel 111a and a sub wheel 111b.
  • the main wheels 111a are provided at both sides of the main body 110, and are configured to be rotatable in one direction or the other direction according to the control signal of the controller. Each main wheel 111a may be configured to be driven independently of each other. For example, each main wheel 111a may be driven by different motors.
  • the sub wheel 111b supports the main body 110 together with the main wheel 111a, and is configured to assist driving of the mobile robot 100 by the main wheel 111a.
  • the sub wheel 111b may also be provided in the suction unit 130 described later.
  • the suction unit 130 may be disposed to protrude from the front side F of the main body 110.
  • the suction unit 130 is provided to suck air containing dust.
  • the suction unit 130 may have a form protruding from the front of the main body 110 to both left and right sides.
  • the front end of the suction unit 130 may be disposed in a position spaced forward from one side of the main body 110.
  • the left and right both ends of the suction unit 130 may be disposed at positions spaced apart from the main body 110 to the left and right sides.
  • The main body 110 is formed in a circular shape, and as both rear ends of the suction unit 130 protrude from the main body 110 to the left and right, an empty space, that is, a gap, may be formed between the main body 110 and the suction unit 130.
  • This empty space is the space between the left and right ends of the main body 110 and the left and right ends of the suction unit 130, and has a shape recessed into the mobile robot 100.
  • the suction unit 130 may be detachably coupled to the main body 110.
  • the mop module (not shown) may be detachably coupled to the main body 110 in place of the separated suction unit 130.
  • the image acquisition unit 120 may be disposed in the main body 110.
  • the image acquisition unit 120 may be disposed in front of the main body 110.
  • the image acquisition unit 120 may be disposed to overlap the suction unit 130 in the vertical direction of the main body 110.
  • the image acquisition unit 120 may be disposed above the suction unit 130.
  • the image acquisition unit 120 may detect an obstacle around the mobile robot 100.
  • The image acquisition unit 120 may detect an obstacle or a feature ahead so that the suction unit 130, which is located at the forefront of the mobile robot 100, does not collide with the obstacle.
  • the image acquisition unit 120 may further perform other sensing functions to be described later in addition to the sensing function.
  • the main body 110 may be provided with a dust container accommodating part (not shown).
  • The dust container 140, which separates and collects dust from the sucked air, is detachably coupled to the dust container accommodating part.
  • The dust container accommodating part may be formed at the rear side R of the main body 110. Part of the dust container 140 is accommodated in the dust container accommodating part, while the other part of the dust container 140 protrudes toward the rear R of the main body 110.
  • The dust container 140 has an inlet (not shown) through which air containing dust is introduced and an outlet (not shown) through which the air from which dust has been separated is discharged.
  • When the dust container 140 is mounted in the dust container accommodating part, the inlet and the outlet of the dust container 140 communicate with a first opening (not shown) and a second opening (not shown), respectively, formed on the inner wall of the dust container accommodating part.
  • a suction flow path for guiding air from the suction port of the suction unit 130 to the first opening is provided.
  • In addition, an exhaust flow path is provided to guide air from the second opening to an exhaust port (not shown) that opens toward the outside.
  • The air containing dust introduced through the suction unit 130 flows into the dust container 140 through the intake flow path inside the main body 110, and the air and the dust are separated from each other as they pass through the filter or cyclone of the dust container 140. The dust is collected in the dust container 140; the air is discharged from the dust container 140, passes through the exhaust flow path inside the main body 110, and is finally discharged to the outside through the exhaust port.
  • FIG. 5 is a block diagram showing a control relationship between major components of a mobile robot according to an embodiment of the present invention.
  • the mobile robot 100 includes a main body 110 and an image acquisition unit 120 that acquires an image around the main body 110.
  • the mobile robot 100 includes a driving unit 160 for moving the main body 110.
  • the driving unit 160 includes at least one wheel unit 111 for moving the main body 110.
  • the driving unit 160 includes a driving motor (not shown) connected to the wheel unit 111 to rotate the wheel unit 111.
  • the image acquisition unit 120 photographs the driving zone and may include a camera module.
  • the camera module may include a digital camera.
  • the digital camera includes at least one optical lens and an image sensor (eg, a CMOS image sensor) including a plurality of photodiodes (eg, pixels) formed by the light passing through the optical lens.
  • The camera module may also include a digital signal processor (DSP) that forms an image based on the signals output from the photodiodes.
  • the digital signal processor may generate not only a still image but also a moving image composed of frames composed of the still image.
  • Multiple cameras may be installed for each part for photographing efficiency.
  • The image photographed by the camera may be used to recognize the kinds of materials, such as dust, hair, and flooring, present in the space, to determine whether cleaning has been performed, or to check the cleaning time.
  • the camera may photograph a situation of an obstacle or a cleaning area existing on the front of the moving direction of the mobile robot 100.
  • The image acquisition unit 120 may acquire a plurality of images by continuously photographing the surroundings of the main body 110, and the acquired plurality of images may be stored in the storage unit 105.
  • The mobile robot 100 may improve the accuracy of spatial recognition, location recognition, and obstacle recognition by using the plurality of images, or by selecting one or more images from the plurality of images and using the effective data.
  • the mobile robot 100 may include a sensor unit 170 including sensors for sensing various data related to the operation and state of the mobile robot.
  • the sensor unit 170 may include an obstacle detecting sensor detecting a front obstacle.
  • the sensor unit 170 may further include a cliff detection sensor for detecting the presence of a cliff on the floor in the driving zone, and a lower camera sensor for obtaining an image of the floor.
  • the obstacle detecting sensor may include an infrared sensor, an ultrasonic sensor, an RF sensor, a geomagnetic sensor, a position sensitive device (PSD) sensor, and the like.
  • The positions and types of the sensors included in the obstacle detecting sensor may vary depending on the model of the mobile robot, and the obstacle detecting sensor may include a variety of further sensors.
  • The sensor unit 170 may further include a motion detection sensor that detects the motion of the mobile robot 100 according to the driving of the main body 110 and outputs motion information.
  • a gyro sensor, a wheel sensor, an acceleration sensor, or the like may be used as the motion detection sensor.
  • the gyro sensor detects the rotation direction and detects the rotation angle when the mobile robot 100 moves according to the driving mode.
  • the gyro sensor detects the angular velocity of the mobile robot 100 and outputs a voltage value proportional to the angular velocity.
  • the controller 150 calculates the rotation direction and the rotation angle by using the voltage value output from the gyro sensor.
  • the wheel sensor is connected to the wheel unit 111 to sense the number of revolutions of the wheel.
  • the wheel sensor may be a rotary encoder.
  • the acceleration sensor detects a change in the speed of the mobile robot 100, for example, a change in the mobile robot 100 due to start, stop, direction change, collision with an object, and the like.
  • the acceleration sensor may be built in the controller 150 to detect a speed change of the mobile robot 100.
  • The controller 150 may calculate a change in the position of the mobile robot 100 based on the motion information output from the motion detection sensor. This position is a relative position, corresponding to the absolute position obtained using image information.
  • Through such relative position recognition, the mobile robot can improve the performance of position recognition based on image information and obstacle information.
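  • For illustration, the following sketch shows one conventional way such a relative-position (dead-reckoning) update can be computed from wheel-sensor tick counts, with the heading change taken from the gyro; the tick and wheel constants are assumed values, not taken from the patent.

    import math

    def update_pose(pose, left_ticks, right_ticks, gyro_dtheta,
                    ticks_per_rev=360, wheel_radius_m=0.035):
        """Dead-reckoning pose update from wheel-encoder ticks; the heading
        change comes from the gyro, which is less affected by wheel slip."""
        x, y, theta = pose
        metres_per_tick = 2 * math.pi * wheel_radius_m / ticks_per_rev
        d_left = left_ticks * metres_per_tick
        d_right = right_ticks * metres_per_tick
        d_center = (d_left + d_right) / 2.0      # distance travelled by the body
        theta += gyro_dtheta                     # rotation angle from the gyro
        return (x + d_center * math.cos(theta),
                y + d_center * math.sin(theta),
                theta)

    # e.g. starting at the origin, 120 ticks on each wheel, no rotation:
    print(update_pose((0.0, 0.0, 0.0), 120, 120, 0.0))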
  • the mobile robot 100 may include a power supply unit (not shown) for supplying power to the mobile robot by having a rechargeable battery.
  • the power supply unit supplies driving power and operation power to each component of the mobile robot 100, and when the remaining power is insufficient, power may be supplied and charged from a charging stand (not shown).
  • the mobile robot 100 may further include a battery detector (not shown) that detects a charging state of the battery and transmits a detection result to the controller 150.
  • the battery is connected to the battery detector so that the battery remaining amount and the charging state are transmitted to the controller 150.
  • the battery remaining amount may be displayed on the display 182 of the output unit 180.
  • the mobile robot 100 includes an input unit 125 for inputting on / off or various commands.
  • the input unit 125 may include a button, a dial, a touch screen, and the like.
  • the input unit 125 may include a microphone for receiving a user's voice command. Through the input unit 125, various control commands necessary for the overall operation of the mobile robot 100 may be input.
  • the mobile robot 100 may include an output unit 180 to display reservation information, a battery state, an operation mode, an operation state, an error state, etc. as an image or output a sound.
  • the output unit 180 may include a sound output unit 181 for outputting an audio signal.
  • The sound output unit 181 may output warning sounds and voice notification messages for the operation mode, operation state, error state, and the like, under the control of the controller 150.
  • The sound output unit 181 may convert an electrical signal from the controller 150 into an audio signal and output it. To this end, a speaker or the like may be provided.
  • the output unit 180 may further include a display 182 that displays reservation information, a battery state, an operation mode, an operation state, an error state, and the like as an image.
  • the mobile robot 100 includes a controller 150 for processing and determining various information such as recognizing a current location, and a storage 105 for storing various data.
  • the mobile robot 100 may further include a communication unit 190 for transmitting and receiving data with an external terminal.
  • The external terminal is provided with an application for controlling the mobile robot 100, displays, through execution of the application, a map of the driving zone to be cleaned by the mobile robot 100, and may designate a specific area on the map to be cleaned.
  • Examples of the external terminal may include a remote controller, a PDA, a laptop, a smartphone, a tablet, and the like, having an application for setting a map.
  • the external terminal may communicate with the mobile robot 100 to display a current location of the mobile robot together with a map, and information about a plurality of areas may be displayed. In addition, the external terminal updates and displays its position as the mobile robot travels.
  • The controller 150 controls the image acquisition unit 120, the input unit 125, the driving unit 160, the suction unit 130, and the other components of the mobile robot 100, thereby controlling the overall operation of the mobile robot 100.
  • the controller 150 may process a voice input signal of the user received through the microphone of the input unit 125 and perform a voice recognition process.
  • the mobile robot 100 may include a voice recognition module that performs voice recognition inside or outside the controller 150.
  • simple voice recognition may be performed by the mobile robot 100 itself, and high-level voice recognition such as natural language processing may be performed by the server 70.
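  • A minimal sketch of such staged recognition is shown below, assuming a lightweight on-device keyword spotter and a server interface; every name here is hypothetical.

    ONDEVICE_KEYWORDS = {"start", "stop", "clean", "home"}   # assumed hot-words

    def recognize(audio, local_keyword_spotter, server):
        """Two-stage recognition: simple keywords on-device, the rest on the server."""
        keyword = local_keyword_spotter(audio)    # lightweight embedded model
        if keyword in ONDEVICE_KEYWORDS:
            return {"command": keyword, "source": "on-device"}
        # Natural-language requests fall through to server-side processing
        return {"command": server.recognize(audio), "source": "server"}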
  • the storage unit 105 records various types of information necessary for the control of the mobile robot 100 and may include a volatile or nonvolatile recording medium.
  • The recording medium stores data readable by a microprocessor, and may include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, magnetic tape, a floppy disk, an optical data storage device, and the like.
  • the storage unit 105 may store a map for the driving zone.
  • the map may be input by an external terminal, a server, or the like, which may exchange information with the mobile robot 100 through wired or wireless communication, or may be generated by the mobile robot 100 by learning itself.
  • the map may display the locations of the rooms in the driving zone.
  • the current position of the mobile robot 100 may be displayed on the map, and the current position of the mobile robot 100 on the map may be updated during the driving process.
  • the external terminal stores the same map as the map stored in the storage unit 105.
  • the storage unit 105 may store cleaning history information. Such cleaning history information may be generated every time cleaning is performed.
  • The map of the driving zone stored in the storage unit 105 may be a navigation map used for driving during cleaning, a Simultaneous Localization and Mapping (SLAM) map used for position recognition, a learning map in which obstacle information and the like are recorded and which is used for learning-based cleaning, a global localization map used for global position recognition, or an obstacle recognition map in which information about recognized obstacles is recorded.
  • Maps may be stored and managed in the storage unit 105 by purpose, although maps may not be clearly classified by purpose.
  • For example, a plurality of pieces of information may be stored in one map so that the map can be used for at least two purposes.
  • the controller 150 may include a driving control module 151, a map generation module 152, a position recognition module 153, and an obstacle recognition module 154.
  • the driving control module 151 controls the driving of the mobile robot 100, and controls the driving of the driving unit 160 according to the driving setting.
  • The driving control module 151 may determine the driving path of the mobile robot 100 based on the operation of the driving unit 160. For example, the driving control module 151 may determine the current or past moving speed, the distance traveled, and the like of the mobile robot 100 based on the rotational speed of the wheel unit 111, and may update the position of the mobile robot 100 on the map based on the driving information thus identified.
  • the map generation module 152 may generate a map of the driving zone.
  • the map generation module 152 may generate a map by processing the image acquired through the image acquisition unit 120. That is, a cleaning map corresponding to the cleaning area can be created.
  • the map generation module 152 may recognize the global position by processing the image acquired through the image acquisition unit 120 at each position in association with the map.
  • the position recognition module 153 estimates and recognizes a current position.
  • The position recognition module 153, in association with the map generation module 152, uses the image information from the image acquisition unit 120 so that the current position can be estimated and recognized even when the position of the mobile robot 100 changes suddenly.
  • The location recognition module 153 may also recognize the attributes of the current location; that is, it may recognize the space.
  • The mobile robot 100 may recognize its position during continuous driving through the position recognition module 153, and may also learn a map and estimate its current position through the map generation module 152 and the obstacle recognition module 154, without the position recognition module 153.
  • the image acquisition unit 120 acquires images around the mobile robot 100.
  • an image acquired by the image acquisition unit 120 is defined as an 'acquisition image'.
  • the acquired image includes various features such as lightings on the ceiling, edges, corners, blobs, and ridges.
  • the map generation module 152 detects a feature from each of the acquired images, and calculates a descriptor based on each feature point.
  • The map generation module 152 classifies the at least one descriptor of each acquired image into a plurality of groups according to a predetermined sub-classification rule, based on the descriptor information obtained from the acquired image of each position, and converts the descriptors included in the same group into lower representative descriptors according to a predetermined sub-representation rule.
  • As another example, all descriptors collected from the acquired images in a predetermined area may be classified into a plurality of groups according to the predetermined sub-classification rule, and the descriptors included in the same group may each be converted into lower representative descriptors according to the predetermined sub-representation rule.
  • the map generation module 152 can obtain the feature distribution of each location through this process.
  • Each positional feature distribution can be represented by a histogram or an n-dimensional vector.
  • Meanwhile, the map generation module 152 may estimate an unknown current position based on descriptors calculated from each feature point, without applying the predetermined sub-classification rule and the predetermined sub-representation rule.
  • Also, when the current position of the mobile robot 100 is unknown due to a position jump or the like, the current position may be estimated based on previously stored data such as descriptors or lower representative descriptors.
  • the mobile robot 100 obtains an acquired image through the image acquisition unit 120 at an unknown current position. Through the image, various features such as lightings on the ceiling, edges, corners, blobs, and ridges are identified.
  • the position recognition module 153 detects features from the acquired image and calculates a descriptor.
  • The position recognition module 153, based on at least one descriptor obtained from the image acquired at the unknown current position, converts the descriptor information, according to a predetermined lower conversion rule, into information (a recognition feature distribution) comparable with the position information to be compared (for example, the feature distribution of each position).
  • Each position's feature distribution may be compared with the recognition feature distribution to calculate a similarity. A similarity (probability) may be calculated for each position, and the position for which the greatest probability is calculated may be determined as the current position.
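  • The comparison step can be illustrated with a short sketch that scores the recognition feature distribution against each stored per-position feature distribution (each a histogram or n-dimensional vector, as noted above). Cosine similarity is used here as one common choice; the patent does not fix a particular similarity measure.

    import numpy as np

    def most_likely_position(recognition_hist, position_hists):
        """Score a recognition feature distribution against each stored
        position's feature distribution and return the best match."""
        def cosine(a, b):
            return float(np.dot(a, b) /
                         (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))
        scores = {pos: cosine(recognition_hist, hist)
                  for pos, hist in position_hists.items()}
        best = max(scores, key=scores.get)   # position with greatest similarity
        return best, scores[best]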
  • the controller 150 may distinguish the driving zone and generate a map composed of a plurality of regions, or recognize the current position of the main body 110 based on the pre-stored map.
  • The controller 150 may transmit the generated map to an external terminal, a server, and the like through the communication unit 190. Likewise, the controller 150 may store a map in the storage unit 105 when the map is received from an external terminal, a server, or the like.
  • the map may be divided into a plurality of cleaning areas, and include a connection path connecting the plurality of areas, and may include information about obstacles in the area.
  • When a cleaning command is input, the controller 150 determines whether the position on the map matches the current position of the mobile robot.
  • The cleaning command may be input from a remote controller, the input unit, or an external terminal.
  • If the position on the map does not match the current position, or if the current position cannot be confirmed, the controller 150 recognizes the current position and recovers the current position of the mobile robot 100, and then, based on the current position, may control the driving unit 160 to move to a designated area.
  • In this case, the position recognition module 153 may analyze the acquired image input from the image acquisition unit 120 and estimate the current position based on the map.
  • the obstacle recognition module 154 or the map generation module 152 may also recognize the current position in the same manner.
  • the driving control module 151 calculates a driving route from the current position to the designated region and controls the driving unit 160 to move to the designated region.
  • the driving control module 151 may divide the entire driving zone into a plurality of areas according to the received cleaning pattern information, and set at least one area as a designated area.
  • the driving control module 151 may calculate the driving route according to the received cleaning pattern information, travel along the driving route, and perform cleaning.
  • the controller 150 may store the cleaning record in the storage unit 105 when cleaning of the set designated area is completed.
  • The controller 150 may transmit the operation state or cleaning state of the mobile robot 100 to the external terminal and the server at a predetermined cycle through the communication unit 190.
  • the external terminal displays the location of the mobile robot along with the map on the screen of the running application based on the received data, and outputs information on the cleaning state.
  • The mobile robot 100 moves in one direction until an obstacle or a wall surface is detected, and when the obstacle recognition module 154 recognizes the obstacle, it may determine a driving pattern, such as going straight or turning, according to the recognized attributes of the obstacle.
  • the controller 150 may control to perform the avoidance driving in a different pattern based on the recognized property of the obstacle.
  • the controller 150 may control to avoid driving in different patterns according to the properties of obstacles such as non-hazardous obstacles (general obstacles), dangerous obstacles, and movable obstacles.
  • For example, the controller 150 may control the robot to bypass a dangerous obstacle while securing a longer safety distance.
  • The controller 150 may control the robot to perform the avoidance driving corresponding to a general obstacle or the avoidance driving corresponding to a dangerous obstacle, depending on the recognized attribute, and to travel accordingly.
  • the mobile robot 100 may perform obstacle recognition and avoidance based on machine learning.
  • More specifically, the controller 150 may include an obstacle recognition module 154 that recognizes, in an input image, obstacles previously learned by machine learning, and a driving control module 151 that controls the driving of the driving unit 160 based on the attributes of the recognized obstacle.
  • Although FIG. 5 illustrates an example in which the plurality of modules 151, 152, 153, and 154 are separately provided in the controller 150, the present invention is not limited thereto.
  • the position recognition module 153 and the obstacle recognition module 154 may be integrated into one recognizer and constitute one recognition module 155.
  • the recognizer may be trained using a learning technique such as machine learning, and the learned recognizer may recognize attributes of an area, an object, and the like by classifying data input thereafter.
  • the map generation module 152, the position recognition module 153, and the obstacle recognition module 154 may be configured as one integrated module.
  • In the following, the position recognition module 153 and the obstacle recognition module 154 are described with reference to an embodiment in which they are integrated into one recognizer and configured as one recognition module 155; however, they may operate in the same manner even when each is provided separately.
  • the mobile robot 100 may include a recognition module 155 in which attributes of objects and spaces are learned by machine learning.
  • Machine learning means that a computer learns from data and solves a problem on its own, without a person directly programming the logic into the computer.
  • Deep learning is a method of teaching a computer based on artificial neural networks (ANN).
  • the artificial neural network may be implemented in software or in the form of hardware such as a chip.
  • the recognition module 155 may include an artificial neural network (ANN) in the form of software or hardware in which properties of an object, such as an object of a space or an obstacle, are learned.
  • For example, the recognition module 155 may include a deep neural network (DNN) trained by deep learning, such as a convolutional neural network (CNN), a recurrent neural network (RNN), or a deep belief network (DBN).
  • The recognition module 155 may determine the attributes of spaces and objects included in input image data based on the weights between the nodes of the deep neural network (DNN).
  • the driving control module 151 may control the driving of the driving unit 160 based on the recognized space and the properties of the obstacle.
  • the recognition module 155 may recognize attributes of spaces and obstacles included in the selected specific viewpoint image based on data previously learned by machine learning.
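  • For illustration only, the following sketch shows inference with a tiny CNN of the kind described, classifying an extracted image region into obstacle/space attributes; the architecture, input size, and class count are assumptions, not the patent's trained network.

    import torch
    import torch.nn as nn

    class ObstacleNet(nn.Module):
        """Tiny CNN classifying an image crop into obstacle/space attributes."""
        def __init__(self, num_classes=4):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(32, num_classes)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    model = ObstacleNet().eval()
    with torch.no_grad():
        crop = torch.rand(1, 3, 64, 64)        # an extracted image region
        probs = model(crop).softmax(dim=1)     # determined by learned weights/biases
        attribute = int(probs.argmax(dim=1))   # index of the recognized attribute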
  • The storage unit 105 may store input data for determining the attributes of spaces and objects, and data for training the deep neural network (DNN).
  • the storage unit 105 may store the original image obtained by the image acquisition unit 120 and the extracted images from which the predetermined region is extracted.
  • The storage unit 105 may store the weights and biases forming the deep neural network (DNN).
  • In some embodiments, the weights and biases constituting the deep neural network structure may be stored in an embedded memory of the recognition module 155.
  • The recognition module 155 may perform a learning process using a predetermined image as training data each time the image acquisition unit 120 acquires an image or extracts a partial region of an image, or may perform the learning process after a predetermined number or more of images have been acquired.
  • the mobile robot 100 may receive data related to machine learning from the predetermined server through the communication unit 190.
  • the mobile robot 100 may update the recognition module 155 based on data related to machine learning received from the predetermined server.
  • FIG. 6 is a view referred to for describing learning using product data according to an embodiment of the present invention.
  • product data obtained by the operation of a predetermined device such as a mobile robot 100 may be transmitted to the server 70.
  • For example, the mobile robot 100 may transmit space-related, object-related, and usage-related data to the server 70.
  • Here, the space- and object-related data may be data related to the recognition of spaces and objects recognized by the mobile robot 100, or image data of spaces and objects acquired by the image acquisition unit 120.
  • The usage-related data (Data) is data obtained through the use of a predetermined product, for example the mobile robot 100, and may include usage history data, sensing data obtained from the sensor unit 170, and the like.
  • The controller 150 of the mobile robot 100, more specifically the recognition module 155, may be equipped with a deep neural network (DNN) structure such as a convolutional neural network (CNN).
  • the learned deep neural network structure DNN may receive input data for recognition, recognize a property of an object and a space included in the input data, and output the result.
  • The learned deep neural network structure may also receive the usage-related data (Data) of the mobile robot 100 as input, analyze and learn from it, and recognize the usage pattern, usage environment, and the like.
  • the space, object, and usage related data may be transmitted to the server 70 through the communication unit 190.
  • The server 70 may train a deep neural network (DNN) structure using training data and generate a configuration of learned weights.
  • The server 70 may transmit the updated deep neural network (DNN) structure data to the mobile robot 100 so that the mobile robot can be updated.
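  • This server-side cycle can be sketched as follows; train_step, export_weights, and send are assumed interfaces standing in for the actual training and update mechanism.

    def server_update_cycle(server_model, product_data, robot_link):
        """One server-side learning cycle: train on collected product data,
        then push the updated DNN weights back to the robot."""
        for batch in product_data:                 # space/object/usage-related data
            server_model.train_step(batch)         # assumed training helper
        weights = server_model.export_weights()    # learned weights and biases
        robot_link.send("dnn_update", weights)     # robot installs the new structure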
  • home appliance products such as mobile robot 100 may become smarter and provide an evolving user experience (UX) as they are used.
  • FIG. 7 is an example of a simplified internal block diagram of a server according to an embodiment of the present invention.
  • the server 70 may include a communication unit 720, a storage unit 730, a learning module 740, and a processor 710.
  • the processor 710 may control the overall operation of the server 70.
  • the server 70 may be a server operated by a home appliance manufacturer such as the mobile robot 100 or a server operated by a service provider, or may be a kind of cloud server.
  • The communication unit 720 may receive various data, such as status information, operation information, and manipulation information, from a portable terminal, a home appliance such as the mobile robot 100, a gateway, or the like.
  • the communication unit 720 may transmit data corresponding to the received various information to a portable terminal, a home appliance such as the mobile robot 100, a gateway, or the like.
  • the communication unit 720 may include one or more communication modules, such as an internet module and a mobile communication module.
  • The storage unit 730 may store the received information and hold data for generating result information corresponding to it.
  • the storage unit 730 may store data used for machine learning, result data, and the like.
  • the learning module 740 may serve as a learner of a home appliance such as the mobile robot 100.
  • The learning module 740 may include an artificial neural network, for example a deep neural network (DNN) such as a convolutional neural network (CNN), a recurrent neural network (RNN), or a deep belief network (DBN), and may train the deep neural network.
  • The processor 710 may control the artificial neural network structure of a home appliance, such as the mobile robot 100, to be updated to the learned artificial neural network structure after learning, according to a setting.
  • the learning module 740 may receive input data for recognition, recognize a property of an object and a space included in the input data, and output the result.
  • the communication unit 720 may transmit the recognition result to the mobile robot 100.
  • the learning module 740 may analyze and learn usage-related data of the mobile robot 100 to recognize a usage pattern, a usage environment, and the like, and output the result.
  • the communication unit 720 may transmit the recognition result to the mobile robot 100.
  • home appliance products such as the mobile robot 100 may receive a recognition result from the server 70 and operate by using the received recognition result.
  • As the server 70 becomes smarter by learning from product data, an evolving user experience (UX) can be provided as the home appliance product is used.
  • the mobile robot 100 and the server 70 may also use external information.
  • The mobile robot 100 and the server 70 can provide an excellent user experience by comprehensively using internal information, such as the spatial information, object information, and usage patterns of a specific home appliance product such as the mobile robot 100, together with external information obtained from other products or from other service servers linked to the server 70.
  • For example, the washing machine 32 may perform washing so that it finishes in accordance with the time at which the user arrives home.
  • the server 70 may perform voice recognition by receiving a voice input signal spoken by a user.
  • the server 70 may include a speech recognition module, and the speech recognition module may include an artificial neural network trained to perform speech recognition on input data and output a speech recognition result.
  • the server 70 may include a voice recognition server for voice recognition.
  • the voice recognition server may also include a plurality of servers that share a predetermined process of the voice recognition process.
  • For example, the speech recognition server may include an automatic speech recognition (ASR) server that receives voice data and converts the received voice data into text data, and a natural language processing (NLP) server that receives the text data from the automatic speech recognition server and analyzes it to determine the voice command.
  • In some embodiments, the speech recognition server may further include a text-to-speech (TTS) server that converts the text speech-recognition result output by the natural language processing server into voice data and transmits the voice data to another server or a home appliance.
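  • Putting the three servers together, a hypothetical pipeline might look like the sketch below; the transcribe, parse, and synthesize methods are placeholders, since the actual server interfaces are not specified.

    def handle_utterance(audio, asr_server, nlp_server, tts_server, robot):
        """Staged voice pipeline: ASR -> NLP -> command execution -> TTS reply."""
        text = asr_server.transcribe(audio)      # voice data -> text data
        command = nlp_server.parse(text)         # text data -> voice command
        result_text = robot.execute(command)     # e.g. "Starting cleaning."
        reply_audio = tts_server.synthesize(result_text)
        robot.play(reply_audio)                  # spoken feedback to the user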
  • the mobile robot 100 and / or the server 70 may perform voice recognition, so that a user voice may be used as an input for controlling the mobile robot 100.
  • In the related art, however, voice recognition is merely a means of control: the robot cleaner only executes recognized control commands. Consequently, insights such as which functions are used often and which areas are cleaned less have to be obtained by means other than voice.
  • On the other hand, the mobile robot 100 according to an embodiment of the present invention may utter speech, providing an interaction function that goes beyond simple device control.
  • the mobile robot 100 may provide a variety of active control functions to the user by actively providing information or outputting a voice recommending a function or service.
  • The mobile robot 100 can learn and understand the user's usage patterns. Accordingly, the mobile robot 100 may interact with the user while suggesting a predetermined function first, thereby operating more efficiently and in a more user-friendly way.
  • FIG. 8 is a flowchart illustrating a method for controlling a mobile robot according to an embodiment of the present invention
  • FIG. 9 is a view referred to for describing a method for controlling a mobile robot according to an embodiment of the present invention.
  • the mobile robot 100 may move according to a command or a setting and start cleaning (S810).
  • For example, the mobile robot 100 may move based on a navigation map or a simultaneous localization and mapping (SLAM) map stored in the storage unit 105, according to a cleaning start command or a cleaning reservation setting.
  • the mobile robot 100 may move and perform cleaning, and may store sensing data acquired by the sensor unit 170, image data obtained by the image acquisition unit 120, and the like in the storage unit 105.
  • the mobile robot 100 may store data obtained by performing cleaning, such as driving history data and cleaning history data, in the storage unit 105.
  • the data obtained by performing the cleaning may be a record of use of the mobile robot 100.
  • the number of normal cleaning completions, charging station return count, number of cleaning functions per area, operating time, average cleaning intensity per area, amount of dust for each area, cleaning mode for each area (cleaning, quick cleaning, general cleaning, etc.) Frequency and ratio, the number of times the function is used, such as monitoring, designated cleaning, intensive cleaning, cleaning with virtual wall (virtual wall), and at least one of the time-phase concentration of the above-mentioned record.
  • cleaning usage history such as designated cleaning, intensive cleaning, and cleaning with virtual walls
  • monitoring of a specified area recognizing moving objects, and photographing and transmitting the recognized objects to a specified device.
  • usage records can be used for functions the mobile robot can provide, such as home-guard and mapping functions, object search, and environmental sensing (temperature, humidity, air quality, and dust measurement), as well as for driving strategies.
  • the mobile robot 100 and/or the server 70 may analyze and learn the user's usage pattern of the mobile robot 100, spatial information for each area of the usage environment (e.g., a house), and information on objects existing in the usage environment; one possible shape of such a usage record is sketched below.
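  • purely as an illustration, the usage record described above could be represented by the following Python data structures; the field names are hypothetical and are not the storage schema of this application.

```python
# Hypothetical shape of the usage record described above.
from dataclasses import dataclass, field

@dataclass
class AreaUsageRecord:
    area_id: str
    cleaning_runs: int = 0               # cleaning runs performed in this area
    avg_cleaning_intensity: float = 0.0  # average cleaning intensity
    dust_amount: float = 0.0             # measured dust for this area
    mode_counts: dict = field(default_factory=dict)      # e.g. {"quick": 3}
    function_counts: dict = field(default_factory=dict)  # e.g. {"monitoring": 1}

@dataclass
class RobotUsageRecord:
    normal_cleaning_completions: int = 0
    charging_station_returns: int = 0
    operating_time_s: float = 0.0
    areas: dict = field(default_factory=dict)  # area_id -> AreaUsageRecord
```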
  • the mobile robot 100 may determine a special area based on existing data obtained in previous cleaning runs (S820).
  • the mobile robot 100 may identify the current location and space based on the image obtained from the image acquisition unit 120, and may determine whether a special area exists.
  • the mobile robot 100 may additionally check region-specific information of the corresponding space, based on the currently identified location and space information, to determine whether a special area is present.
  • the special area may be a dangerous area in which an inoperable state has occurred.
  • the dangerous area may be an area in which an inoperable situation has occurred, such as becoming stuck, trapped, falling, or powering off.
  • the special area may be a low-efficiency area in which the running speed or the cleaning rate is lower than the average value, or the number of rotations is greater than a reference number.
  • the low-efficiency area may correspond to an area driven below a predetermined average speed, an area with frequent turning, or an area where such motions occur more than a predetermined number of times; a sketch of this classification follows.
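  • as a minimal sketch, assuming numeric thresholds that this application states only qualitatively, the special-area determination could look like this:

```python
# Classify one area as dangerous, low-efficiency, or normal.
# Thresholds and dictionary keys are illustrative assumptions.
def classify_area(stats: dict, avg_speed: float,
                  failure_limit: int = 3, rotation_limit: int = 20) -> str:
    # Dangerous area: inoperable situations (stuck, trapped, fall,
    # power-off) occurred repeatedly in previous cleaning runs.
    if stats["inoperable_count"] >= failure_limit:
        return "dangerous"
    # Low-efficiency area: running speed below the average (the cleaning
    # rate could be checked analogously), or rotations above the reference.
    if stats["speed"] < avg_speed or stats["rotations"] > rotation_limit:
        return "low_efficiency"
    return "normal"
```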
  • the mobile robot 100 may output notification information about the special area (S840).
  • the sound output unit 181 of the mobile robot 100 may output a voice guidance message for the special area (S840). Accordingly, the user's attention can be drawn, and the user can naturally check the situation of the special area by turning or moving his or her head in the direction of the sound.
  • the existing data obtained from previous cleaning runs may further include obstacle information registered for the special area.
  • the voice guidance message spoken by the mobile robot 100 may include a guide message about an obstacle existing in the special area. Accordingly, the user may immediately determine the type of obstacle and the degree of danger, and instruct the next operation of the mobile robot 100.
  • the voice guidance message may include a message requesting confirmation of whether to clean the special area.
  • the voice guidance message may further include at least one of a description of the reason the area was determined to be a special area, or an example of a command the user can input by voice.
  • providing an example of a command that the user can input by voice can help the user speak the feedback voice accurately.
  • accordingly, the mobile robot 100 and/or the server 70 may recognize the user's voice command more quickly and accurately.
  • that is, the voice output can simultaneously inform the user why the area is a danger zone and what commands can be used.
  • for example, the mobile robot 100 may say, "This is an area where my operation stopped in the past. If you want it cleaned, please answer 'Cleaning.'"
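  • a sketch of composing such a message, combining the reason for the determination, an example command, and the escalation with repeated failures described further below, might look as follows; the message templates and the escalation rule are assumptions.

```python
# Build a voice guidance message: reason + example voice command.
def build_guidance_message(area_kind: str, failure_count: int) -> str:
    if area_kind == "dangerous":
        reason = "This is an area where my operation stopped in the past."
    else:
        reason = "This area is difficult for me to clean efficiently."
    # Guidance intensity may increase with the frequency of failures.
    if failure_count >= 3:
        reason = ("Master, come here. It's really hard for me to clean here. "
                  + reason)
    return reason + ' If you want it cleaned, please answer "Cleaning."'
```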
  • the mobile robot 100 may determine a special area such as a dangerous area or a low-efficiency area (S820), and upon arriving at the special area (S830), may first inquire whether or not to perform cleaning of the special area (S840).
  • the mobile robot 100, arriving at a danger zone 910 having many wires, may utter a voice guidance message 920 announcing the danger zone.
  • the voice guidance message 920 may include content 921 indicating that the wires are tangled and content 922 indicating that the area is difficult to clean.
  • when entering a special area, such as a cleaning danger zone in which a dangerous obstacle exists or a cleaning-failure zone having a cleaning-failure history, recognized through a plurality of cleaning runs, the mobile robot 100 can inform the user about the area before cleaning it.
  • the mobile robot 100 may notify the user, through speech, that the dangerous situation has been recognized when entering the dangerous section. Accordingly, the user's trust in and preference for the product can be increased.
  • when entering a special area, the mobile robot 100 may utter a voice guidance message such as "This area is difficult for me to clean," and the user can then check the area and remove the tangled wires or toys. Thereafter, the mobile robot 100 may safely perform the cleaning of the corresponding area.
  • the mobile robot 100 may repeat the voice guidance message it has just output.
  • the mobile robot 100 may record the places where situations of inability to travel, such as becoming restrained or powering off, occurred during cleaning, and may classify an area as a dangerous area when inability to travel occurs there more than a certain number of times.
  • the intensity of the voice guidance message may be increased according to the frequency of occurrence of the inability to travel.
  • for example, in a danger zone where it habitually becomes unable to travel, the mobile robot 100 may utter, "Master, come here. It's really hard for me to clean here. If you tidy up a little, I'll try harder."
  • in a section where operation is inefficient (the running speed or cleaning rate is lower than the average value, or the number of rotations is greater than the reference number), the mobile robot 100 may interact with the user by first asking whether the operation needs to be performed.
  • that is, the special area may be a dangerous area in which the movement of the mobile robot 100 is restricted, posing a safety risk and adversely affecting the cleaning of other areas, or a low-efficiency area in which it is difficult to perform cleaning.
  • the mobile robot 100 asks the user whether or not to clean the special area (S840); if user feedback is received (S850), it may operate according to the received feedback (S860).
  • more specifically, the mobile robot 100 may receive the user's voice feedback on the voice guidance message during a predetermined voice waiting time (S850).
  • the voice command included in the received voice feedback may then be identified, and an operation corresponding to the identified voice command may be performed (S860).
  • if the user's feedback response contains positive vocabulary such as "clean," "proceed," "continue," "okay," "uh-huh," or "yes," it may be determined that the user wants the area cleaned.
  • if the user's feedback response contains negative vocabulary such as "do not clean," it may be determined that the user does not want the area cleaned.
  • the mobile robot 100 may itself identify the voice command in the user's voice feedback.
  • for this purpose, the mobile robot 100 may include an artificial neural network trained to recognize a voice included in input data, and the artificial neural network may recognize the voice command included in the received voice feedback.
  • alternatively, identification of the voice command in the user's voice feedback may be performed by the server 70.
  • in this case, identifying the voice command included in the received voice feedback may include transmitting the received voice feedback data to a voice recognition server including an artificial neural network trained to recognize a voice included in input data, and receiving a voice recognition result for the transmitted voice feedback data from the voice recognition server.
  • the voice recognition server may be configured as part of the server 70, or may be configured as a dedicated speech recognition server separate from the server 70.
  • alternatively, the recognition operation may be performed in stages by the mobile robot 100 and the server 70.
  • for example, the mobile robot 100 may attempt recognition first, and if it fails to recognize the voice command, the server 70 may be used.
  • in this case, identifying the voice command may include: if the received voice feedback includes a predetermined keyword, identifying the voice command corresponding to the keyword; and if the received voice feedback does not include a predetermined keyword, transmitting the received voice feedback data to a voice recognition server including an artificial neural network trained to recognize a voice included in input data, and receiving a voice recognition result for the transmitted voice feedback data from the voice recognition server.
  • that is, the mobile robot 100 may recognize a simple keyword, such as a positive vocabulary item, a negative vocabulary item, or a specific word given in the command example, and operate according to the recognized keyword.
  • otherwise, the mobile robot 100 may transmit the user's voice feedback data to the server 70 and receive a voice recognition result from the server 70.
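  • a minimal sketch of this staged scheme follows; the keyword vocabularies and the fallback call are assumptions, and in the described system the raw voice data, rather than text, would be sent to the server.

```python
# On-device keyword matching with a server-side fallback.
import requests

NEGATIVE = {"do not clean", "don't clean", "stop"}
POSITIVE = {"clean", "proceed", "continue", "okay", "yes"}

def identify_feedback(utterance: str, server_url: str) -> str:
    text = utterance.strip().lower()
    # Check negative vocabulary first so that "do not clean"
    # is not accidentally matched by the positive keyword "clean".
    if any(word in text for word in NEGATIVE):
        return "avoid"
    if any(word in text for word in POSITIVE):
        return "clean"
    # No preset keyword found: fall back to the speech-recognition
    # server containing the trained artificial neural network.
    result = requests.post(server_url, json={"utterance": text}).json()
    return result.get("command", "avoid")  # default: avoidance (S870)
```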
  • if no user feedback is received within the voice waiting time (S850), the mobile robot 100 may perform a predetermined avoidance operation (S870).
  • alternatively, the mobile robot 100 may be set to forcibly perform the cleaning.
  • the mobile robot 100 may also be set to only clean at first, and to start speaking to the user after a predetermined time has elapsed.
  • the mobile robot 100 may perform deep learning on data obtained from a plurality of cleaning runs, recognize the situation or location of a dangerous obstacle through the deep learning, and explain the corresponding situation to the user.
  • cleaning efficiency can also be improved by cleaning or skipping an area according to the user's voice input.
  • FIG. 10 is a flowchart illustrating a control method of a mobile robot according to an embodiment of the present invention.
  • the mobile robot 100 may receive a voice command of a user (S1010).
  • a conventional voice recognition mobile robot merely recognizes a received voice command and immediately performs the corresponding operation.
  • in contrast, upon receiving the user's voice command (S1010), the mobile robot 100 may check stored existing data such as the usage record (S1020).
  • the usage record may include at least one of: the number of general cleaning completions, the charging-station return count, the number of cleaning runs per area, the operating time, the average cleaning intensity per area, the amount of dust per area, the frequency and ratio of use of each cleaning mode per area, and the number of times each cleaning function and additional function is used.
  • the time-of-day concentration of the items included in the above usage record may further be included.
  • a usage record of other functions, as well as of the cleaning function performed by the mobile robot 100, may be stored and checked later.
  • as described above, this usage record may include cleaning history such as designated cleaning, intensive cleaning, and cleaning with virtual walls, as well as monitoring of specified areas, and may serve home-guard, mapping, object-search, and environmental-sensing functions and driving strategies; the mobile robot 100 and/or the server 70 may analyze and learn the user's usage pattern of the mobile robot 100, spatial information for each area of the usage environment, and information on objects existing in the usage environment.
  • based on the usage record, the mobile robot 100 may output a voice guidance message recommending at least one of the functions used fewer times than a reference number (S1030).
  • that is, the mobile robot 100 may induce the user to use other functions by proposing, by voice, functions that the user has not normally used.
  • in addition, the mobile robot 100 may recommend a predetermined function or provide information to a user located nearby before performing the command corresponding to the user's voice, allowing natural interaction.
  • the mobile robot 100 may provide an improved cleaning service by first asking about functions and cases it considers necessary for the user.
  • to this end, the controller 150 may select at least one of the functions used fewer times than the reference number according to a preset priority, and control the sound output unit 181 to output a voice guidance message recommending the selected function.
  • the preset priority may follow the importance of each function. For example, the mapping function may be ranked first (for when the SLAM success rate decreases), cleaning of uncleaned areas second, scheduled functions such as missed cleaning and home guard third, and the virtual-wall and danger-zone setting function for uncleaned areas fourth. In this case, the function having the highest priority among the functions the user does not frequently use may be recommended.
  • alternatively, the preset priority may be the order of increasing number of uses. In other words, a function that is unused or rarely used can be recommended. In the case of functions with the same number of uses, one may be recommended at random, or a predetermined function pool may be set and the functions included in the pool recommended preferentially.
  • the controller 150 may determine the recommended function by checking the necessary data according to the preset priority.
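  • a minimal sketch of such a selection, assuming an illustrative priority table and reference count:

```python
# Recommend the highest-priority function among rarely used ones.
PRIORITY = [
    "mapping",                    # e.g. when the SLAM success rate decreases
    "uncleaned_area_cleaning",
    "scheduled_home_guard",       # missed cleaning / home guard by schedule
    "virtual_wall_danger_zone",
]  # index 0 = highest importance

def recommend_function(usage_counts: dict, reference_count: int = 2):
    rarely_used = [f for f in PRIORITY
                   if usage_counts.get(f, 0) < reference_count]
    return rarely_used[0] if rarely_used else None
```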
  • the mobile robot 100 and/or the server 70 may also identify the user's voice command.
  • a function associated with the identified voice command of the user may then be selected and recommended.
  • for this purpose, the mobile robot 100 and/or the server 70 may recognize the received voice command before or after checking the usage record (S1020).
  • a similar cleaning function or a helpful function may be recommended based on the control command determined by voice recognition. For example, if the user's voice command is determined to be a meticulous-cleaning command, a function of intensively cleaning local areas with a large amount of dust may be recommended; if the user's voice command is determined to be a quick-cleaning command, skipping such local areas may be recommended.
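  • one possible (hypothetical) mapping from the recognized command to a related recommendation:

```python
# Hypothetical command-to-recommendation mapping for the example above.
RELATED = {
    "meticulous_cleaning": "intensive_cleaning_of_dusty_areas",
    "quick_cleaning": "skip_dusty_local_areas",
}

def related_recommendation(command: str):
    return RELATED.get(command)  # None when no related function is defined
```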
  • when user voice feedback on the recommendation is received, the feedback voice command included in the received voice feedback is identified, and a predetermined function may be set based on the identified feedback voice command (S1050).
  • in addition, the mobile robot 100 may perform an operation corresponding to the user's voice command identified in response to the reception of the user's voice command (S1010) (S1060).
  • as described above for the special-area feedback, the voice command included in the received voice feedback may be identified by the mobile robot 100 itself, using an on-device artificial neural network trained to recognize a voice included in input data, or may be identified by transmitting the received voice feedback data to a voice recognition server including such a trained artificial neural network and receiving the voice recognition result from the server 70.
  • in this way, the mobile robot 100 can recognize and analyze the user's usage pattern and converse with the user while first suggesting functions that are not normally used.
  • in addition, when the mobile robot 100 determines a better cleaning strategy (cleaning mode, schedule), it may recommend the determined cleaning strategy by voice.
  • the function can then be set and used according to the user's feedback voice input.
  • FIG. 11 is a flowchart illustrating a control method of a mobile robot according to an embodiment of the present invention
  • FIG. 12 is a view referred to for describing a control method of a mobile robot according to an embodiment of the present invention.
  • the mobile robot 100 may recognize that a front door is opened (S1110).
  • the opening of the front door may be recognized based on a signal received from another device, such as a sensor attached to the front door, or from a server.
  • the mobile robot 100 may move to the front door (S1120) and recognize the user (S1130).
  • user recognition may be performed by recognizing an image acquired through the image acquisition unit 120, or by means of another related home appliance or server.
  • the mobile robot 100 may then briefly report, by voice, the operations it has performed (S1140).
  • the mobile robot 100 may approach a user entering through the front door and output, through the sound output unit 181, a voice guidance message 1210 briefing the user on the operations performed in the user's absence.
  • that is, the mobile robot 100 may greet the user at the front door and provide a voice briefing; after the briefing, when there is a lot of fine dust or upon the user's request, the mobile robot 100 may proceed to clean the surroundings.
  • furthermore, the mobile robot 100 may brief the user on the operating status and results, during the user's absence, of the home appliances included in the home appliance network system connected by Wi-Fi communication.
  • for example, the mobile robot 100 may proactively report the day's operation records of those home appliances.
  • the mobile robot 100 may also follow the user's movement and provide the voice briefing while following.
  • for example, the mobile robot 100 may recognize a user who has entered the home, then follow the user and talk about the day's cleaning record.
  • in addition, the mobile robot 100 may operate in a user-following driving mode in response to the user's speech, such as "Roboking, come!"
  • the mobile robot 100 may also provide the user with predetermined data, such as a picture of an uncleaned area that could not be cleaned while cleaning the house in the user's absence.
  • in this way, the mobile robot 100 may behave like a 'pet animal', making the interaction pleasant and fun for the user.
  • user recognition may be attempted until a failure criterion set by time or number of attempts is met (S1150).
  • if user recognition fails, a predetermined operation, such as returning to the charging stand to wait or performing home guard, may be carried out (S1160).
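  • an event-driven sketch of this front-door flow (S1110 to S1160) follows; the robot device API used here (move_to, recognize_user, speak, daily_briefing) is a hypothetical placeholder, not an interface disclosed in this application.

```python
import time

def on_front_door_opened(robot, max_attempts: int = 5,
                         retry_delay_s: float = 2.0) -> None:
    robot.move_to("front_door")                  # S1120: move to the door
    for _ in range(max_attempts):                # S1130/S1150: try recognition
        if robot.recognize_user() is not None:
            robot.speak(robot.daily_briefing())  # S1140: voice briefing
            return
        time.sleep(retry_delay_s)
    # Recognition failed: return to the charging stand to wait,
    # or perform home guard (S1160).
    robot.move_to("charging_stand")
```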
  • FIG. 13 is a flowchart illustrating a control method of a mobile robot according to an embodiment of the present invention
  • FIG. 14 is a view referred to for describing a control method of a mobile robot according to an embodiment of the present invention.
  • the mobile robot 100 may receive predetermined data from another device included in the home appliance network system (S1310).
  • the air cleaner 200 may acquire dust or air-quality index data for each space/area through an indoor air sensor.
  • the air cleaner 200 may transfer the dust or air-quality index data to the mobile robot 100 and share it with the mobile robot 100.
  • in addition, the air cleaner 200 may detect generated dust or detect a space in which many people gather.
  • in this case, the air cleaner 200 may start operating and transmit dust-generation position data to the mobile robot 100.
  • the mobile robot 100 may establish and recommend a cleaning plan based on the data received from the air cleaner 200 and previously learned data (S1320).
  • for example, the mobile robot 100 may guide the user by recommending intensive cleaning of a dusty space and recommending a change in the airflow direction of the air cleaner 200.
  • the mobile robot 100 may also propose to the user a cleaning plan for a later time, considering the degree of dust generation, the number of people detected, and so on. At this time, the mobile robot 100 may utter a voice guidance message 1410 including the current plan, a cleaning plan for one hour later, and the like.
  • the mobile robot 100 may then proceed with the cleaning depending on the user's response.
  • for example, depending on the user's response, the mobile robot 100 may start cleaning two hours later.
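  • a sketch of building such a plan from the shared data; the thresholds and the deferral rule are assumptions, and the application gives the one-hour and two-hour figures only as examples.

```python
# Propose a cleaning plan from air-cleaner data and simple heuristics.
def propose_cleaning_plan(dust_by_area: dict, people_detected: int,
                          dust_threshold: float = 50.0):
    dusty = [area for area, dust in dust_by_area.items()
             if dust > dust_threshold]
    if not dusty:
        return None  # nothing to propose
    # Defer cleaning while people are gathered; dustiest areas first.
    return {
        "areas": sorted(dusty, key=dust_by_area.get, reverse=True),
        "start_in_hours": 1 if people_detected > 0 else 0,
    }
```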
  • in this way, the home appliance first detects a situation that requires cleaning and suggests it to the user, thereby increasing product reliability.
  • the mobile robot may speak to the user, and may communicate and interact with the user through voice.
  • the mobile robot may proactively provide information and recommend services, functions, and the like before being requested, thereby increasing the user's trust, preference, and product utilization.
  • in addition, the user can easily set up and use associated functions without additional effort.
  • speech recognition may be performed by the mobile robot itself, by the server, or by the mobile robot and the server in stages, thereby enabling effective speech recognition.
  • an evolving user experience may be provided.
  • the mobile robot according to the present invention is not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.
  • meanwhile, the control method of the mobile robot can be implemented as processor-readable code on a processor-readable recording medium.
  • the processor-readable recording medium includes all kinds of recording devices that store data readable by a processor. Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and it may also be implemented in the form of a carrier wave, such as transmission over the Internet.
  • the processor-readable recording medium can also be distributed over network coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

According to one aspect, the present invention relates to a mobile robot capable of providing speech utterances to a user, so as to converse and interact with the user through speech. The mobile robot can proactively provide information and recommend services, functions, and the like before receiving a request, and can thereby achieve increased levels of user trust, preference, and product utilization.
PCT/KR2018/011837 2018-08-27 2018-10-08 Procédé de commande de robot mobile WO2020045732A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020180100647A KR102290983B1 (ko) 2018-08-27 2018-08-27 인공지능 이동 로봇의 제어 방법
KR10-2018-0100647 2018-08-27

Publications (1)

Publication Number Publication Date
WO2020045732A1 true WO2020045732A1 (fr) 2020-03-05

Family

ID=69644447

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/011837 WO2020045732A1 (fr) 2018-08-27 2018-10-08 Procédé de commande de robot mobile

Country Status (2)

Country Link
KR (1) KR102290983B1 (fr)
WO (1) WO2020045732A1 (fr)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024075949A1 (fr) * 2022-10-07 2024-04-11 삼성전자 주식회사 Procédé de commande de dispositif électronique à l'aide d'informations spatiales et dispositif électronique utilisant des informations spatiales


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100752098B1 (ko) * 2006-03-07 2007-08-24 한국과학기술연구원 신경망 기반 로봇 시스템

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040042242A (ko) * 2002-11-13 2004-05-20 삼성전자주식회사 홈서버를 이용하는 홈로봇 및 이를 포함하는 홈네트워크시스템
KR20060131458A (ko) * 2005-06-16 2006-12-20 에스케이 텔레콤주식회사 이동 로봇과 사용자간의 상호 작용 방법 및 이를 위한시스템
KR20160021991A (ko) * 2014-08-19 2016-02-29 삼성전자주식회사 청소 로봇, 청소 로봇의 제어 장치, 제어 시스템, 및 제어 방법
WO2017051627A1 (fr) * 2015-09-24 2017-03-30 シャープ株式会社 Appareil de production de parole et procédé de production de parole
KR20180079824A (ko) * 2017-01-02 2018-07-11 엘지전자 주식회사 홈 로봇 및 그 동작 방법

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11580972B2 (en) * 2019-04-26 2023-02-14 Fanuc Corporation Robot teaching device
CN111590571A (zh) * 2020-05-15 2020-08-28 深圳国信泰富科技有限公司 一种机器人安全控制系统
CN111590571B (zh) * 2020-05-15 2022-03-04 深圳国信泰富科技有限公司 一种机器人安全控制系统
CN113793602A (zh) * 2021-08-24 2021-12-14 北京数美时代科技有限公司 一种未成年人的音频识别方法和系统
CN114453852A (zh) * 2022-02-16 2022-05-10 上海海事大学 基于语音识别控制机械臂进行叶片装配的方法和系统

Also Published As

Publication number Publication date
KR20200027072A (ko) 2020-03-12
KR102290983B1 (ko) 2021-08-17

Similar Documents

Publication Publication Date Title
WO2020045732A1 (fr) Procédé de commande de robot mobile
AU2019334724B2 (en) Plurality of autonomous mobile robots and controlling method for the same
AU2019262468B2 (en) A plurality of robot cleaner and a controlling method for the same
AU2019262467B2 (en) A plurality of robot cleaner and a controlling method for the same
WO2019083291A1 (fr) Robot mobile à intelligence artificielle qui apprend des obstacles, et son procédé de commande
WO2019212239A1 (fr) Pluralité de robots nettoyeurs et leur procédé de commande
WO2014175605A1 (fr) Robot de nettoyage, appareil de surveillance domestique, et procédé pour commander un robot de nettoyage
WO2021006677A2 (fr) Robot mobile faisant appel à l'intelligence artificielle et son procédé de commande
WO2019212240A1 (fr) Pluralité de robots nettoyeurs et leur procédé de commande
WO2021006542A1 (fr) Robot mobile faisant appel à l'intelligence artificielle et son procédé de commande
WO2020050566A1 (fr) Pluralité de robots mobiles autonomes et procédé de commande de tels robots mobiles autonomes
WO2021029457A1 (fr) Serveur d'intelligence artificielle et procédé permettant de fournir des informations à un utilisateur
WO2018117616A1 (fr) Robot mobile
WO2020246647A1 (fr) Dispositif d'intelligence artificielle permettant de gérer le fonctionnement d'un système d'intelligence artificielle, et son procédé
WO2020246640A1 (fr) Dispositif d'intelligence artificielle pour déterminer l'emplacement d'un utilisateur et procédé associé
WO2019004742A1 (fr) Système de robot comprenant un robot mobile et un terminal mobile
WO2022075610A1 (fr) Système de robot mobile
WO2019004773A1 (fr) Terminal mobile et système de robot comprenant ledit terminal mobile
WO2020251101A1 (fr) Dispositif d'intelligence artificielle pour déterminer un trajet de déplacement d'un utilisateur, et procédé associé
WO2020122540A1 (fr) Robot nettoyeur et son procédé de fonctionnement
WO2019177418A1 (fr) Robot mobile et son procédé de commande
WO2020022622A1 (fr) Procédé de commande d'un robot mobile à intelligence artificielle
WO2022075616A1 (fr) Système de robot mobile
WO2020022621A1 (fr) Procédé de commande de robot mobile à intelligence artificielle
WO2020027406A1 (fr) Robot mobile à intelligence artificielle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18931811

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18931811

Country of ref document: EP

Kind code of ref document: A1