WO2021149879A1 - Method for controlling a cleaning robot according to the material of the contact surface

Method for controlling a cleaning robot according to the material of the contact surface

Info

Publication number
WO2021149879A1
Authority
WO
WIPO (PCT)
Prior art keywords
contact surface
cleaning
group
information
management method
Prior art date
Application number
PCT/KR2020/006990
Other languages
English (en)
Korean (ko)
Inventor
김태현
채종훈
Original Assignee
엘지전자 주식회사
Priority date
Filing date
Publication date
Application filed by 엘지전자 주식회사
Publication of WO2021149879A1

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2805 Parameters or conditions being sensed
    • A47L9/2836 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means characterised by the parts which are controlled
    • A47L9/2842 Suction motors or blowers
    • A47L9/2852 Elements for displacement of the vacuum cleaner or the accessories therefor, e.g. wheels, casters or nozzles
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection
    • A47L2201/06 Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Definitions

  • the present specification relates to a cleaning robot control method according to the material of the contact surface.
  • the cleaning robot travels along the driving route included in the map data and can remove pollutants located on the route. Meanwhile, the cleaning robot needs to set different cleaning methods according to the state of the cleaning target surface or the contact surface to be cleaned.
  • the cleaning robot needs to classify the cleaning areas or set the priority of cleaning according to the state of the surface to be cleaned or the contact surface to be cleaned.
  • an object of the present specification is to implement a cleaning area management method and a robot cleaner capable of grouping cleaning areas according to a predetermined criterion.
  • an object of the present specification is to realize a cleaning area management method and a robot cleaner for recognizing the structure and shape of a house and obstacles and creating an effective cleaning path.
  • an object of the present specification is to implement a cleaning area management method and a robot cleaner capable of performing a cleaning operation in the same mode on a similar contact surface (or cleaning surface).
  • a cleaning area management method includes: receiving data related to a contact surface in a plurality of indoor areas; storing data related to the contact surface for each area; generating material (or texture) information of the contact surface for each area based on the data related to the contact surface; and classifying the plurality of regions into at least one group based on the similarity of the material information generated for each region.
  • the data related to the contact surface may include a contour, a color, a current change of a driving motor, or a noise pattern of the contact surface.
  • the material information may include a type of the contact surface, an absorption degree, or a resistance degree.
  • the generating of the material information may include setting features extracted from the pattern and color of the contact surface as an input of a first model, and generating information on the type of the contact surface (contact surface type information) based on the output of the first model.
  • the generating of the material information may include setting a feature extracted from the noise pattern as an input of a second model, and generating information on the degree of adhesion between the contact surface and the suction port (absorption information) based on the output of the second model.
  • the generating of the material information may include setting a feature extracted from a change in current of the driving motor as an input of a third model, and generating resistance information of the contact surface based on the output of the third model.
  • classifying the plurality of regions into at least one group may include classifying the plurality of regions into at least one group using a K-means algorithm or a Mahalanobis distance.
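The following is a minimal, illustrative sketch (not the claimed implementation) of the grouping step: per-area material information vectors are clustered with K-means so that areas with similar contact surfaces fall into the same group. The area names, the feature layout ([type, absorption, resistance]), and the use of scikit-learn are assumptions for illustration; a Mahalanobis-distance criterion could be substituted for the Euclidean metric used by K-means.

```python
# Illustrative sketch: grouping indoor areas by similarity of their material information
# using K-means. Feature names and values below are hypothetical.
import numpy as np
from sklearn.cluster import KMeans

# material information per area: [surface-type code, absorption degree, resistance degree]
material_info = {
    "living_room": [0.0, 0.20, 0.10],  # e.g., hard floor: low absorption, low resistance
    "bedroom":     [1.0, 0.70, 0.60],  # e.g., carpet: high absorption, high resistance
    "kitchen":     [0.0, 0.25, 0.15],
    "study":       [1.0, 0.65, 0.55],
}

areas = list(material_info.keys())
X = np.array([material_info[a] for a in areas])

# classify the areas into groups of similar contact-surface material
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
groups = {}
for area, label in zip(areas, kmeans.labels_):
    groups.setdefault(int(label), []).append(area)

print(groups)  # areas with similar material information fall into the same group
```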
  • the method may further include determining the priority of cleaning by comparing the material information of the at least one group, and providing a driving route for performing cleaning according to the priority.
  • the priority may be determined by giving a higher weight to the resistance among the material information of the group.
  • the priority may be determined by giving a higher weight to the type of the contact surface among the material information of the group.
  • the priority may be determined by giving a higher weight to the adhesion degree among the material information of the group.
  • the value of the material information of the at least one group may be an average of the values of the material information of each of the plurality of regions constituting the at least one group.
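As an illustration of the two points above, the sketch below takes each group's material value as the mean of its regions' values and ranks the groups by a weighted score; the specific weights (with resistance weighted highest) and the data are hypothetical.

```python
# Illustrative sketch: a group's material information is the average of its regions' values,
# and cleaning priority is ranked by a weighted score (hypothetical weights).
import numpy as np

group_regions = {
    "group_A": [[1.0, 0.70, 0.60], [1.0, 0.65, 0.55]],  # per-region [type, absorption, resistance]
    "group_B": [[0.0, 0.20, 0.10], [0.0, 0.25, 0.15]],
}

weights = np.array([0.2, 0.3, 0.5])  # higher weight on resistance (hypothetical choice)

group_values = {g: np.mean(np.array(r), axis=0) for g, r in group_regions.items()}
priority = sorted(group_values, key=lambda g: float(weights @ group_values[g]), reverse=True)

print(priority)  # groups in descending order of weighted score, i.e., the cleaning order
```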
  • the method may further include controlling to change the suction force or the driving mode when the region corresponding to the at least one group is reached.
  • the suction force may increase in proportion to the resistance among the material information of the at least one group.
  • when the degree of adhesion among the material information of the at least one group reaches a predetermined first threshold or more, the suction force may be decreased until the adhesion falls to a second threshold or less.
  • the first and second threshold values may be different from each other.
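One possible reading of the suction-force rules above is a simple hysteresis controller: suction grows with the group's resistance value, and when adhesion reaches the first (upper) threshold the suction is stepped down until adhesion falls below the second (lower) threshold. All constants and the per-step reduction below are hypothetical.

```python
# Illustrative control sketch: suction rises with resistance; adhesion hysteresis between
# an upper and a lower threshold temporarily reduces suction. Constants are hypothetical.
BASE_SUCTION = 40.0      # percent of maximum
RESISTANCE_GAIN = 50.0   # extra suction per unit of resistance
ADHESION_HIGH = 0.8      # first threshold: start reducing suction
ADHESION_LOW = 0.5       # second threshold: stop reducing suction

def suction_command(resistance: float, adhesion: float, current: float, reducing: bool):
    """Return (new suction %, reducing flag) for one control step."""
    target = min(100.0, BASE_SUCTION + RESISTANCE_GAIN * resistance)
    if adhesion >= ADHESION_HIGH:
        reducing = True
    elif adhesion <= ADHESION_LOW:
        reducing = False
    if reducing:
        return max(0.0, current - 5.0), reducing  # step suction down while the port sticks
    return target, reducing

suction, reducing = BASE_SUCTION, False
suction, reducing = suction_command(resistance=0.6, adhesion=0.85, current=suction, reducing=reducing)
```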
  • a robot cleaner includes: a sensor for receiving data related to a contact surface in a plurality of areas in the room; a memory for storing data related to the contact surface for each area; and a control unit generating material information of the contact surface for each area based on the data related to the contact surface, wherein the control unit may classify the plurality of areas into at least one group based on the similarity of the material information generated for each area.
  • the present specification may group cleaning areas by predetermined criteria.
  • the present specification can recognize the structure and shape of the house and obstacles and create an effective cleaning path.
  • according to the present specification, a cleaning operation may be performed in the same mode on similar contact surfaces (or cleaning surfaces).
  • FIG. 1 illustrates a block diagram of a wireless communication system to which the methods proposed in the present specification can be applied.
  • FIG. 2 shows an example of a signal transmission/reception method in a wireless communication system.
  • FIG. 3 shows an example of basic operations of a user terminal and a 5G network in a 5G communication system.
  • FIG. 4 is a diagram illustrating a block diagram of an electronic device.
  • FIG. 5 shows a schematic block diagram of an AI server according to an embodiment of the present specification.
  • FIG. 6 is a schematic block diagram of an AI device according to another embodiment of the present specification.
  • FIG. 7 is a conceptual diagram illustrating an embodiment of an AI device.
  • FIGS. 8 and 9 are views for explaining the appearance of the robot cleaner used in an embodiment of the present specification.
  • FIG. 10 is an exemplary block diagram of a robot cleaner used in an embodiment of the present specification.
  • FIG. 11 is a view illustrating a running process of a robot cleaner running on cleaning surfaces of various materials.
  • FIG. 12 is a diagram illustrating a running process of a robot cleaner that acquires an image of a cleaning surface.
  • FIGS. 13 and 14 are flowcharts of a cleaning area management method according to an embodiment of the present specification.
  • FIGS. 15 and 16 are diagrams for explaining a cleaning area management method by way of example.
  • FIG. 17 is a flowchart of a cleaning area management method according to the generation of a new cleaning area.
  • FIG. 18 is a diagram for explaining FIG. 17 by way of example.
  • FIG. 1 illustrates a block diagram of a wireless communication system to which the methods proposed in the present specification can be applied.
  • a device (AI device) including an AI module may be defined as a first communication device ( 910 in FIG. 1 ), and a processor 911 may perform detailed AI operations.
  • a 5G network including another device (AI server) that communicates with the AI device may be defined as a second communication device ( 920 in FIG. 1 ), and a processor 921 may perform detailed AI operations.
  • the 5G network may be represented as the first communication device, and the AI device may be represented as the second communication device.
  • the first communication device or the second communication device may include a base station, a network node, a transmitting terminal, a receiving terminal, a wireless device, a wireless communication device, a vehicle, a vehicle equipped with an autonomous driving function, a connected car, a drone (Unmanned Aerial Vehicle, UAV), an AI (Artificial Intelligence) module, a robot, an AR (Augmented Reality) device, a VR (Virtual Reality) device, an MR (Mixed Reality) device, a hologram device, a public safety device, an MTC device, an IoT device, a medical device, a fintech device (or financial device), a security device, a climate/environmental device, a device related to 5G services, or another device related to the fourth industrial revolution field.
  • a terminal or user equipment (UE) may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation system, a slate PC, a tablet PC, an ultrabook, and a wearable device (e.g., a watch-type terminal (smartwatch), a glass-type terminal (smart glasses), or a head mounted display (HMD)).
  • the HMD may be a display device worn on the head.
  • an HMD may be used to implement VR, AR or MR.
  • the drone may be a flying vehicle that carries no person and flies by a wireless control signal.
  • the VR device may include a device that implements an object or a background of a virtual world.
  • the AR device may include a device that implements by connecting an object or background in the virtual world to an object or background in the real world.
  • the MR device may include a device that implements a virtual world object or background by fusion with a real world object or background.
  • the hologram device may include a device for realizing a 360-degree stereoscopic image by recording and reproducing stereoscopic information by utilizing an interference phenomenon of light generated by the meeting of two laser beams called holography.
  • the public safety device may include an image relay device or an image device that can be worn on a user's body.
  • the MTC device and the IoT device may be devices that do not require direct human intervention or manipulation.
  • the MTC device and the IoT device may include a smart meter, a vending machine, a thermometer, a smart light bulb, a door lock, or various sensors.
  • a medical device may be a device used for the purpose of diagnosing, curing, alleviating, treating, or preventing a disease.
  • a medical device may be a device used for the purpose of diagnosing, treating, alleviating or correcting an injury or disorder.
  • a medical device may be a device used for the purpose of examining, replacing, or modifying structure or function.
  • the medical device may be a device used for the purpose of controlling pregnancy.
  • the medical device may include a medical device, a surgical device, an (in vitro) diagnostic device, a hearing aid, or a device for a procedure.
  • the security device may be a device installed to prevent a risk that may occur and to maintain safety.
  • the security device may be a camera, CCTV, recorder or black box.
  • the fintech device may be a device capable of providing financial services such as mobile payment.
  • a first communication device 910 and a second communication device 920 include processors 911 and 921, memories 914 and 924, one or more Tx/Rx RF modules (radio frequency modules) 915 and 925, Tx processors 912 and 922, Rx processors 913 and 923, and antennas 916 and 926. Tx/Rx modules are also called transceivers. Each Tx/Rx module 915 transmits a signal via a respective antenna 916.
  • the processor implements the functions, processes and/or methods described above.
  • the processor 921 may be associated with a memory 924 that stores program code and data. Memory may be referred to as a computer-readable medium.
  • the transmit (TX) processor 912 implements various signal processing functions for the L1 layer (ie, the physical layer).
  • the receive (RX) processor implements the various signal processing functions of L1 (ie, the physical layer).
  • the UL (second communication device to first communication device) is handled in the first communication device 910 in a manner similar to that described with respect to the receiver function in the second communication device 920 .
  • Each Tx/Rx module 925 receives a signal via a respective antenna 926 .
  • Each Tx/Rx module provides an RF carrier and information to the RX processor 923 .
  • the processor 921 may be associated with a memory 924 that stores program code and data. Memory may be referred to as a computer-readable medium.
  • FIG. 2 is a diagram illustrating an example of a signal transmission/reception method in a wireless communication system.
  • the UE performs an initial cell search operation such as synchronizing with the BS when the power is turned on or a new cell is entered ( S201 ).
  • the UE receives a primary synchronization channel (P-SCH) and a secondary synchronization channel (S-SCH) from the BS, synchronizes with the BS, and may acquire information such as a cell ID.
  • the P-SCH and the S-SCH are called a primary synchronization signal (PSS) and a secondary synchronization signal (SSS), respectively.
  • the UE may receive a physical broadcast channel (PBCH) from the BS to obtain broadcast information in the cell.
  • the UE may check the downlink channel state by receiving a downlink reference signal (DL RS) in the initial cell search step.
  • the UE may receive a physical downlink control channel (PDCCH) and a physical downlink shared channel (PDSCH) according to information carried on the PDCCH to obtain more specific system information (S202).
  • the UE may perform a random access procedure (RACH) to the BS (steps S203 to S206).
  • the UE transmits a specific sequence as a preamble through a physical random access channel (PRACH) (S203 and S205), and may receive a random access response (RAR) message to the preamble through the PDCCH and the corresponding PDSCH (S204 and S206).
  • a contention resolution procedure may be additionally performed.
  • as a general uplink/downlink signal transmission process, the UE may then perform PDCCH/PDSCH reception (S207) and physical uplink shared channel (PUSCH)/physical uplink control channel (PUCCH) transmission (S208).
  • the UE receives downlink control information (DCI) through the PDCCH.
  • the UE monitors a set of PDCCH candidates in monitoring occasions set in one or more control resource sets (CORESETs) on a serving cell according to corresponding search space configurations.
  • the set of PDCCH candidates to be monitored by the UE is defined in terms of search space sets, which may be a common search space set or a UE-specific search space set.
  • the CORESET consists of a set of (physical) resource blocks with a time duration of 1 to 3 OFDM symbols.
  • the network may configure the UE to have multiple CORESETs.
  • the UE monitors PDCCH candidates in one or more search space sets. Here, monitoring means trying to decode PDCCH candidate(s) in the search space. If the UE succeeds in decoding one of the PDCCH candidates in the search space, the UE determines that the PDCCH is detected in the corresponding PDCCH candidate, and performs PDSCH reception or PUSCH transmission based on the DCI in the detected PDCCH.
  • the PDCCH may be used to schedule DL transmissions on PDSCH and UL transmissions on PUSCH.
  • the DCI on the PDCCH includes a downlink assignment (i.e., a downlink grant; DL grant) including at least a modulation and coding format and resource allocation information related to the downlink shared channel, or an uplink grant (UL grant) including a modulation and coding format and resource allocation information related to the uplink shared channel.
  • an initial access (IA) procedure in a 5G communication system will be additionally described.
  • the UE may perform cell search, system information acquisition, beam alignment for initial access, DL measurement, and the like based on the SSB.
  • the term SSB is used interchangeably with SS/PBCH (Synchronization Signal/Physical Broadcast channel) block.
  • SSB consists of PSS, SSS and PBCH.
  • the SSB consists of four consecutive OFDM symbols, and the PSS, PBCH, SSS/PBCH, and PBCH are transmitted in the respective OFDM symbols.
  • PSS and SSS consist of 1 OFDM symbol and 127 subcarriers, respectively, and PBCH consists of 3 OFDM symbols and 576 subcarriers.
  • Cell discovery refers to a process in which the UE acquires time/frequency synchronization of a cell, and detects a cell ID (Identifier) (eg, Physical layer Cell ID, PCI) of the cell.
  • PSS is used to detect a cell ID within a cell ID group
  • SSS is used to detect a cell ID group.
  • PBCH is used for SSB (time) index detection and half-frame detection.
  • the SSB is transmitted periodically according to the SSB period (periodicity).
  • the SSB basic period assumed by the UE during initial cell discovery is defined as 20 ms. After cell access, the SSB period may be set to one of ⁇ 5ms, 10ms, 20ms, 40ms, 80ms, 160ms ⁇ by the network (eg, BS).
  • the SI is divided into a master information block (MIB) and a plurality of system information blocks (SIB). SI other than MIB may be referred to as Remaining Minimum System Information (RMSI).
  • the MIB includes information/parameters for monitoring the PDCCH scheduling the PDSCH carrying the System Information Block1 (SIB1) and is transmitted by the BS through the PBCH of the SSB.
  • SIB1 includes information related to availability and scheduling (eg, transmission period, SI-window size) of the remaining SIBs (hereinafter, SIBx, where x is an integer of 2 or more). SIBx is included in the SI message and transmitted through the PDSCH. Each SI message is transmitted within a periodically occurring time window (ie, an SI-window).
  • the random access (RA) process is used for a variety of purposes.
  • the random access procedure may be used for network initial access, handover, and UE-triggered UL data transmission.
  • the UE may acquire UL synchronization and UL transmission resources through a random access procedure.
  • the random access process is divided into a contention-based random access process and a contention free random access process.
  • the detailed procedure for the contention-based random access process is as follows.
  • the UE may transmit the random access preamble through the PRACH as Msg1 of the random access procedure in the UL.
  • Random access preamble sequences having two different lengths are supported.
  • the long sequence length 839 applies for subcarrier spacings of 1.25 and 5 kHz, and the short sequence length 139 applies for subcarrier spacings of 15, 30, 60 and 120 kHz.
  • when the BS receives the random access preamble from the UE, the BS sends a random access response (RAR) message (Msg2) to the UE.
  • the PDCCH scheduling the PDSCH carrying the RAR is CRC-masked and transmitted with a random access (RA) radio network temporary identifier (RNTI) (RA-RNTI).
  • the UE detecting the PDCCH masked by the RA-RNTI may receive the RAR from the PDSCH scheduled by the DCI carried by the PDCCH.
  • the UE checks whether the random access response information for the preamble it has transmitted, that is, Msg1, is in the RAR.
  • Whether or not random access information for Msg1 transmitted by itself exists may be determined by whether a random access preamble ID for the preamble transmitted by the UE exists. If there is no response to Msg1, the UE may retransmit the RACH preamble within a predetermined number of times while performing power ramping. The UE calculates the PRACH transmit power for the retransmission of the preamble based on the most recent path loss and power ramping counter.
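The power-ramping rule mentioned above can be sketched roughly in the style of NR open-loop PRACH power control: the preamble target power is raised by a ramping step for each failed attempt, and the result plus the estimated path loss is capped at the UE's maximum power. The parameter names and values below are illustrative assumptions, not taken from the specification text.

```python
# Illustrative sketch of PRACH power ramping on preamble retransmission.
P_CMAX = 23.0                    # dBm, assumed UE maximum transmit power
PREAMBLE_TARGET_POWER = -100.0   # dBm, assumed preamble received target power
POWER_RAMPING_STEP = 2.0         # dB added per failed attempt

def prach_tx_power(pathloss_db: float, ramping_counter: int) -> float:
    """Transmit power for the current preamble attempt (ramping_counter starts at 1)."""
    target = PREAMBLE_TARGET_POWER + (ramping_counter - 1) * POWER_RAMPING_STEP
    return min(P_CMAX, target + pathloss_db)

# example: each retransmission without a RAR raises the counter and thus the power
for attempt in range(1, 4):
    print(attempt, prach_tx_power(pathloss_db=110.0, ramping_counter=attempt))
```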
  • the UE may transmit UL transmission on the uplink shared channel as Msg3 of the random access procedure based on the random access response information.
  • Msg3 may include the RRC connection request and UE identifier.
  • the network may send Msg4, which may be treated as a contention resolution message on DL.
  • upon receiving Msg4, the UE can enter the RRC connected state.
  • the beam management (BM) process may be divided into (1) a DL BM process using SSB or CSI-RS, and (2) a UL BM process using a sounding reference signal (SRS).
  • each BM process may include Tx beam sweeping to determine a Tx beam and Rx beam sweeping to determine an Rx beam.
  • a configuration for a beam report using the SSB is performed during channel state information (CSI)/beam configuration in RRC_CONNECTED.
  • the UE receives from the BS a CSI-ResourceConfig IE including a CSI-SSB-ResourceSetList for SSB resources used for BM.
  • the RRC parameter csi-SSB-ResourceSetList indicates a list of SSB resources used for beam management and reporting in one resource set.
  • the SSB resource set may be set to {SSBx1, SSBx2, SSBx3, SSBx4, ...}.
  • the SSB index may be defined from 0 to 63.
  • the UE receives signals on SSB resources from the BS based on the CSI-SSB-ResourceSetList.
  • the UE reports the best SSBRI and RSRP corresponding thereto to the BS.
  • when the reportQuantity of the CSI-RS reportConfig IE is set to 'ssb-Index-RSRP', the UE reports the best SSBRI and the corresponding RSRP to the BS.
  • the UE may assume that the CSI-RS and the SSB are quasi co-located (QCL) from the point of view of 'QCL-TypeD'.
  • QCL-TypeD may mean QCL between antenna ports in terms of spatial Rx parameters.
  • the Rx beam determination (or refinement) process of the UE using the CSI-RS and the Tx beam sweeping process of the BS will be described in turn. In the Rx beam determination process of the UE, the repetition parameter is set to 'ON', and in the Tx beam sweeping process of the BS, the repetition parameter is set to 'OFF'.
  • the UE receives the NZP CSI-RS resource set IE including the RRC parameter for 'repetition' from the BS through RRC signaling.
  • the RRC parameter 'repetition' is set to 'ON'.
  • the UE repeatedly receives signals on the resource(s) in the CSI-RS resource set, in which the RRC parameter 'repetition' is set to 'ON', transmitted in different OFDM symbols through the same Tx beam (or DL spatial domain transmission filter) of the BS.
  • the UE determines its own Rx beam.
  • the UE skips CSI reporting. That is, the UE may omit the CSI report when the RRC parameter 'repetition' is set to 'ON'.
  • the UE receives the NZP CSI-RS resource set IE including the RRC parameter for 'repetition' from the BS through RRC signaling.
  • the RRC parameter 'repetition' is set to 'OFF' and is related to the Tx beam sweeping process of the BS.
  • the UE receives signals on resources in the CSI-RS resource set in which the RRC parameter 'repetition' is set to 'OFF' through different Tx beams (DL spatial domain transmission filter) of the BS.
  • the UE selects (or determines) the best beam.
  • the UE reports the ID (eg, CRI) and related quality information (eg, RSRP) for the selected beam to the BS. That is, when the CSI-RS is transmitted for the BM, the UE reports the CRI and the RSRP to the BS.
  • the UE receives the RRC signaling (eg, SRS-Config IE) including the (RRC parameter) usage parameter set to 'beam management' from the BS.
  • SRS-Config IE is used for SRS transmission configuration.
  • the SRS-Config IE includes a list of SRS-Resources and a list of SRS-ResourceSets. Each SRS resource set means a set of SRS-resources.
  • the UE determines Tx beamforming for the SRS resource to be transmitted based on the SRS-SpatialRelation Info included in the SRS-Config IE.
  • the SRS-SpatialRelation Info is set for each SRS resource and indicates whether to apply the same beamforming as that used in SSB, CSI-RS, or SRS for each SRS resource.
  • if SRS-SpatialRelationInfo is configured for the SRS resource, the same beamforming as that used in the SSB, CSI-RS, or SRS is applied and transmitted. However, if SRS-SpatialRelationInfo is not configured for the SRS resource, the UE arbitrarily determines the Tx beamforming and transmits the SRS through the determined Tx beamforming.
  • Radio Link Failure (RLF) may frequently occur due to rotation, movement, or beamforming blockage of the UE. Therefore, beam failure recovery (BFR) is supported in NR to prevent frequent occurrence of RLF. BFR is similar to the radio link failure recovery process, and can be supported when the UE knows new candidate beam(s).
  • the BS configures beam failure detection reference signals for the UE, and the UE declares a beam failure when the number of beam failure indications from the physical layer of the UE reaches a threshold set by RRC signaling within a period set by the RRC signaling of the BS.
  • the UE triggers beam failure recovery by initiating a random access procedure on the PCell; Beam failure recovery is performed by selecting a suitable beam (if the BS provides dedicated random access resources for certain beams, these are prioritized by the UE). Upon completion of the random access procedure, it is considered that beam failure recovery has been completed.
  • URLLC transmission defined in NR may refer to transmission with (1) a relatively low traffic size, (2) a relatively low arrival rate, (3) an extremely low latency requirement (e.g., 0.5 ms or 1 ms), (4) a relatively short transmission duration (e.g., 2 OFDM symbols), and (5) an urgent service/message.
  • transmission for a specific type of traffic (e.g., URLLC) may need to be multiplexed with a previously scheduled transmission (e.g., eMBB). eMBB and URLLC services may be scheduled on non-overlapping time/frequency resources, and URLLC transmission may occur on resources scheduled for ongoing eMBB traffic.
  • the eMBB UE may not know whether the PDSCH transmission of the corresponding UE is partially punctured, and the UE may not be able to decode the PDSCH due to corrupted coded bits.
  • NR provides a preemption indication.
  • the preemption indication may be referred to as an interrupted transmission indication.
  • the UE receives the DownlinkPreemption IE through RRC signaling from the BS.
  • when the UE is provided with the DownlinkPreemption IE, the UE is configured with the INT-RNTI provided by the parameter int-RNTI in the DownlinkPreemption IE for monitoring the PDCCH that carries DCI format 2_1.
  • the UE is additionally configured with a set of serving cells by INT-ConfigurationPerServingCell, which includes a set of serving cell indices provided by servingCellID and a corresponding set of positions for fields in DCI format 2_1 provided by positionInDCI, is configured with the information payload size for DCI format 2_1 by dci-PayloadSize, and is configured with the indicated granularity of time-frequency resources by timeFrequencySect.
  • the UE receives DCI format 2_1 from the BS based on the DownlinkPreemption IE.
  • when the UE detects DCI format 2_1 for a serving cell in the configured set of serving cells, the UE may assume that there is no transmission to the UE in the PRBs and symbols indicated by the DCI format 2_1, among the set of PRBs and the set of symbols of the monitoring period immediately preceding the monitoring period to which the DCI format 2_1 belongs. For example, the UE regards the signal in the time-frequency resource indicated by the preemption as not being its scheduled DL transmission and decodes data based on the signals received in the remaining resource region.
  • mMTC (massive machine type communication) is one of the 5G scenarios for supporting a hyper-connectivity service that communicates simultaneously with a large number of UEs.
  • in mMTC, the UE communicates intermittently with a very low transmission rate and mobility. Therefore, mMTC primarily aims at operating the UE at low cost for as long as possible.
  • 3GPP deals with MTC and NB (NarrowBand)-IoT.
  • the mMTC technology has features such as repeated transmission of PDCCH, PUCCH, physical downlink shared channel (PDSCH), PUSCH, and the like, frequency hopping, retuning, and guard period.
  • a PUSCH (or PUCCH (particularly, long PUCCH) or PRACH) including specific information and a PDSCH (or PDCCH) including a response to specific information are repeatedly transmitted.
  • repeated transmission is performed through frequency hopping; for the repeated transmission, (RF) retuning is performed in a guard period from a first frequency resource to a second frequency resource, and the specific information and the response to the specific information may be transmitted/received through a narrowband (e.g., 6 RB (resource block) or 1 RB).
  • FIG. 3 shows an example of basic operations of a user terminal and a 5G network in a 5G communication system.
  • the UE transmits specific information to the 5G network (S1).
  • the 5G network performs 5G processing on the specific information (S2).
  • the 5G processing may include AI processing.
  • the 5G network transmits a response including the AI processing result to the UE (S3).
  • before step S1 of FIG. 3, in order for the UE to transmit/receive signals, information, and the like to/from the 5G network, the UE performs an initial access procedure and a random access procedure with the 5G network.
  • the UE performs an initial connection procedure with the 5G network based on the SSB to obtain DL synchronization and system information.
  • a beam management (BM) process and a beam failure recovery process may be added to the initial access procedure, and in the process of the UE receiving a signal from the 5G network, a QCL (quasi-co location) relationship can be added.
  • the UE performs a random access procedure with the 5G network for UL synchronization acquisition and/or UL transmission.
  • the 5G network may transmit a UL grant for scheduling transmission of specific information to the UE. Accordingly, the UE transmits specific information to the 5G network based on the UL grant.
  • the 5G network transmits a DL grant for scheduling transmission of a 5G processing result for the specific information to the UE. Accordingly, the 5G network may transmit a response including the AI processing result to the UE based on the DL grant.
  • the UE may receive a DownlinkPreemption IE from the 5G network. Then, the UE receives DCI format 2_1 including a pre-emption indication from the 5G network based on the DownlinkPreemption IE. And, the UE does not perform (or expect or assume) the reception of eMBB data in the resource (PRB and/or OFDM symbol) indicated by the pre-emption indication. Thereafter, the UE may receive a UL grant from the 5G network when it is necessary to transmit specific information.
  • the UE receives a UL grant from the 5G network to transmit specific information to the 5G network.
  • the UL grant includes information on the number of repetitions for the transmission of the specific information, and the specific information may be repeatedly transmitted based on the information on the number of repetitions. That is, the UE transmits specific information to the 5G network based on the UL grant.
  • repeated transmission of the specific information may be performed through frequency hopping; the first transmission of the specific information may be performed in a first frequency resource, and the second transmission of the specific information may be performed in a second frequency resource.
  • the specific information may be transmitted through a narrowband of 6RB (Resource Block) or 1RB (Resource Block).
  • FIG. 4 is a diagram illustrating a block diagram of an electronic device.
  • the electronic device 100 may include at least one processor 110, a memory 120, an output device 130, an input device 140, an input/output interface 150, a sensor module 160, and a communication module 170.
  • the processor 110 may include one or more application processors (APs), one or more communication processors (CPs), or at least one or more artificial intelligence processors (AI processors).
  • the application processor, communication processor, or AI processor may be included in different integrated circuit (IC) packages, respectively, or may be included in one IC package.
  • the application processor may control a plurality of hardware or software components connected to the application processor by driving an operating system or an application program, and may perform various data processing/operations including multimedia data.
  • the application processor may be implemented as a system on chip (SoC).
  • the processor 110 may further include a graphic processing unit (GPU).
  • the communication processor may perform a function of managing a data link and converting a communication protocol in communication between the electronic device 100 and other electronic devices connected through a network.
  • the communication processor may be implemented as an SoC.
  • the communication processor may perform at least a portion of the multimedia control function.
  • the communication processor may control data transmission/reception of the communication module 170 .
  • the communication processor may be implemented to be included as at least a part of the application processor.
  • the application processor or the communication processor may load and process a command or data received from at least one of a non-volatile memory or other components connected thereto to the volatile memory.
  • the application processor or the communication processor may store data received from at least one of the other components or generated by at least one of the other components in the nonvolatile memory.
  • the memory 120 may include an internal memory or an external memory.
  • the built-in memory may include at least one of a volatile memory (e.g., dynamic RAM (DRAM), static RAM (SRAM), synchronous dynamic RAM (SDRAM), etc.) or a non-volatile memory (e.g., one time programmable ROM (OTPROM), programmable ROM (PROM), erasable and programmable ROM (EPROM), electrically erasable and programmable ROM (EEPROM), mask ROM, flash ROM, NAND flash memory, NOR flash memory, etc.).
  • the internal memory may take the form of a solid state drive (SSD).
  • the external memory may further include a flash drive, for example, CF (compact flash), SD (secure digital), Micro-SD (micro secure digital), Mini-SD (mini secure digital), xD (extreme digital), or a memory stick.
  • the output device 130 may include at least one of a display module and a speaker.
  • the output device 130 may display various data including multimedia data, text data, voice data, and the like to the user or output it as sound.
  • the input device 140 may include a touch panel, a digital pen sensor, a key, or an ultrasonic input device.
  • the input device 140 may be an input/output interface 150 .
  • the touch panel may recognize a touch input using at least one of a capacitive type, a pressure sensitive type, an infrared type, and an ultrasonic type.
  • the touch panel may further include a controller (not shown). In the case of capacitive type, not only direct touch but also proximity recognition is possible.
  • the touch panel may further include a tactile layer. In this case, the touch panel may provide a tactile response to the user.
  • the digital pen sensor may be implemented using the same or similar method as receiving a user's touch input or a separate recognition layer.
  • the key may be a keypad or a touch key.
  • the ultrasonic input device is a device that can check data by detecting a micro sound wave in a terminal through a pen that generates an ultrasonic signal, and wireless recognition is possible.
  • the electronic device 100 may receive a user input from an external device (eg, a network, a computer, or a server) connected thereto using the communication module 170 .
  • the input device 140 may further include a camera module and a microphone.
  • the camera module is a device capable of capturing images and moving pictures, and may include one or more image sensors, an image signal processor (ISP), or a flash LED.
  • the microphone may receive a voice signal and convert it into an electrical signal.
  • the input/output interface 150 may transmit a command or data received from a user through an input device or an output device through a bus (not shown) to the processor 110 , the memory 120 , the communication module 170 , and the like.
  • the input/output interface 150 may provide data regarding a user's touch input input by tapping the touch panel to the processor 110 .
  • the input/output interface 150 may output a command or data received from the processor 110 , the memory 120 , the communication module 170 , and the like through the bus through the output device 130 .
  • the input/output interface 150 may output voice data processed through the processor 110 to the user through a speaker.
  • the sensor module 160 includes a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, an RGB (red, green, blue) sensor, a biometric sensor, a temperature/humidity sensor, an illuminance sensor, or UV ( Ultra violet) may include at least one or more of the sensors.
  • the sensor module 160 may measure a physical quantity or detect an operating state of the electronic device 100 to convert the measured or sensed information into an electrical signal.
  • the sensor module 160 may include an olfactory sensor (E-nose sensor), an electromyography sensor (EMG sensor), an electroencephalogram sensor (EEG sensor, not shown), an electrocardiogram sensor (ECG sensor), a photoplethysmography sensor (PPG sensor), a heart rate monitor sensor (HRM), a perspiration sensor, or a fingerprint sensor.
  • the sensor module 160 may further include a control circuit for controlling at least one or more sensors included therein.
  • the communication module 170 may include a wireless communication module or an RF module.
  • the wireless communication module may include, for example, Wi-Fi, BT, GPS or NFC.
  • the wireless communication module may provide a wireless communication function using a radio frequency.
  • the wireless communication module may include a network interface or a modem for connecting the electronic device 100 to a network (e.g., Internet, LAN, WAN, telecommunication network, cellular network, satellite network, POTS, or 5G network).
  • the RF module may be responsible for transmitting/receiving data, for example, transmitting/receiving an RF signal or a called electronic signal.
  • the RF module may include a transceiver, a power amp module (PAM), a frequency filter, or a low noise amplifier (LNA).
  • the RF module may further include a component for transmitting and receiving electromagnetic waves in free space in wireless communication, for example, a conductor or a conducting wire.
  • the electronic device 100 may include at least one of a server, a TV, a refrigerator, an oven, a clothing styler, a robot cleaner, a drone, an air conditioner, an air purifier, a PC, a speaker, a home CCTV, lighting, a washing machine, and a smart plug. Since the components of the electronic device 100 described in FIG. 4 exemplify components generally provided in an electronic device, the electronic device 100 according to the embodiment of the present specification is not limited to the above-described components, and components may be omitted and/or added as necessary.
  • the electronic device 100 may perform an artificial intelligence-based control operation by receiving the AI processing result from the cloud environment shown in FIG. 5, or may perform AI processing in an on-device manner by including an AI module in which components related to AI processing are integrated into one module.
  • FIGS. 5 and 6 illustrate an example in which data or signals may be received in the electronic device 100, but AI processing for processing the input data or signals is performed in a cloud environment.
  • FIG. 6 shows an example of on-device processing in which an overall operation related to AI processing for input data or signal is performed in the electronic device 100 .
  • the device environment may be referred to as a 'client device' or an 'AI device', and the cloud environment may be referred to as a 'server'.
  • FIG. 5 shows a schematic block diagram of an AI server according to an embodiment of the present specification.
  • the server 200 may include a processor 210 , a memory 220 , and a communication module 270 .
  • the AI processor 215 may learn the neural network using a program stored in the memory 220 .
  • the AI processor 215 may learn a neural network for recognizing data related to the operation of the AI device 100 .
  • the neural network may be designed to simulate a human brain structure (eg, a neuron structure of a human neural network) on a computer.
  • the neural network may include an input layer, an output layer, and at least one hidden layer.
  • Each layer may include at least one neuron having a weight, and the neural network may include a neuron and a synapse connecting the neurons.
  • each neuron may output the function value of an activation function applied to the input signals received through synapses, using a weight and/or a bias.
  • the plurality of network nodes may transmit and receive data according to their connection relationships so as to simulate the synaptic activity of neurons sending and receiving signals through synapses.
  • the neural network may include a deep learning model developed from a neural network model.
  • in a deep learning model, a plurality of network nodes may exchange data according to a convolution connection relationship while being located in different layers.
  • Examples of neural network models include various deep learning techniques such as deep neural networks (DNNs), convolutional neural networks (CNNs), recurrent neural networks (RNNs), restricted Boltzmann machines (RBMs), deep belief networks (DBNs), and deep Q-networks, and they can be applied in fields such as vision recognition, voice recognition, natural language processing, and voice/signal processing.
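As a minimal sketch of the neuron model described above, each neuron applies an activation function to the weighted sum of its synaptic inputs plus a bias; the layer sizes, the ReLU activation, and the random weights below are arbitrary illustrative choices.

```python
# Minimal sketch: a neuron outputs an activation function of (weights @ inputs + bias).
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def dense_layer(inputs, weights, bias):
    # weighted sum over synapses, then the activation function
    return relu(weights @ inputs + bias)

x = np.array([0.2, 0.7, 0.1])                 # input layer
W1, b1 = np.random.randn(4, 3), np.zeros(4)   # hidden layer parameters
W2, b2 = np.random.randn(2, 4), np.zeros(2)   # output layer parameters

hidden = dense_layer(x, W1, b1)
output = dense_layer(hidden, W2, b2)
```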
  • the processor 210 performing the above-described functions may be a general-purpose processor (eg, CPU), but may be an AI-only processor (eg, GPU) for AI learning.
  • the memory 220 may store various programs and data necessary for the operation of the AI device 100 and/or the server 200 .
  • the memory 220 is accessed by the AI processor 215 , and reading/writing/modification/deletion/update of data by the AI processor 215 may be performed.
  • the memory 220 may store a neural network model (eg, a deep learning model) generated through a learning algorithm for data classification/recognition according to an embodiment of the present specification.
  • the memory 220 may store not only the learning model 221 , but also input data, learning data, learning history, and the like.
  • the AI processor 215 may include a data learning unit 215a that learns a neural network for data classification/recognition.
  • the data learning unit 215a may learn a criterion regarding which training data to use to determine data classification/recognition and how to classify and recognize data using the training data.
  • the data learning unit 215a may learn the deep learning model by acquiring learning data to be used for learning and applying the acquired learning data to the deep learning model.
  • the data learning unit 215a may be manufactured in the form of at least one hardware chip and mounted on the server 200 .
  • the data learning unit 215a may be manufactured in the form of a dedicated hardware chip for artificial intelligence, and may be manufactured as a part of a general-purpose processor (CPU) or a graphics-only processor (GPU) and mounted on the server 200 .
  • the data learning unit 215a may be implemented as a software module.
  • when implemented as a software module (or a program module including instructions), the software module may be stored in a non-transitory computer-readable medium. In this case, at least one software module may be provided by an operating system (OS) or by an application.
  • the data learning unit 215a may use the acquired learning data to learn so that the neural network model has a criterion for how to classify/recognize predetermined data.
  • the learning method by the model learning unit may be classified into supervised learning, unsupervised learning, and reinforcement learning.
  • supervised learning refers to a method of training an artificial neural network in a state where labels for the training data are given, and a label may mean the correct answer (or result value) that the artificial neural network should infer when the training data is input to the artificial neural network.
  • Unsupervised learning may refer to a method of training an artificial neural network in a state where no labels are given for training data.
  • Reinforcement learning may refer to a method in which an agent defined in a specific environment is trained to select an action or sequence of actions that maximizes the cumulative reward in each state.
  • the model learning unit may train the neural network model by using a learning algorithm including error backpropagation or gradient descent.
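A minimal sketch of the gradient-descent training mentioned above: a parameterized model is repeatedly updated in the direction that reduces a loss computed against labeled (supervised) training data. The linear model, data, and learning rate below are hypothetical.

```python
# Minimal sketch of gradient descent on a mean-squared-error loss.
import numpy as np

X = np.array([0.0, 1.0, 2.0, 3.0])   # training inputs
y = 2.0 * X + 1.0                     # labels (correct answers) for supervised learning

w, b, lr = 0.0, 0.0, 0.05             # parameters and learning rate
for _ in range(500):
    pred = w * X + b
    error = pred - y
    # gradients of the mean squared error loss with respect to w and b
    grad_w = 2.0 * np.mean(error * X)
    grad_b = 2.0 * np.mean(error)
    w -= lr * grad_w                  # gradient descent update
    b -= lr * grad_b

print(round(w, 2), round(b, 2))       # approaches w = 2, b = 1
```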
  • the learned neural network model may be referred to as a learning model 221 .
  • the learning model 221 may be stored in the memory 220 and used to infer a result for new input data other than the training data.
  • the AI processor 215 may further include a data preprocessor 215b and/or a data selector 215c in order to improve the analysis result using the learning model 221 or to save the resources or time required for generating the learning model 221.
  • the data preprocessor 215b may preprocess the acquired data so that the acquired data can be used for learning/inference for situation determination.
  • the data preprocessor 215b may extract feature information as preprocessing for input data obtained through an input device, and the feature information may be extracted in a format such as a feature vector, a feature point, or a feature map.
  • the data selection unit 215c may select data necessary for learning from among the training data or the training data pre-processed by the pre-processing unit.
  • the selected training data may be provided to the model training unit.
  • the data selector 215c may select only data about an object included in the specific region as the learning data by detecting a specific region among images acquired through a camera of the electronic device.
  • the data selection unit 215c may select data necessary for inference from among input data acquired through an input device or input data preprocessed by the preprocessor.
  • the AI processor 215 may further include a model evaluation unit 215d to improve the analysis result of the neural network model.
  • the model evaluation unit 215d may input evaluation data to the neural network model and, when an analysis result output from the evaluation data does not satisfy a predetermined criterion, cause the model learning unit to learn again.
  • the evaluation data may be preset data for evaluating the learning model 221 .
  • the model evaluation unit 215d may evaluate that the predetermined criterion is not satisfied when, among the analysis results of the learned neural network model for the evaluation data, the number or ratio of evaluation data for which the analysis result is not accurate exceeds a preset threshold.
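As an illustration of the evaluation criterion above, the model could be flagged for re-training when the fraction of inaccurate evaluation results exceeds a preset threshold; the 20% threshold and the comparison of discrete labels below are assumptions.

```python
# Illustrative sketch: flag re-training when too many evaluation results are inaccurate.
INACCURACY_THRESHOLD = 0.2  # e.g., allow at most 20% inaccurate evaluation results

def needs_retraining(predictions, labels, threshold=INACCURACY_THRESHOLD) -> bool:
    wrong = sum(1 for p, t in zip(predictions, labels) if p != t)
    return (wrong / len(labels)) > threshold

# example: 3 of 10 evaluation samples are wrong, so 0.3 > 0.2 and the model is re-trained
print(needs_retraining([1, 0, 1, 1, 0, 0, 1, 1, 0, 1],
                       [1, 1, 1, 0, 0, 0, 1, 0, 0, 1]))
```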
  • the communication module 270 may transmit the AI processing result by the AI processor 215 to an external electronic device.
  • in FIG. 5, an example in which AI processing is implemented in a cloud environment because of constraints on computing operation, storage, and power has been described, but the present specification is not limited thereto, and the AI processor 215 may be implemented by being included in the client device.
  • FIG. 6 is an example in which AI processing is implemented in a client device, and is the same as shown in FIG. 5 except that the AI processor 215 is included in the client device.
  • FIG. 6 is a schematic block diagram of an AI device according to another embodiment of the present specification.
  • each configuration shown in FIG. 6 may refer to FIG. 5 .
  • since the AI processor is included in the client device 100, it may not be necessary to communicate with the server (200 in FIG. 5) in performing a process such as data classification/recognition, and accordingly, an immediate or real-time data classification/recognition operation is possible.
  • in addition, since there is no need to transmit the user's personal information to the server (200 in FIG. 5), a targeted data classification/recognition operation is possible without leakage of personal information to the outside.
  • note that each component shown in FIGS. 5 and 6 represents functional elements that are functionally separated, and at least one component may be implemented in a form (e.g., an AI module) integrated with another component in an actual physical environment. Of course, components other than the plurality of components shown in FIGS. 5 and 6 may be included or omitted.
  • FIG. 7 is a conceptual diagram illustrating an embodiment of an AI device.
  • in the AI system 1, at least one of an AI server 106, a robot 101, an autonomous vehicle 102, an XR device 103, a smart phone 104, or a home appliance 105 is connected to a cloud network (NW).
  • the robot 101 to which the AI technology is applied, the autonomous vehicle 102 , the XR device 103 , the smart phone 104 , or the home appliance 105 may be referred to as AI devices 101 to 105 .
  • the cloud network may refer to a network that forms part of the cloud computing infrastructure or exists within the cloud computing infrastructure.
  • the cloud network NW may be configured using a 3G network, a 4G or Long Term Evolution (LTE) network, or a 5G network.
  • each of the devices 101 to 106 constituting the AI system 1 may be connected to each other through the cloud network NW.
  • each of the devices 101 to 106 may communicate with each other through the base station, but may also directly communicate with each other without passing through the base station.
  • the AI server 106 may include a server performing AI processing and a server performing an operation on big data.
  • the AI server 106 is connected, through the cloud network (NW), to at least one of the AI devices constituting the AI system, such as the robot 101, the autonomous vehicle 102, the XR device 103, the smartphone 104, or the home appliance 105, and may assist with at least a part of the AI processing of the connected AI devices 101 to 105.
  • the AI server 106 may train the artificial neural network according to a machine learning algorithm on behalf of the AI devices 101 to 105 , and directly store the learning model or transmit it to the AI devices 101 to 105 .
  • the AI server 106 may receive input data from the AI devices 101 to 105, infer a result value for the received input data using the learning model, and generate a response or control command based on the inferred result value and transmit it to the AI devices 101 to 105.
  • alternatively, the AI devices 101 to 105 may directly infer a result value for input data using a learning model, and generate a response or control command based on the inferred result value.
  • FIGS. 8 and 9 are diagrams illustrating the appearance of a robot cleaner used in an embodiment of the present specification.
  • FIG. 10 is an exemplary block diagram of the robot cleaner used in an embodiment of the present specification.
  • FIG. 8 is a perspective view of a robot cleaner 300 according to an embodiment of the present invention.
  • the robot cleaner 300 may include a cleaner body 50 and a camera 321 or a sensing unit 340 .
  • the camera 321 or the sensing unit 340 may irradiate light to the front and receive the reflected light.
  • the camera 321 or the sensing unit 340 may acquire depth information using a time difference at which the received light returns.
  • the cleaner body 50 may include other components other than the camera 321 and the sensing unit 340 among the components described with reference to FIG. 10 .
  • FIG. 9 is a bottom view of the robot cleaner 300 according to an embodiment of the present invention.
  • the robot cleaner 300 may further include a cleaner body 50 , a left wheel 61a , a right wheel 61b , and a suction unit 70 in addition to the configuration of FIG. 10 .
  • the left wheel 61a and the right wheel 61b may drive the cleaner body 50 .
  • the left wheel driving unit 361 may drive the left wheel 61a.
  • the right wheel driving unit 362 may drive the right wheel 61b.
  • the robot cleaner 300 may suction foreign substances such as dust or garbage through the suction unit 70 .
  • the suction unit 70 is provided in the cleaner body 50 to suck the dust on the floor surface.
  • the suction unit 70 may further include a filter (not shown) for collecting foreign substances from the sucked airflow, and a foreign substance receiver (not shown) for accumulating the foreign substances collected by the filter.
  • the robot cleaner 300 may further include a mopping unit (not shown) in addition to the configuration of FIG. 10 .
  • the mopping unit (not shown) may include a mop (not shown) and a motor (not shown) that rotates the mop in contact with the floor or moves according to a set pattern.
  • the robot cleaner 300 may wipe the floor surface through a mopping unit (not shown).
  • the robot cleaner 300 may further include a driving driving unit 360 and a cleaning unit 390 .
  • since the robot cleaner of FIG. 10 is an example of the AI device 100 described above with reference to FIG. 6, it may further include at least one additional component.
  • the input unit 320 may include a camera 321 for inputting an image signal, a microphone 322 for receiving an audio signal, and a user input unit 323 for receiving information from a user.
  • the voice data or image data collected by the input unit 320 may be analyzed and processed as a user's control command.
  • the input unit 320 is for inputting image information (or signal), audio information (or signal), data, or information input from a user.
  • the robot cleaner 300 may be provided with one or more cameras 321.
  • the camera 321 processes an image frame such as a still image or a moving image obtained by an image sensor in a video call mode or a shooting mode.
  • the processed image frame may be displayed on the display unit 351 or stored in the memory 370 .
  • the microphone 322 processes an external sound signal into electrical voice data.
  • the processed voice data may be utilized in various ways according to a function (or a running application program) being performed by the robot cleaner 300 . Meanwhile, various noise removal algorithms for removing noise generated in the process of receiving an external sound signal may be applied to the microphone 322 .
  • the user input unit 323 is for receiving information from a user. When information is input through the user input unit 323, the processor 380 may control the operation of the robot cleaner 300 to correspond to the input information.
  • the user input unit 323 may include a mechanical input means (or mechanical keys, for example, buttons located on the front/rear or side of the terminal 300, a dome switch, a jog wheel, a jog switch, etc.) and a touch-type input means.
  • the touch-type input means may consist of a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a part other than the touch screen.
  • the sensing unit 340 may be referred to as a sensor unit.
  • the sensing unit 340 may include at least one of a depth sensor (not shown) and an RGB sensor (not shown) to acquire image data about the periphery of the artificial intelligence device 300 .
  • the depth sensor may detect light that is emitted from a light emitting unit (not shown) and reflected back by an object.
  • the depth sensor may measure the distance to the object based on the time difference at which the returned light is sensed, the amount of the returned light, and the like.
  • the depth sensor may acquire 2D or 3D image information about the surroundings of the artificial intelligence device 300 based on the measured distances to objects.
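  • For reference, a time-of-flight depth sensor of the kind described here typically derives distance from the round-trip time of the emitted light; the sketch below is a minimal illustration of that relationship and is not taken from the specification.

```python
# Minimal sketch: distance from the round-trip time of emitted light (time of flight).
SPEED_OF_LIGHT_M_S = 299_792_458.0

def distance_from_round_trip(round_trip_time_s: float) -> float:
    """Return the one-way distance in meters for a measured round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# Example: a 10 ns round trip corresponds to roughly 1.5 m.
print(distance_from_round_trip(10e-9))
```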
  • the RGB sensor may acquire color image information about objects or users around the artificial intelligence device 300 .
  • the color image information may be a photographed image of an object.
  • the RGB sensor may be referred to as an RGB camera. In this case, the camera 321 may mean an RGB sensor.
  • the output unit 350 may include at least one of a display unit 351, a sound output unit 352, a haptic module 353, and a light output unit 354.
  • the display unit 351 displays (outputs) information processed by the robot cleaner 300 .
  • the display unit 351 may display information on an execution screen of an application program driven by the robot cleaner 300 , or user interface (UI) and graphic user interface (GUI) information according to the information on the execution screen.
  • the display unit 351 may implement a touch screen by forming a layer structure with the touch sensor or being formed integrally with the touch sensor. Such a touch screen may function as a user input unit 323 providing an input interface between the robot cleaner 300 and a user, and may provide an output interface between the terminal 300 and the user.
  • the sound output unit 352 may output audio data received from the communication unit 310 or stored in the memory 370 in call signal reception, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
  • the sound output unit 352 may include at least one of a receiver, a speaker, and a buzzer.
  • the haptic module 353 generates various tactile effects that the user can feel.
  • a representative example of the tactile effect generated by the haptic module 353 may be vibration.
  • the light output unit 354 outputs a signal for notifying the occurrence of an event by using the light of the light source of the robot cleaner 300 .
  • Examples of the event generated by the robot cleaner 300 may be message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.
  • the driving driving unit 360 may move the artificial intelligence device 300 in a specific direction or by a specific distance.
  • the driving driving unit 360 may include a left wheel driving unit 361 for driving the left wheel of the artificial intelligence device 300 and a right wheel driving unit 362 for driving the right wheel.
  • the left wheel driving unit 361 may include a motor for driving the left wheel
  • the right wheel driving unit 362 may include a motor for driving the right wheel.
  • in FIG. 10, the driving driving unit 360 has been described as including the left wheel driving unit 361 and the right wheel driving unit 362 as an example, but the present invention is not limited thereto; in an embodiment, the driving driving unit 360 may be configured with only one wheel.
  • the cleaning unit 390 may include at least one of the suction unit 391 and the mopping unit 392 to clean the floor surface near the artificial intelligence device 300 .
  • the suction unit 391 may also be called a vacuum cleaner.
  • the suction unit 391 may suck in air to suck in foreign substances such as dust or garbage around the artificial intelligence device 300 .
  • the suction unit 391 may include a brush or the like as a means for collecting foreign substances.
  • the mop unit 392 may wipe the floor while the mop is at least partially in contact with the floor surface near the artificial intelligence device 300.
  • the mop part 392 may include a mop and a mop driving unit that moves the mop.
  • the distance from the ground to the mop of the mop part 392 can be adjusted through the mop driving unit. That is, the mop driving unit may operate so that the mop is in contact with the ground when mopping is required.
  • the robot cleaner 300 according to various embodiments of the present specification is not limited to having all of the above-described components; some components may be omitted, or additional components may be included, depending on the function and/or use of the robot cleaner 300.
  • FIG. 11 is a view illustrating a running process of the robot cleaner traveling on cleaning surfaces of various materials.
  • FIG. 12 is a diagram for illustrating a running process of the robot cleaner for acquiring an image of the cleaning surface.
  • the robot cleaner 300 may detect a cleaning surface made of a material different from the currently running cleaning surface.
  • the cleaning surfaces of different materials may be classified according to frictional force, the degree of adsorption to the robot cleaner 300, and the uniformity of the surfaces, but are not limited thereto.
  • the robot cleaner 300 may enter the second cleaning surface SURF2 such as a rug while driving along the first cleaning surface SURF1 that is a flat tile.
  • the robot cleaner 300 may detect the second cleaning surface SURF2 and perform a control operation related to the second cleaning surface SURF2 .
  • the control operation related to the first cleaning surface SURF1 and the control operation related to the second cleaning surface SURF2 may be different from each other.
  • the robot cleaner 300 may obtain an image for detecting the material of the cleaning surface SURF3 .
  • the robot cleaner 300 may obtain an image of the cleaning surface SURF3 located in the traveling direction through the camera 321 .
  • the cleaning surface SURF3 may have a different pattern, shape, color, or a combination thereof depending on the type of material. For example, in the case of a wood material, it may include a wood grain pattern and a yellow color related to the wood material.
  • the robot cleaner 300 may generate information about the type of material from the image of the cleaning surface SURF3 .
  • the robot cleaner 300 may extract a shape (eg, a contour) and/or a color from the image of the cleaning surface SURF3 .
  • the robot cleaner 300 may set a feature vector related to the extracted shape or color as input data, set the type of material as the label to be inferred, and generate information regarding the material of the cleaning surface SURF3 from its image through a supervised neural network model. Meanwhile, the generation of information on the material of the cleaning surface SURF3 may be performed in real time while driving, or may be performed in a charged state after cleaning of the interior is completed.
  • the robot cleaner 300 used in an embodiment of the present specification may control at least one of a driving path and a suction force based on information about a material generated from an image of the cleaning surface SURF3. For example, when it is determined that the material of the cleaning surface SURF3 is a soft floor (SF), the robot cleaner 300 may run slowly with a stronger suction force in response to the determination result. For another example, when the material of the cleaning surface SURF3 is determined to be a hard floor (HF), the robot cleaner 300 may run faster with a relatively weak suction force in response to the determination result.
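  • The branching just described can be summarized as a simple mapping from the inferred floor class to suction force and travel speed. The sketch below assumes hypothetical setter methods (`set_suction`, `set_speed`) and numeric levels; it only illustrates the soft-floor/hard-floor decision described above.

```python
# Illustrative control branching for soft floor (SF) vs. hard floor (HF).
# Method names and numeric levels are assumptions, not values from the specification.
def apply_floor_profile(cleaner, floor_class: str) -> None:
    if floor_class == "SF":        # soft floor: stronger suction, slower travel
        cleaner.set_suction(level=0.9)
        cleaner.set_speed(m_per_s=0.15)
    elif floor_class == "HF":      # hard floor: weaker suction, faster travel
        cleaner.set_suction(level=0.5)
        cleaner.set_speed(m_per_s=0.30)
    else:                          # unknown material: keep a conservative default
        cleaner.set_suction(level=0.7)
        cleaner.set_speed(m_per_s=0.20)
```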
  • the control method of the robot cleaner 300 is not limited to the above-described example.
  • since controlling the cleaning operation based only on information about the material inferred from the image may cause a malfunction of the robot cleaner 300, the robot cleaner 300 according to an embodiment of the present specification may control its operation using additional basic data. For example, among recent tiles, in addition to tiles actually made of wood, tiles made of other materials that imitate the pattern, shape, and color of wood are also used. In this case, when control is performed based only on the image of the cleaning surface, such imitation tiles cannot be distinguished, and thus the intended effect may not be obtained.
  • a method for dividing a cleaning area using various basic data and controlling the robot cleaner 300 with different operations according to the divided cleaning area will be described.
  • FIGS. 13 and 14 are flowcharts of a cleaning area management method according to an embodiment of the present specification.
  • FIGS. 15 and 16 are diagrams for explaining the cleaning area management method by way of example.
  • a 'contact surface' is a cleaning surface in contact with the robot cleaner 300, and 'contact surface' and 'cleaning surface' may be used interchangeably.
  • the robot cleaner 300 may obtain data related to the contact surface in a plurality of areas of the room through the sensing unit ( S110 ).
  • the data related to the contact surface may include, but is not limited to, at least one of a contour, a color, a current change of a driving motor, and a noise pattern of the contact surface.
  • the sensing unit may include a camera 321 including an image sensor, a microphone 322 , or a current sensor.
  • the camera 321 may be arranged in parallel with the driving direction of the robot cleaner 300, so that the contact surface ahead on the driving path may be photographed in advance while the robot cleaner 300 is driving (S110a).
  • the microphone 322 may detect noise generated between the suction unit 391 and the contact surface while the robot cleaner 300 is operating (S110b).
  • the current sensor may detect a change in current of the driving motor that changes in relation to the material of the contact surface (S110c).
  • the robot cleaner 300 may receive data related to the contact surfaces of a plurality of areas in the room through a communication-connected network. The data (or information) obtained in this way may be recorded in the memory of the robot cleaner 300 .
  • the robot cleaner 300 may infer the material of the contact surface to generate information on the material of the contact surface (S120).
  • the information about the material may include a type of the material of the contact surface, a degree of adhesion (adsorption) of the material, or a degree of resistance.
  • the type of material may include wood, marble, rug, and the like.
  • the degree of adhesion of the material may have a value between 0 and 100; the more closely the contact surface adheres to the suction unit 391, the closer the value is to 100, and the less closely it adheres, the closer the value is to 0.
  • the degree of adhesion may be determined from a noise pattern generated in the cleaning process and received through the microphone 322, and may be calculated as a higher value as that noise pattern indicates stronger adhesion between the suction unit 391 and the contact surface.
  • the resistance may be determined from the frictional force between the driving driving unit 360 and the contact surface in the driving process of the robot cleaner 300 .
  • the resistance may have a value of 0 to 100, and may have a value close to 100 as the frictional force between the driving driving unit 360 and the contact surface is strong, and may have a value close to 0 as the frictional force is weak.
  • the numerical range of the information on the material is only an embodiment according to some embodiments of the present specification, and the scope of rights is not limited to the numerical range described above.
  • the type, adhesion, or resistance may be used as base information for grouping the cleaning areas later.
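  • A compact way to hold the material information described above (type, degree of adhesion, degree of resistance, the latter two on a 0–100 scale) is sketched below; the class and field names are illustrative assumptions, not identifiers from the specification.

```python
# Illustrative container for per-area material information.
from dataclasses import dataclass

def clamp_0_100(value: float) -> float:
    return max(0.0, min(100.0, value))

@dataclass
class MaterialInfo:
    material_type: str      # e.g. "wood", "marble", "rug"
    adhesion: float         # 0 (little adhesion to the suction unit) .. 100 (strong adhesion)
    resistance: float       # 0 (weak friction against the drive) .. 100 (strong friction)

    def __post_init__(self):
        self.adhesion = clamp_0_100(self.adhesion)
        self.resistance = clamp_0_100(self.resistance)

kitchen_rug = MaterialInfo(material_type="rug", adhesion=82, resistance=74)
```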
  • the cleaning area classification method may be performed using a neural network model having a plurality of nodes.
  • generation of information on the type of material of the contact surface may be performed using the first neural network model.
  • the first neural network model may be a neural network model trained by a supervised learning technique in which the pattern, color, or feature extracted from the pattern or color of the cleaning surface is set as an input and the type of material is set as a label.
  • generation of information on the degree of adhesion may be performed using the second neural network model.
  • the second neural network model may be a neural network model trained by a supervised learning technique in which a noise pattern or a feature extracted from the noise pattern is set as an input, and adhesion is set as a label.
  • generation of information on the degree of resistance may be performed using a third neural network model.
  • the third neural network model may be a neural network model trained by a supervised learning technique in which a change in current of the driving motor, or a feature extracted from the change in current, is set as an input and the degree of resistance is set as a label.
  • a feature may be defined as an embedding extracted from each piece of corresponding information or data. An embedding may be represented in a vector space, and clustering may be performed according to a relationship (e.g., similarity) between a plurality of vectors.
  • the robot cleaner 300 may classify the plurality of regions into at least one group based on the information on the material generated for each region ( S130 ).
  • the robot cleaner 300 may analyze the material information generated for each area, calculate a similarity of each material information, and classify the plurality of areas into at least one group based on the similarity level.
  • the robot cleaner 300 may group regions in which the material type has a similarity greater than or equal to a preset first threshold into one group.
  • the robot cleaner 300 may group regions in which the similarity of the noise pattern based on the degree of adhesion is equal to or greater than a preset second threshold into one group.
  • the robot cleaner 300 may group regions in which the similarity of the current change of the driving motor is equal to or greater than a preset third threshold value into one group. Grouping (or clustering) based on the above-described similarity may be performed using an AI-based clustering technique.
  • the cleaning area management method according to an embodiment of the present specification may be performed by the K-means algorithm, but is not limited thereto.
  • the cleaning area management method according to an embodiment of the present specification may be performed using a Euclidean Distance or a Mahalanobis Distance, but is not limited thereto.
  • the first to third thresholds may be set differently for each information about each material.
  • even so, if the similarity of the remaining material information is less than the preset second and third thresholds, the robot cleaner 300 may not classify the areas corresponding to each piece of information into one group.
  • the robot cleaner 300 may group cleaning areas by combining one or two or more pieces of information about the material.
  • the cleaning zone division method according to various embodiments of the present specification divides the cleaning zone using two or more of the material type, adhesion, or resistance, thereby preventing misclassification due to the misrecognition described above with reference to FIG. 12.
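  • As noted above, grouping may rely on an AI-based clustering technique such as K-means over two or more material attributes. A minimal sketch using scikit-learn is given below; the use of scikit-learn, the feature layout (floor-type index, adhesion, resistance per area), and the choice of three clusters are assumptions for illustration only.

```python
# Illustrative grouping of areas by (floor-type index, adhesion, resistance) with K-means.
# Euclidean distance is used by default; the sample values are not from the specification.
import numpy as np
from sklearn.cluster import KMeans

area_features = np.array([
    # [floor_type, adhesion, resistance]
    [1, 10.0, 1.1],
    [7, 80.0, 3.4],
    [8, 85.0, 4.2],
    [2, 15.0, 1.5],
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(area_features)
print(kmeans.labels_)           # group index assigned to each area
print(kmeans.cluster_centers_)  # per-group feature averages (cf. the group features below)
```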
  • At least one group may include a characteristic related to the group.
  • the group-related feature may be defined as a group feature (Features of Group).
  • the group feature may be composed of an average of values indicating the type of material used to create each group, adhesion, or resistance.
  • the robot cleaner 300 may collect the floor type and resistance values for a plurality of areas in the room.
  • each of the first to seventh floors may have different floor types and/or resistances.
  • for example, the first floor may have a floor type of 1 and a resistance of 1.1; the second floor, a floor type of 7 and a resistance of 3.4; the third floor, a floor type of 8 and a resistance of 4.2; the fourth floor, a floor type of 2 and a resistance of 1.5; the fifth floor, a floor type of 3; and the seventh floor, a floor type of 3 and a resistance of 1.9.
  • a value indicating each floor type corresponds to a cleaning surface of each material.
  • the robot cleaner 300 may perform clustering with respect to the embedding on the cleaning surface.
  • At least one group generated as a result of clustering may have a feature value of the same category (eg, floor type, resistance, adhesion, etc.) as an embedding that is a target of clustering.
  • the feature value may be an average value of feature values of embeddings targeted for clustering. For example, based on (floor type, resistance), the first group may have (2.3, 3.9), the second group may have (7.3, 4.6), and the third group may have (6.7, 2.3).
  • the robot cleaner 300 may convert the value related to the floor type of a group into an integer value.
  • for example, the robot cleaner 300 may round or truncate the digits after the decimal point of the value related to the floor type of a group.
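  • The group feature described above (a per-category average over a group's member areas, with the floor-type value converted back to an integer) might be computed as in the following sketch; the dictionary layout and sample values are assumptions.

```python
# Illustrative computation of group features as averages of member-area values.
# The floor type, being a categorical code, is converted back to an integer.
def group_feature(members):
    """members: list of dicts with 'floor_type' and 'resistance' values."""
    n = len(members)
    avg_type = round(sum(m["floor_type"] for m in members) / n)   # e.g. 6.7 -> 7
    avg_resistance = sum(m["resistance"] for m in members) / n
    return {"floor_type": avg_type, "resistance": avg_resistance}

print(group_feature([{"floor_type": 7, "resistance": 3.4},
                     {"floor_type": 8, "resistance": 4.2}]))
# -> floor_type 8 (7.5 rounded), resistance ≈ 3.8
```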
  • the robot cleaner 300 may determine the priority of cleaning based on information about at least one group of materials (S140). For example, the robot cleaner 300 may control such that the higher the resistance, the higher the priority is set. For another example, the robot cleaner 300 may control to set a lower priority as the degree of adhesion is higher. As another example, it is possible to control so that the priority is set according to a preset order according to the type of material.
  • the robot cleaner 300 may assign priority according to any one of the resistance, the adhesion, and/or the type of material, and for a group whose priority cannot be determined in this way, the priority may be determined in more detail according to another one of the resistance, the adhesion, and/or the type of material.
  • the robot cleaner 300 may receive a characteristic value indicating resistance, adhesion, and/or a material type as an input, and generate an output for determining priority. The robot cleaner 300 may determine a priority for each group based on the output. As such, the robot cleaner 300 may prevent dust from adhering again to the cleaning surface such as the rug by giving a higher priority as the resistance of the group is lower.
  • the robot cleaner 300 may control at least one parameter of a neural network model for determining a priority in order to determine an appropriate priority.
  • the robot cleaner 300 may give a higher weight to the resistance among the information about the material of the group with respect to the neural network model.
  • the robot cleaner 300 may determine the priority by giving a higher weight to the type of the contact surface among the information about the material of the group with respect to the neural network model.
  • the robot cleaner 300 may determine the priority by giving a higher weight to the adhesion degree among the information about the material of the group with respect to the neural network model.
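  • One way to realize the weighted prioritization described above is a simple scoring function in which lower resistance (and lower adhesion) yields a higher priority; the weights and sample values below are hypothetical, and a learned neural network model as described above could replace this hand-written score.

```python
# Illustrative priority scoring: groups with lower resistance/adhesion are cleaned first.
# Weights are assumed values for the sketch only.
def priority_order(groups, w_resistance=0.6, w_adhesion=0.4):
    """groups: dict of group_name -> {'resistance': 0-100, 'adhesion': 0-100}.
    Returns group names sorted from first-to-clean to last-to-clean."""
    def score(g):
        return w_resistance * g["resistance"] + w_adhesion * g["adhesion"]
    return sorted(groups, key=lambda name: score(groups[name]))

order = priority_order({
    "tile":     {"resistance": 10, "adhesion": 5},
    "carpet":   {"resistance": 80, "adhesion": 85},
    "playroom": {"resistance": 40, "adhesion": 35},
})
print(order)  # ['tile', 'playroom', 'carpet']
```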
  • the robot cleaner 300 may provide a driving route for performing cleaning according to the determined priority (S150).
  • the robot cleaner 300 preferentially drives an area corresponding to a group having a high priority, and then drives an area corresponding to a group having a low priority.
  • the robot cleaner 300 according to various embodiments of the present specification may change the suction force or the driving mode (soft floor mode, hard floor mode, etc.) according to a group and position change during cleaning.
  • the information on the suction force or the driving mode may be tagged in map data of a driving route determined according to priority, but is not limited thereto.
  • the suction force may increase in proportion to the magnitude of the resistance.
  • the suction force may increase in proportion to the degree of adhesion.
  • the robot cleaner may change the control operation differently according to the suction force.
  • the robot cleaner may reduce the suction force until the adhesion is reduced to less than or equal to the second threshold when the adhesion degree of the material information of at least one group reaches a preset first threshold or more.
  • the first and second threshold values may be different from each other.
  • the robot cleaner can detach, by controlling the suction force, an object that has a low weight, such as a towel, but adheres to the suction unit during cleaning and may cause an abnormal operation. Since a movable object such as a towel differs in weight from a rug, the two cases described above can be distinguished based on the degree of adhesion. Meanwhile, since the towel is merely an example of an object that is movable but has high adhesion, and the rug is merely an example of an object that has high adhesion but a weight that prevents it from being moved, the scope of the various embodiments of the present specification is not limited to the above-described examples.
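  • The two-threshold behaviour just described (reduce the suction force once the adhesion reaches a first threshold, and keep reducing it until the adhesion falls to or below a second threshold) can be sketched as a small control loop; the threshold values, the step size, and the sensor/actuator callables are assumptions for illustration.

```python
# Illustrative two-threshold suction control: back off when adhesion is too high
# (e.g. a towel stuck to the suction unit) until it drops below the release level.
def relieve_adhesion(read_adhesion, set_suction, current_suction,
                     first_threshold=90.0, second_threshold=60.0,
                     step=0.1, min_suction=0.1):
    """read_adhesion() -> 0-100; set_suction(level) applies a 0-1 suction level."""
    if read_adhesion() < first_threshold:
        return current_suction                       # criterion not reached: nothing to do
    while read_adhesion() > second_threshold and current_suction > min_suction:
        current_suction = max(min_suction, current_suction - step)
        set_suction(current_suction)                 # lower suction so the object can detach
    return current_suction
```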
  • FIG. 14 is a flowchart of a cleaning area management method according to an embodiment of the present specification.
  • at least one step illustrated in FIG. 14 may be omitted. Meanwhile, content that overlaps with FIG. 13 among at least one step described in FIG. 14 will be omitted.
  • S210a, S210b, and S210c correspond to S110a, S110b, and S110c, respectively.
  • S230, S240, and S250 correspond to S130, S140, and S150, respectively.
  • the robot cleaner 300 may generate information on the material of the cleaning surface by using the values of at least one parameter generated from S210a, S210b, and S210c (S220).
  • TYPE, ABSORPTION DEGREE, and RESISTANCE DEGREE may be inferred through the first to third neural network models.
  • the material of the contact surface may be predicted using a fourth neural network model that takes, as input, an embedding or feature values indicating TYPE, ABSORPTION DEGREE, and RESISTANCE DEGREE.
  • the robot cleaner 300 may adjust the weight between at least one node constituting the fourth neural network model so that the output of the fourth neural network model is suitable for inferring the target material of the contact surface.
  • FIG. 16 is a diagram for explaining a specific implementation example of S140 of FIG. 13 .
  • the robot cleaner 300 may divide a plurality of indoor areas into first to third groups. In the embodiment of FIG. 16, the plurality of indoor regions are divided based on the type of material and the resistance.
  • the cleaning area division method may give priority to the plurality of groups. This prioritization is intended to improve cleaning efficiency. In the case of a soft floor including rugs, if it is cleaned first, dust generated while cleaning other areas may easily adhere to it afterwards. On the other hand, in the case of a hard floor or the like, dust generated in other areas does not easily adhere. Accordingly, in some embodiments of the present specification, the robot cleaner 300 may be controlled such that the hard floor is cleaned preferentially and the soft floor is cleaned later.
  • At least one area of the room includes a kitchen rug, a living room carpet, a children's playroom, and a general tile surface.
  • the kitchen rug, the living room carpet, the children's playroom, and the general tile surface may have different floor types and resistances.
  • the robot cleaner 300 may classify the children's playroom into a first group, kitchen rugs and living room carpets into a second group, and general tile surfaces into a third group.
  • the robot cleaner 300 may extract the values related to the floor type and resistance from the plurality of unit areas constituting each of the first to third groups, calculate an average of the extracted floor type and resistance values, and assign it as a feature value for each of the first to third groups.
  • the robot cleaner 300 may determine a priority based on the feature values for each of the first to third groups. In general, a material that should be cleaned with a later priority, such as a rug, has strong friction and/or resistance. Accordingly, by analyzing the resistance, the robot cleaner 300 may set the third group having the lowest resistance as the first priority, the first group as the second priority, and the second group as the third priority.
  • the robot cleaner 300 may perform the cleaning operation according to the set first to third priorities. Meanwhile, the robot cleaner 300 needs to be controlled differently not only according to the determined cleaning order but also according to the type, resistance, and/or adhesion of the cleaning surface material determined for each group.
  • the robot cleaner 300 may control the suction power or the running speed of the robot cleaner 300 based on various characteristic values related to the pre-classified groups.
  • the feature value related to the pre-classified group may include a feature value related to the floor type, a feature value related to resistance, or a feature value related to adhesion.
  • the robot cleaner 300 may control to increase the suction force by the motor and/or the suction unit 391 in proportion to the resistance or adhesion.
  • the robot cleaner 300 may control the running speed to decrease in proportion to the resistance or adhesion.
  • the robot cleaner 300 may control the suction force and/or the running speed based on the type of the material of the cleaning surface.
  • the various suction force or driving speed control methods of the present specification are not limited to the above-described examples.
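  • A proportional mapping of the kind just described, raising suction and lowering speed as the group's resistance or adhesion grows, could look like the sketch below; the output ranges and the use of the larger of the two attributes are assumptions for illustration only.

```python
# Illustrative proportional control: suction grows and speed shrinks with the
# group's resistance/adhesion (both on a 0-100 scale). Ranges are assumed values.
def suction_and_speed(resistance: float, adhesion: float,
                      suction_range=(0.4, 1.0), speed_range=(0.10, 0.35)):
    load = max(resistance, adhesion) / 100.0          # 0.0 (easy) .. 1.0 (demanding)
    suction = suction_range[0] + load * (suction_range[1] - suction_range[0])
    speed = speed_range[1] - load * (speed_range[1] - speed_range[0])
    return suction, speed

print(suction_and_speed(resistance=10, adhesion=5))   # hard tile: weak suction, fast
print(suction_and_speed(resistance=80, adhesion=85))  # carpet: strong suction, slow
```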
  • FIG. 17 is a flowchart of a cleaning area management method according to the generation of a new cleaning area. Meanwhile, in some embodiments of FIG. 17 , contents overlapping with FIGS. 13 and 14 will be omitted.
  • the robot cleaner 300 may detect the occurrence of a new cleaning area ( S310 ).
  • the robot cleaner 300 may obtain source data related to grouping of the cleaning area while cleaning is performed based on preset control information (suction force, travel speed, travel route, etc.).
  • the robot cleaner 300 may perform material analysis of the cleaning surface according to the new source data.
  • a rug may be disposed in the entrance hall according to a change in the interior structure.
  • while driving over the entrance rug, the robot cleaner 300 may detect that the image information, resistance, noise pattern, etc. obtained while driving according to the control information previously associated with the third group differ from the past source data.
  • the robot cleaner 300 may perform regrouping.
  • the robot cleaner 300 may perform cleaning according to a preset priority (S310: No, S320).
  • the robot cleaner 300 may obtain the source data of the new cleaning area in response to the occurrence (S310: Yes, S331).
  • the robot cleaner 300 may generate information about the material of the contact surface (S332).
  • the robot cleaner 300 may group at least one cleaning area into at least one group based on the information on the material (S333).
  • the robot cleaner 300 may give priority to at least one group (S334).
  • the robot cleaner may control the driving motor and the suction unit 391 to change the suction force or the driving mode according to group and position changes while cleaning.
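  • Putting steps S310 to S334 together, the overall flow of FIG. 17 might be organized as below; every function referenced here (`detect_new_area`, `acquire_source_data`, `infer_material`, `regroup`, `prioritize`, `clean_by_priority`) is a hypothetical placeholder used only to show the control flow, not an actual API of the robot cleaner.

```python
# Illustrative control flow for FIG. 17 (placeholder methods, not the actual API).
def cleaning_cycle(robot):
    if not robot.detect_new_area():                  # S310: No
        robot.clean_by_priority(robot.priorities)    # S320: clean by preset priority
        return
    data = robot.acquire_source_data()               # S331: gather image/noise/current data
    material = robot.infer_material(data)            # S332: material info of the contact surface
    groups = robot.regroup(material)                 # S333: regroup the cleaning areas
    robot.priorities = robot.prioritize(groups)      # S334: re-assign group priorities
    robot.clean_by_priority(robot.priorities)        # resume cleaning with updated settings
```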
  • Some embodiments of the present specification may be performed in connection with a 5G network.
  • the processor 380 of the robot cleaner 300 may control the communication module to transmit source data, including image information of the contact surface, audio information generated in the cleaning process, or current change information of the driving motor, to the AI system 1 included in the 5G network. Also, the robot cleaner 300 may control the communication module to receive AI-processed information from the AI system 1.
  • the robot cleaner 300 may perform an initial connection procedure with the 5G network in order to transmit the source data to the 5G network.
  • the robot cleaner 300 may perform an initial access procedure with the 5G network based on a synchronization signal block (SSB).
  • the robot cleaner 300 may receive, from the network through the communication module, DCI (Downlink Control Information) used to schedule the transmission of the source data.
  • the robot cleaner 300 may transmit source data or a feature value extracted from the source data to the 5G network based on the DCI.
  • Source data or feature values extracted from source data are transmitted to a 5G network through PUSCH, and the SSB and DM-RS of the PUSCH may be QCLed for QCL type D.
  • the above-described specification can be implemented as computer-readable code on a medium in which a program is recorded.
  • the computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of computer-readable media include Hard Disk Drive (HDD), Solid State Disk (SSD), Silicon Disk Drive (SDD), ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
  • the computer-readable medium may also include a medium implemented in the form of a carrier wave (e.g., transmission over the Internet).

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

Disclosed are a cleaning area management method and a robot cleaner. A cleaning area management method according to an embodiment of the present specification: receives data related to contact surfaces in a plurality of indoor areas; stores the data related to the contact surface for each area; generates material (or texture) information about the contact surface for each area on the basis of the data related to the contact surface; and classifies the plurality of areas into one or more groups on the basis of the similarity of the material information generated for each area. The robot cleaner and an AI system according to the present invention may be connected to an artificial intelligence module, an unmanned aerial vehicle (UAV), a robot, an augmented reality (AR) device, a virtual reality (VR) device, a device related to 5G services, or the like.
PCT/KR2020/006990 2020-01-22 2020-05-29 Procédé de commande d'un robot de nettoyage selon le matériau de la surface en contact WO2021149879A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200008881A KR20210094961A (ko) 2020-01-22 2020-01-22 피접촉면의 재질에 따른 청소로봇 제어방법
KR10-2020-0008881 2020-01-22

Publications (1)

Publication Number Publication Date
WO2021149879A1 true WO2021149879A1 (fr) 2021-07-29

Family

ID=76993005

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/006990 WO2021149879A1 (fr) 2020-01-22 2020-05-29 Procédé de commande d'un robot de nettoyage selon le matériau de la surface en contact

Country Status (2)

Country Link
KR (1) KR20210094961A (fr)
WO (1) WO2021149879A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220167813A1 (en) * 2020-11-30 2022-06-02 The Boeing Company Smart industrial vacuum cleaner to reduce foreign object debris
CN114569003A (zh) * 2022-02-17 2022-06-03 美智纵横科技有限责任公司 可移动设备的控制方法、装置、可移动设备和存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160058594A (ko) * 2014-11-17 2016-05-25 삼성전자주식회사 로봇 청소기, 단말장치 및 그 제어 방법
KR20170033579A (ko) * 2015-09-17 2017-03-27 삼성전자주식회사 청소 로봇 및 그 제어 방법
KR101931365B1 (ko) * 2011-08-22 2018-12-24 삼성전자주식회사 로봇청소기 및 그 제어방법
KR20190092338A (ko) * 2019-05-30 2019-08-07 엘지전자 주식회사 청소 로봇
JP2019217208A (ja) * 2018-06-22 2019-12-26 東芝ライフスタイル株式会社 自律走行式掃除機

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101931365B1 (ko) * 2011-08-22 2018-12-24 삼성전자주식회사 로봇청소기 및 그 제어방법
KR20160058594A (ko) * 2014-11-17 2016-05-25 삼성전자주식회사 로봇 청소기, 단말장치 및 그 제어 방법
KR20170033579A (ko) * 2015-09-17 2017-03-27 삼성전자주식회사 청소 로봇 및 그 제어 방법
JP2019217208A (ja) * 2018-06-22 2019-12-26 東芝ライフスタイル株式会社 自律走行式掃除機
KR20190092338A (ko) * 2019-05-30 2019-08-07 엘지전자 주식회사 청소 로봇

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220167813A1 (en) * 2020-11-30 2022-06-02 The Boeing Company Smart industrial vacuum cleaner to reduce foreign object debris
US11882985B2 (en) * 2020-11-30 2024-01-30 The Boeing Company Smart industrial vacuum cleaner to reduce foreign object debris
CN114569003A (zh) * 2022-02-17 2022-06-03 美智纵横科技有限责任公司 可移动设备的控制方法、装置、可移动设备和存储介质

Also Published As

Publication number Publication date
KR20210094961A (ko) 2021-07-30

Similar Documents

Publication Publication Date Title
WO2021010506A1 (fr) Procédé et dispositif de régulation de la qualité de l'air intérieur utilisant un purificateur d'air intelligent
WO2021085778A1 (fr) Machine à laver intelligente
WO2020262737A1 (fr) Robot de nettoyage intelligent
WO2021149879A1 (fr) Procédé de commande d'un robot de nettoyage selon le matériau de la surface en contact
WO2020251079A1 (fr) Lave-linge intelligent et son procédé de commande
WO2020246649A1 (fr) Procédé au moyen duquel un dispositif informatique périphérique reconnaît une voix
WO2019031714A1 (fr) Procédé et appareil de reconnaissance d'objet
US11467604B2 (en) Control device and method for a plurality of robots
WO2020251066A1 (fr) Dispositif de robot intelligent
WO2020256177A1 (fr) Procédé de commande de véhicule
WO2021002486A1 (fr) Procédé de reconnaissance vocale et dispositif associé
KR20210128074A (ko) 립리딩 기반의 화자 검출에 따른 오디오 줌
WO2020246639A1 (fr) Procédé de commande de dispositif électronique de réalité augmentée
WO2020230943A1 (fr) Procédé de fourniture de service d'essayage de vêtement au moyen d'un avatar 3d, et système associé
WO2021020633A1 (fr) Appareil de traitement de vêtements basé sur l'intelligence artificielle et son procédé de fonctionnement
WO2020251067A1 (fr) Réfrigérateur exploitant l'intelligence artificielle et procédé de stockage d'aliments associé
WO2022030786A1 (fr) Procédé et appareil de fusion de mesures de radiofréquence et de capteur pour la gestion de faisceau
WO2021015301A1 (fr) Machine à laver intelligente et son procédé de commande
KR20210054796A (ko) 지능형 디바이스의 도어 오픈 모니터링
WO2019182378A1 (fr) Serveur d'intelligence artificielle
WO2017067465A1 (fr) Procédé et dispositif de reconnaissance de geste
WO2021010504A1 (fr) Lave-linge intelligent et procédé de commande dudit lave-linge intelligent
WO2020138856A1 (fr) Procédé et appareil pour une re-sélection de cellule dans un système de communication sans fil
WO2021150054A1 (fr) Procédé et appareil de gestion de mobilité dans un système de communication sans fil
WO2021167171A1 (fr) Commande pour dispositif de cuisson intelligent

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20915134

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20915134

Country of ref document: EP

Kind code of ref document: A1