CN111685657A - Information interaction control method and system - Google Patents


Info

Publication number
CN111685657A
CN111685657A (application CN201910382404.4A)
Authority
CN
China
Prior art keywords
coordinate
coordinate point
type
track
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910382404.4A
Other languages
Chinese (zh)
Inventor
田云龙
王晔
喻建琦
孙磊
白宏磊
周华
彭迪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Haier Co Ltd
Qingdao Haier Smart Technology R&D Co Ltd
Original Assignee
Qingdao Haier Co Ltd
Qingdao Haier Smart Technology R&D Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Haier Co Ltd, Qingdao Haier Smart Technology R&D Co Ltd filed Critical Qingdao Haier Co Ltd
Publication of CN111685657A


Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011: Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor

Abstract

The application discloses an information interaction control method and system in the field of communication technology. Position information of a mobile device is acquired, and track or identification area information is drawn from the position information. The movement path of the mobile device can thus be checked in real time, which enhances the product functionality of the mobile device, makes it more intelligent, and improves the user experience.

Description

Information interaction control method and system
Technical Field
The present application relates to the field of communications technologies, and in particular, to a method and a system for controlling information interaction.
Background
With the development of communication network technology and users' rising expectations for household products, Internet-of-Things smart home appliances are attracting more and more attention. Their design and development is no longer confined to the experimental research stage: various smart appliances have begun to enter the market on a small scale, targeting ordinary consumers.
As smart home technology advances and users' living standards rise, users' demands on the intelligence and user experience of mobile devices, particularly sweepers, keep growing. Because different sweepers implement different interaction logic, their cleaning paths and working results also differ, and users increasingly need to query a sweeper's cleaning track through the application that associates the sweeper with the user terminal. In the process of implementing the embodiments of the present disclosure, at least the following problem was found in the related art: the prior art can recognize the cleaning track of the sweeper, but the user terminal of the mobile device cannot display the cleaning track to the user in real time.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, nor is it intended to identify key/critical elements or to delineate the scope of such embodiments; rather, it serves as a prelude to the more detailed description presented later.
According to an aspect of the embodiments of the present disclosure, a method for controlling information interaction is provided.
In some optional embodiments, the method comprises: acquiring position information of a mobile device; and drawing track or identification area information through the position information.
According to another aspect of the disclosed embodiments, a control system for information interaction is provided.
In some optional embodiments, the system comprises: the user terminal is configured to send a query instruction, receive position information sent by the mobile equipment, draw track or identification area information according to the position information and display a track image; and sending an instruction to the mobile equipment, and triggering the mobile equipment to calculate and store the coordinate point and the type corresponding to the coordinate point.
According to another aspect of an embodiment of the present disclosure, an electronic device is provided.
In some optional embodiments, the electronic device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor, and the instructions, when executed by the at least one processor, cause the at least one processor to perform the above-mentioned control method of information interaction.
According to another aspect of an embodiment of the present disclosure, a computer-readable storage medium is provided.
In some alternative embodiments, the computer-readable storage medium stores computer-executable instructions configured to perform the above-described method of controlling information interaction.
According to another aspect of an embodiment of the present disclosure, a computer program product is provided.
In some alternative embodiments, the computer program product comprises a computer program stored on a computer-readable storage medium, the computer program comprising program instructions that, when executed by a computer, cause the computer to perform the above-described method of controlling information interaction.
Some technical solutions provided by the embodiments of the present disclosure can achieve the following technical effects: the user terminal interacts with the mobile device, receives the position information of the mobile device, draws the moving track of the mobile device, displays the moving track to the user through the display screen, realizes real-time checking of the moving path of the mobile device, enhances the product function of the mobile device, enables the mobile device to be more intelligent, and improves user experience.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, which are not limiting; elements sharing the same reference numeral denote like elements:
FIG. 1 is a schematic flow chart diagram of one embodiment provided by an embodiment of the present disclosure;
FIG. 2 is a schematic diagram illustrating an embodiment of the present disclosure;
FIG. 3 is a schematic structural diagram of an electronic device provided in an embodiment of the present disclosure.
Reference numerals:
100. a processor; 101. a memory; 102. a communication interface; 103. a bus; 1. a user terminal; 2. a mobile device; 3. a server.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the disclosed embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. In the following description, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, one or more embodiments may be practiced without these details. In other instances, well-known structures and devices may be shown in simplified form in order to simplify the drawings.
As shown in fig. 1, an embodiment of the present disclosure provides a method for controlling information interaction, including:
s101, acquiring position information of the mobile equipment;
and S102, drawing track or identification area information through the position information.
Optionally, the location information includes:
an initial position and a current coordinate point; wherein the initial position is a position where the mobile device starts moving; the method further comprises the following steps: triggering the mobile equipment to record angle information at a frequency f1, and acquiring mobile mileage data; and triggering the mobile equipment to calculate the current coordinate point by the angle information and the mobile mileage data.
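The current coordinate point can be computed from the recorded angle information and mileage data by simple dead reckoning. A minimal sketch follows; the function name, the use of degrees for the gyroscope angle, and the per-step mileage delta are assumptions for illustration, not details specified by the patent:

```python
import math

def update_position(x, y, heading_deg, delta_mileage):
    """Advance the estimated position by one odometry step.

    heading_deg: angle reported by the gyroscope (sampled at frequency f1).
    delta_mileage: distance travelled since the previous sample.
    """
    rad = math.radians(heading_deg)
    return x + delta_mileage * math.cos(rad), y + delta_mileage * math.sin(rad)

# Accumulate samples starting from the initial position.
x, y = 0.0, 0.0
for heading, dist in [(0, 10), (90, 5)]:
    x, y = update_position(x, y, heading, dist)
print((round(x, 3), round(y, 3)))  # (10.0, 5.0)
```

Summing one such step per gyroscope/odometer sample yields the current coordinate point relative to the initial position.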
Optionally, the location information further includes a coordinate point type; the method further comprises the following steps:
triggering the mobile device, at intervals of t1 (t1 > 0), to store the coordinates of the current coordinate point and set the current track coordinate point to a first coordinate type, representing a position occupied by the mobile device.
Optionally, when the mobile device is triggered to operate to the current coordinate point, determining whether an infrared sensor and a microswitch are triggered, and when the infrared sensor and the microswitch are not triggered, setting the coordinate point to be a second coordinate type; and when the infrared sensor and the micro switch are triggered, setting the coordinate point as a third coordinate type.
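The three coordinate types above can be sketched as a small classification helper. The numeric codes mirror the status values used later in the description ("1" = track point, "0" = cleaned area, "2" = obstacle); the function and constant names are illustrative. The text is ambiguous about whether both sensors must fire or either suffices, so the sketch treats either sensor as an obstacle indication:

```python
FIRST_TYPE = 1   # track point: a position the device occupied
SECOND_TYPE = 0  # cleaned area: sensors not triggered
THIRD_TYPE = 2   # obstacle: infrared sensor / microswitch triggered

def classify_point(infrared_triggered, microswitch_triggered):
    """Return the coordinate type for the current point from the sensor state."""
    if infrared_triggered or microswitch_triggered:
        return THIRD_TYPE
    return SECOND_TYPE

print(classify_point(False, False), classify_point(True, True))  # 0 2
```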
Optionally, the method further includes storing the coordinates and the type of the current coordinate point at intervals of t2, where t2 > 0.
Optionally, drawing the track comprises: connecting the track coordinate points of the first coordinate type; and marking the most recently received track coordinate point with an icon.
Optionally, identifying area information comprises marking the coordinate points of the second coordinate type and the third coordinate type with different color blocks.
After new position information is received, the previously drawn track or identification area information is continuously updated.
By obtaining the position information of the device, drawing the track or identification area information from it, and displaying the result to the user on a display screen, the movement path of the mobile device can be checked in real time; this enhances the product functionality of the mobile device, makes the device more intelligent, and improves the user experience.
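The drawing steps described above (connect the first-type track points, mark the most recent point with an icon, and mark second/third-type points with color blocks) can be sketched as the construction of drawing primitives that a UI layer would render. All names here are illustrative, not from the patent:

```python
def build_drawing(points):
    """points: list of ((x, y), type_code) with 1 = track point,
    0 = cleaned area, 2 = obstacle.  Returns primitives for a UI layer."""
    track = [p for p, t in points if t == 1]
    segments = list(zip(track, track[1:]))   # line segments joining track points
    marker = track[-1] if track else None    # icon at the most recent track point
    color_blocks = {
        "cleaned": [p for p, t in points if t == 0],
        "obstacle": [p for p, t in points if t == 2],
    }
    return segments, marker, color_blocks

segs, marker, blocks = build_drawing(
    [((0, 0), 1), ((1, 0), 0), ((2, 0), 1), ((3, 0), 2)]
)
print(len(segs), marker, len(blocks["obstacle"]))  # 1 (2, 0) 1
```

On each position update the primitives would simply be rebuilt (or extended) and redrawn, giving the continuous update described above.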
As shown in fig. 2, an embodiment of the present disclosure further provides an information interaction control system, including:
the user terminal 1 is configured to send a query instruction, receive the position information sent by the mobile device 2, draw a track or mark area information according to the position information, and display a track image; and sending an instruction to the mobile device 2, and triggering the mobile device 2 to calculate and store the coordinate point and the type corresponding to the coordinate point.
Optionally, the user terminal 1 sends the query instruction to the server 3 over the user terminal's WiFi connection; after receiving the query instruction, the server 3 forwards it to the mobile device 2 over the mobile device's WiFi connection; after receiving the query instruction, the mobile device uploads the position information of its movement track to the server 3; and after receiving the position information, the server 3 sends it on to the user terminal.
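The relay chain (terminal → server → device and back) can be sketched with direct calls standing in for the WiFi links. The class and method names are illustrative, not from the patent:

```python
class Device:
    """Stands in for the mobile device (sweeper)."""
    def __init__(self):
        self.positions = [(100, 100, 1)]  # recorded (x, y, type) samples

    def handle_query(self):
        # On receiving the query, upload the recorded track positions.
        return list(self.positions)

class Server:
    """Relays the query to the device and the reply back to the terminal."""
    def __init__(self, device):
        self.device = device

    def relay_query(self):
        return self.device.handle_query()

class Terminal:
    def __init__(self, server):
        self.server = server

    def query_track(self):
        return self.server.relay_query()

print(Terminal(Server(Device())).query_track())  # [(100, 100, 1)]
```

In the real system each call would be a WiFi message, but the request/response shape is the same.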
Optionally, the user terminal is further configured to: trigger the mobile device to record angle information through a gyroscope at a frequency f1; and trigger the mobile device to acquire mileage data through the odometer.
Optionally, the location information includes an initial location, a current coordinate point, and a coordinate point type;
the initial position is the position of the mobile device when the mobile device starts to move;
the user terminal is further configured to: triggering the mobile equipment to set the current track coordinate point as a first coordinate type; when the current coordinate point is operated, judging whether the infrared sensor and the microswitch are triggered, and when the infrared sensor and the microswitch are not triggered, setting the coordinate point as a second coordinate type; when the infrared sensor and the microswitch are triggered, setting the coordinate point as a third coordinate type;
the user terminal is further configured to: connecting the track coordinate points of the first coordinate type; marking the finally received track coordinate point by using an icon; and respectively marking the coordinate points of the second coordinate type and the third coordinate type by using different color blocks.
The user terminal interacts with the mobile device, receives the position information of the mobile device, draws the moving track of the mobile device, displays the moving track to the user through the display screen, realizes real-time checking of the moving path of the mobile device, enhances the product function of the mobile device, enables the mobile device to be more intelligent, and improves user experience.
Optionally, when the user terminal triggers the sweeper to start, the position of the sweeper is taken as the initial position and its coordinates are set to (100, 100). While the sweeper runs, the internal gyroscope chip records the current angle information at a frequency of 50 Hz, mileage data is acquired from the odometer, and the position of the current coordinate point relative to the initial position, i.e. the coordinates (x, y), is calculated from the angle information and the mileage data.
Optionally, the mobile device is a sweeper, and the user terminal triggers the sweeper to store the coordinates of the current coordinate point every 500 ms. These are track coordinate points; their status is defined as "1", representing a position occupied by the sweeper.
Optionally, when the sweeper, triggered by the user terminal, runs to the current coordinate point, it judges whether the infrared sensor or the microswitch has been triggered by a collision. If neither has been triggered, the coordinate point status is defined as "0", representing a cleaned area; if triggered, the status is defined as "2", representing an obstacle.
Optionally, the user terminal connects the track coordinate points of type "1", which represents the sweeping path of the sweeper.
Optionally, the most recently received track coordinate point is marked with an icon, which represents the real-time location of the sweeper.
Optionally, after receiving the position information uploaded by the sweeper, the user terminal draws a map on the display screen in which coordinate points of types "0" and "2" are marked with different color blocks, representing the cleaned area and obstacles respectively.
Optionally, the storage module of the sweeper stores the coordinates and coordinate type of the coordinate points once every 100ms, i.e. at a frequency of 10 Hz.
Optionally, the user terminal sends a query instruction every 2 seconds, i.e. the sweeper uploads position information every 2 seconds. Each upload contains information for 24 coordinate points: 4 track coordinate points of type "1" and 20 coordinate points of type "0" or type "2".
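The counts in this example are internally consistent: storing a coordinate and type at 10 Hz yields 20 points per 2-second query window, and the 500 ms track interval contributes 4 track points, giving 24 points per upload. A quick arithmetic check:

```python
QUERY_INTERVAL_S = 2.0
STORE_PERIOD_S = 0.1   # coordinate + type stored every 100 ms (10 Hz)
TRACK_PERIOD_S = 0.5   # track coordinate point stored every 500 ms

area_points = round(QUERY_INTERVAL_S / STORE_PERIOD_S)   # points of type "0"/"2"
track_points = round(QUERY_INTERVAL_S / TRACK_PERIOD_S)  # points of type "1"
print(area_points, track_points, area_points + track_points)  # 20 4 24
```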
Optionally, after the user terminal triggers the sweeper and receives the position information, the map is continuously updated on top of the previously drawn track, i.e. the map is refreshed approximately every 2 seconds.
The user terminal receives and parses the position information from the sweeper and then draws the sweeper's cleaning track, so the cleaning path of the sweeper can be checked in real time through the user terminal; this enhances the product functionality of the sweeper, makes it more intelligent, and improves the user experience.
The embodiment of the disclosure also provides a computer-readable storage medium, which stores computer-executable instructions, wherein the computer-executable instructions are configured to execute the control method for information interaction.
The disclosed embodiments also provide a computer program product comprising a computer program stored on a computer-readable storage medium, the computer program comprising program instructions that, when executed by a computer, cause the computer to execute the above-mentioned control method of information interaction.
The computer-readable storage medium described above may be a transitory computer-readable storage medium or a non-transitory computer-readable storage medium.
An embodiment of the present disclosure further provides an electronic device, a structure of which is shown in fig. 3, and the electronic device includes:
at least one processor (processor) 100 (one processor 100 is shown as an example in FIG. 3); and a memory (memory) 101; and may further include a communication interface (Communication Interface) 102 and a bus 103. The processor 100, the communication interface 102, and the memory 101 may communicate with one another via the bus 103. The communication interface 102 may be used for information transfer. The processor 100 may call logic instructions in the memory 101 to execute the information interaction control method of the above embodiments.
In addition, when sold or used as an independent product, the logic instructions in the memory 101 may be implemented in the form of software functional units and stored in a computer-readable storage medium.
The memory 101, which is a computer-readable storage medium, may be used for storing software programs, computer-executable programs, such as program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 100 executes functional applications and data processing by executing software programs, instructions and modules stored in the memory 101, that is, implements the control method of information interaction in the above method embodiments.
The memory 101 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal device, and the like. In addition, the memory 101 may include a high-speed random access memory, and may also include a nonvolatile memory.
The technical solution of the embodiments of the present disclosure may be embodied in the form of a software product, where the computer software product is stored in a storage medium and includes one or more instructions to enable a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method of the embodiments of the present disclosure. And the aforementioned storage medium may be a non-transitory storage medium comprising: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes, and may also be a transient storage medium.
The above description and drawings sufficiently illustrate embodiments of the disclosure to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. The scope of the disclosed embodiments includes the full ambit of the claims, as well as all available equivalents of the claims. Although the terms "first," "second," etc. may be used in this application to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without changing the meaning of the description, so long as all occurrences of the "first element" are renamed consistently and all occurrences of the "second element" are renamed consistently. The first and second elements are both elements, but may not be the same element. Furthermore, the words used in the specification are words of description only and are not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application is meant to encompass any and all possible combinations of one or more of the associated listed items.
Furthermore, the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method or apparatus that comprises the element. In this document, each embodiment may be described with emphasis on differences from other embodiments, and the same and similar parts between the respective embodiments may be referred to each other. For methods, products, etc. of the embodiment disclosures, reference may be made to the description of the method section for relevance if it corresponds to the method section of the embodiment disclosure.
Those of skill in the art would appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software may depend upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments. It can be clearly understood by the skilled person that, for convenience and brevity of description, the specific working processes of the system, the apparatus and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments disclosed herein, the disclosed methods, products (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units may be merely a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to implement the present embodiment. In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

Claims (10)

1. A method for controlling information interaction is characterized by comprising the following steps:
acquiring position information of a mobile device;
and drawing track or identification area information through the position information.
2. The method of claim 1, wherein the location information comprises: an initial position and a current coordinate point; wherein the initial position is a position where the mobile device starts moving;
the method further comprises the following steps:
triggering the mobile equipment to record angle information at the frequency f1, and acquiring mobile mileage data;
and triggering the mobile equipment to calculate the current coordinate point by the angle information and the mobile mileage data.
3. The method of claim 2, wherein the location information further includes a coordinate point type;
the method further comprises the following steps: triggering the mobile equipment at intervals of t1, storing the coordinates of the current coordinate point, and setting the current track coordinate point as a first coordinate type; wherein t1 is greater than 0.
4. The method of claim 2, further comprising:
triggering the mobile equipment, judging whether an infrared sensor and a microswitch are triggered when the mobile equipment runs to the current coordinate point, and setting the coordinate point as a second coordinate type when the infrared sensor and the microswitch are not triggered; and when the infrared sensor and the micro switch are triggered, setting the coordinate point as a third coordinate type.
5. The method of claim 2, further comprising:
storing the coordinates and the type of the current coordinate point at intervals of t2; wherein t2 is greater than 0.
6. The method of claim 1, wherein drawing the trajectory comprises:
connecting the track coordinate points of the first coordinate type; and marking the finally received track coordinate point by using an icon.
7. The method of claim 1, wherein the identifying the region information comprises:
and respectively marking the coordinate points of the second coordinate type and the third coordinate type by using different color blocks.
8. An information interaction control system, comprising:
the user terminal is configured to send a query instruction, receive position information sent by the mobile equipment, draw track or identification area information according to the position information and display a track image; and sending an instruction to the mobile equipment, and triggering the mobile equipment to calculate and store the coordinate point and the type corresponding to the coordinate point.
9. The system of claim 8, wherein the user terminal is further configured to: triggering the mobile device to record angle information through a gyroscope at a frequency f1; and triggering the mobile equipment to acquire mobile mileage data through the odometer.
10. The system of claim 8,
the position information comprises an initial position, a current coordinate point and a coordinate point type;
the initial position is the position of the mobile device when the mobile device starts to move;
the user terminal is further configured to: triggering the mobile equipment to set the current track coordinate point as a first coordinate type; when the current coordinate point is operated, judging whether the infrared sensor and the microswitch are triggered, and when the infrared sensor and the microswitch are not triggered, setting the coordinate point as a second coordinate type; when the infrared sensor and the microswitch are triggered, setting the coordinate point as a third coordinate type;
the user terminal is further configured to: connecting the track coordinate points of the first coordinate type; marking the finally received track coordinate point by using an icon; and respectively marking the coordinate points of the second coordinate type and the third coordinate type by using different color blocks.
CN201910382404.4A 2019-03-14 2019-05-09 Information interaction control method and system Pending CN111685657A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2019101943994 2019-03-14
CN201910194399 2019-03-14

Publications (1)

Publication Number Publication Date
CN111685657A true CN111685657A (en) 2020-09-22

Family

ID=72476016

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910382404.4A Pending CN111685657A (en) 2019-03-14 2019-05-09 Information interaction control method and system

Country Status (1)

Country Link
CN (1) CN111685657A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114488836A (en) * 2022-01-24 2022-05-13 珠海格力电器股份有限公司 Intelligent device control method and device, electronic device, cleaning system and medium
WO2023124859A1 (en) * 2021-12-28 2023-07-06 速感科技(北京)有限公司 Cleaning robot, cleaning methods thereof and computer readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104536447A (en) * 2014-12-29 2015-04-22 重庆广建装饰股份有限公司 Navigation method for sweeping robot
CN106200639A (en) * 2016-07-20 2016-12-07 成都广迈科技有限公司 A kind of method of sweeping the floor of full-automatic sweeping robot
CN106292662A (en) * 2016-08-09 2017-01-04 衣佳鑫 Robot trajectory's method for drafting and system
CN106537169A (en) * 2015-01-22 2017-03-22 江玉结 Color block tag-based localization and mapping method and device thereof
CN108572646A (en) * 2018-03-19 2018-09-25 深圳悉罗机器人有限公司 The rendering method and system of robot trajectory and environmental map


Similar Documents

Publication Publication Date Title
CN110383274B (en) Method, device, system, storage medium, processor and terminal for identifying equipment
JP5942456B2 (en) Image processing apparatus, image processing method, and program
CN108548300B (en) Air supply method and device of air conditioner and electronic equipment
CN106643774B (en) Navigation route generation method and terminal
CN102473068A (en) Information processing device, information processing method, and program
WO2022267795A1 (en) Regional map processing method and apparatus, storage medium, and electronic device
CN108919653B (en) Method and device for searching home equipment
CN111685657A (en) Information interaction control method and system
JP2013164697A (en) Image processing device, image processing method, program and image processing system
CN112083801A (en) Gesture recognition system and method based on VR virtual office
CN111338721A (en) Online interaction method, system, electronic device and storage medium
CN115265520A (en) Intelligent operation equipment and mapping method, device and storage medium thereof
CN113778294A (en) Processing method, device, equipment and medium for AVP (Audio video Standard) interactive interface
CN113345108B (en) Augmented reality data display method and device, electronic equipment and storage medium
US20210088348A1 (en) Method and apparatus for acquiring information
CN108874141B (en) Somatosensory browsing method and device
CN110505287B (en) Service-based business line recommendation method, device and storage medium
CN109298782B (en) Eye movement interaction method and device and computer readable storage medium
CN114726664B (en) Binding method and binding equipment for household equipment
CN112446651A (en) Method and device for monitoring transportation equipment
CN113055707B (en) Video display method and device
CN109640314B (en) Method and device for processing attribution data and computer readable storage medium
CN114519123A (en) Sweeper history track playback interaction method and device, storage medium and sweeper
CN109284730B (en) Method and device applied to screening data and monitoring system
CN113869097A (en) Information display method and device, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200922