CN115089078B - Intelligent robot control instruction generation method, control method and system

Info

Publication number: CN115089078B
Application number: CN202210912319.6A
Authority: CN (China)
Prior art keywords: intelligent robot, user, information, control, robot control
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN115089078A
Inventor: 陈秀
Current assignee / original assignee: Gree Electric Appliances Inc of Zhuhai
Filing date / priority date: 2022-07-30
Application filed by Gree Electric Appliances Inc of Zhuhai
Publication of CN115089078A: 2022-09-23
Application granted; publication of CN115089078B: 2023-11-24

Classifications

    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 - Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 - Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4061 - Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated


Abstract

The invention discloses an intelligent robot control instruction generation method, a control method and a system. The total operation space of the intelligent robot is divided into m sub-operation spaces and presented in three-dimensional form on an n-face model; the n-face model is rotated through a three-dimensional rotation control, and operation indication information is input for the operation face currently facing the user through an operation indication information input control; finally, an intelligent robot control instruction is generated according to the information of the operation face and the operation indication information input for the corresponding operation face. The invention makes switching between, browsing and operating on the sub-operation spaces more efficient, intuitive and convenient for the user. In addition, the multiple interactive trigger modes further improve the intelligence and convenience of operation and largely satisfy the user's personalized requirements for functional operation.

Description

Intelligent robot control instruction generation method, control method and system
Technical Field
The invention belongs to the technical field of intelligent robots, and particularly relates to an intelligent robot control instruction generation method, a control method and a control system.
Background
An intelligent robot intelligently responds to control instructions and completes the corresponding operation content. For example, cleaning robots have been widely used in modern households as intelligent home appliances that clean the floors of a house by responding to a user-set cleaning mode, a planned route, or the user's real-time control.
For the control of intelligent robots, how the robot itself responds to and completes control instructions is one important aspect; how the control instructions are generated is another.
In the prior art, a cleaning robot can complete corresponding control instructions well based on its stored map, its own motion mechanism, and its various sensors and operating components. For the generation of control instructions, however, especially for cleaning a specific location, the common practice is as follows: the user's handheld terminal (such as a mobile phone or an intelligent remote controller) calls up a planar map of the whole house stored locally or on a server, finds the area containing the specific position by enlarging the map, and then delineates the specific position; a cleaning control instruction is then generated based on this information. The cleaning control instruction may also include a specific cleaning time and cleaning mode settings; the cleaning mode information, such as cleaning intensity, intermittent control and a specific cleaning path (an arcuate path, or an inside-to-outside involute path), must be entered through a separate interface.
It can be seen that, in the prior art, determining the specific position to be cleaned and setting the cleaning mode through a separate interface is complicated, and neither intelligent nor intuitive.
Disclosure of Invention
The invention aims to provide an intelligent robot control instruction generation method, a control method and a system that generate intelligent robot control instructions more intelligently, conveniently and intuitively. The invention is realized by the following technical scheme.
The intelligent robot control instruction generation method comprises the following steps:
S1, creating an n-face model for interface display, dividing the total operation space of the intelligent robot into m sub-operation spaces, and presenting the topographic maps of the m sub-operation spaces as operation faces on m faces of the n-face model;
S2, configuring a three-dimensional rotation control and an operation indication information input control for the n-face model;
S3, rotating the n-face model through the three-dimensional rotation control, and inputting operation indication information for the operation face currently facing the user through the operation indication information input control;
S4, generating an intelligent robot control instruction according to the information of the operation face and the operation indication information input for the corresponding operation face.
Specifically, the three-dimensional rotation control and the operation indication information input control are both controls that respond to touch information sensed by a touch interface.
Specifically, the three-dimensional rotation control rotates the n-face model by sensing press-and-drag touch input at a corner of the n-face model.
Specifically, the operation indication information includes delineation information of an operation position and operation control information; in step S3, the operation position is delineated by touch within the topographic map of the operation face currently facing the user, and the operation control information is input by touch outside that topographic map.
Specifically, the operation control information includes an operation time; in step S3, the operation time is set automatically according to the duration of a long-press touch at a first position outside the topographic map of the operation face currently facing the user.
Specifically, the long-press touch triggers an operation time display bar.
Specifically, the operation control information includes operation mode information; in step S3, an operation mode setting page is called up by a tap touch at a second position outside the topographic map of the operation face currently facing the user, and the operation mode information is input through the operation mode setting page.
Further, in step S3, operation indication information is input for each of at least two operation faces while that face is facing the user; and in step S4, the information of the at least two operation faces and the operation indication information input for the corresponding faces are configured in time sequence to generate the intelligent robot control instruction.
An intelligent robot control method, comprising:
(1) Based on the above intelligent robot control instruction generation method, the intelligent robot control instruction is generated through the user handheld terminal, or through the user handheld terminal in cooperation with a server;
(2) The intelligent robot receives and responds to the intelligent robot control instruction.
An intelligent robot control system, comprising:
a user handheld terminal, or a user handheld terminal plus a server, that executes the above intelligent robot control instruction generation method; and
an intelligent robot that receives and responds to the intelligent robot control instruction.
The beneficial effects of the invention include: the total operation space of the intelligent robot is divided into m sub-operation spaces and displayed in three-dimensional form, so that the user can switch between, browse and operate on the sub-operation spaces more efficiently, intuitively and conveniently. In addition, the multiple interactive trigger modes further improve the intelligence and convenience of operation and largely satisfy the user's personalized requirements for functional operation.
Drawings
Fig. 1 is a flowchart of the intelligent robot control instruction generation method provided by an embodiment of the present invention.
Fig. 2 is an effect diagram of the m sub-operation spaces of the total operation space presented on the faces of the n-face model in the method provided by the embodiment.
Fig. 3 is an effect diagram of the topographic map displayed on one operation face of the n-face model, with an operation position delineated, in the method provided by the embodiment.
Fig. 4 is an effect diagram of the operation time being set automatically according to long-press duration on one operation face of the n-face model in the method provided by the embodiment.
Detailed Description
The intelligent robot control instruction generation method provided by this embodiment aims to generate, through the user handheld terminal, a control instruction for controlling the operation of an intelligent robot. In this embodiment the operation is a floor cleaning operation, but other forms of operation, such as article handling, are not excluded.
Referring to fig. 1, the method for generating an intelligent robot control instruction according to the present embodiment includes:
s1, creating an n-surface body model for interface display, dividing a total operation space of the intelligent robot into m sub-operation spaces, and respectively taking the topographic maps of the m sub-operation spaces as an operation surface and presenting the operation surfaces on m surfaces of the n-surface body model, wherein m is less than or equal to n. The n-face model can be created on a server for the user to call by the handheld terminal, or can be created in the intelligent robot control software of the user handheld terminal. Referring to fig. 2, in this embodiment, n=6, that is, the n-face model is a hexahedron, more specifically, a cube; of course, a polyhedron which is a triangular pyramid or more is not excluded. In addition, the total working space is a set of houses, and the set of houses can be divided into 6 sub-working spaces, for example, including: the topographic maps of the 6 sub-working spaces are respectively used as an operation surface and are displayed on 6 surfaces of the hexahedral model.
S2, configuring a three-dimensional rotation control and an operation indication information input control for the n-face model. Likewise, the two controls can be configured on the server to be called by the user handheld terminal, or configured in the intelligent robot control software on the user handheld terminal. Both are controls that respond to touch information sensed by a touch interface (for example, the touch screen of a mobile phone serving as the user handheld terminal). The three-dimensional rotation control lets the user rotate the n-face model so that a chosen operation face faces the user; the operation indication information input control is used to input operation indication information for the operation face currently facing the user.
S3, rotating the n-face model through the three-dimensional rotation control, and inputting operation indication information for the operation face currently facing the user through the operation indication information input control. The three-dimensional rotation control rotates the n-face model by sensing press-and-drag touch input at a corner of the n-face model.
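A rough sketch of how such a corner-triggered rotation could be wired up follows (again illustrative, not the patent's implementation; the event names, the corner hit radius and the degrees-per-pixel mapping are all assumptions):

import math

class ThreeDimensionalRotationControl:
    """Arms a rotation gesture only when a press lands on a corner of the projected model."""
    CORNER_RADIUS = 24.0  # px; assumed hit radius around each projected corner

    def __init__(self, projected_corners):
        self.projected_corners = projected_corners  # 2D screen positions of the model's corners
        self.drag_origin = None

    def on_press(self, x, y):
        # Only a press at (or near) a corner starts the press-and-drag rotation.
        if any(math.hypot(x - cx, y - cy) <= self.CORNER_RADIUS
               for cx, cy in self.projected_corners):
            self.drag_origin = (x, y)

    def on_drag(self, x, y, rotate):
        # `rotate(yaw_deg, pitch_deg)` is an assumed callback into the renderer.
        if self.drag_origin is None:
            return
        dx, dy = x - self.drag_origin[0], y - self.drag_origin[1]
        rotate(dx * 0.5, dy * 0.5)  # 0.5 degrees per pixel: an assumed mapping
        self.drag_origin = (x, y)

    def on_release(self):
        self.drag_origin = None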
In this embodiment, the operation indication information includes delineation information of the operation position and operation control information. Referring to fig. 3, the operation position is delineated by touch within the topographic map of the operation face currently facing the user (the topographic map of room 2; see the hatched portion in fig. 3), and the operation control information is input by touch outside that topographic map. Specifically, the operation control information includes an operation time; in S3, the operation time is set automatically according to the duration of a long press at a first position outside the topographic map of the operation face currently facing the user (the upper edge position indicated by the arrow in fig. 4), and the long press triggers an operation time display bar (not shown). The operation control information also includes operation mode information; in S3, an operation mode setting page (not shown) is called up by a tap at a second position outside the topographic map (for example, the lower edge or side edge position of the operation face in fig. 4), and the operation mode information, such as cleaning intensity, intermittent control and a specific cleaning path (an arcuate path, or an inside-to-outside involute path), is input through the operation mode setting page.
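The three touch behaviours just described, delineating inside the map, long-pressing at the first position to set the operation time, and tapping at the second position to call up the mode page, could be dispatched roughly as follows. This is a hedged sketch; the hit-test helpers, the duration-to-time scale and the handler names are assumptions:

import time

class OperationIndicationInputControl:
    """Dispatches touch input on the operation face currently facing the user."""
    SECONDS_PER_HELD_SECOND = 600  # assumption: each second of long press adds 10 minutes

    def __init__(self, face):
        self.face = face             # exposes the assumed hit-test helpers used below
        self.delineated_points = []  # operation position circled inside the topographic map
        self.operation_time = 0.0    # seconds, grows with long-press duration
        self.press_started_at = None

    def on_drag(self, x, y):
        if self.face.map_contains(x, y):           # assumed: inside the topographic map
            self.delineated_points.append((x, y))  # delineate the operation position

    def on_press(self, x, y):
        if self.face.at_first_position(x, y):      # assumed: e.g. the upper edge
            self.press_started_at = time.monotonic()
            # an operation time display bar would be shown while the press is held

    def on_release(self):
        if self.press_started_at is not None:
            held = time.monotonic() - self.press_started_at
            self.operation_time = held * self.SECONDS_PER_HELD_SECOND
            self.press_started_at = None

    def on_tap(self, x, y):
        if self.face.at_second_position(x, y):     # assumed: e.g. the lower or side edge
            self.open_operation_mode_page()

    def open_operation_mode_page(self):
        # placeholder for the mode-setting page (intensity, intermittent control, path)
        print("operation mode setting page opened")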
S4, generating an intelligent robot control instruction according to the information of the operation face and the operation indication information input for the corresponding operation face.
In this method, a control instruction can be generated for one operation face at a time, or one comprehensive control instruction can be generated after operation indication information has been input for several operation faces in turn. In the latter case, in S4, the information of the at least two operation faces and the operation indication information input for the corresponding faces are configured in time sequence to generate the intelligent robot control instruction.
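A minimal sketch of S4's time-sequence configuration, assuming the per-face inputs collected above, is given below. The field names and the simple back-to-back scheduling are assumptions; the patent leaves the ordering policy open:

from dataclasses import dataclass

@dataclass
class FaceInput:
    face_id: int             # which operation face (i.e. which sub-operation space)
    delineated_region: list  # points circled inside that face's topographic map
    operation_time: float    # seconds, as set by the long-press duration
    operation_mode: dict     # e.g. {"intensity": "high", "path": "arcuate"}

def generate_control_instruction(face_inputs):
    """Combine per-face operation indication information into one control instruction."""
    steps, offset = [], 0.0
    for fi in face_inputs:  # time-sequence configuration: here, simple entry order
        steps.append({
            "start_offset_s": offset,
            "face_id": fi.face_id,
            "region": fi.delineated_region,
            "duration_s": fi.operation_time,
            "mode": fi.operation_mode,
        })
        offset += fi.operation_time
    return {"type": "intelligent_robot_control_instruction", "steps": steps}

# Example: one comprehensive instruction covering two operation faces.
instruction = generate_control_instruction([
    FaceInput(2, [(1, 1), (1, 2)], 600.0, {"intensity": "high"}),
    FaceInput(5, [(0, 0), (2, 2)], 300.0, {"intensity": "low"}),
])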
The embodiment also provides an intelligent robot control method, comprising:
(1) Based on the above intelligent robot control instruction generation method, the intelligent robot control instruction is generated through the user handheld terminal, or through the user handheld terminal in cooperation with the server; (2) The intelligent robot receives and responds to the intelligent robot control instruction.
The embodiment also provides an intelligent robot control system, which comprises:
a user handheld terminal, or a user handheld terminal plus a server, that executes the above intelligent robot control instruction generation method; and an intelligent robot that receives and responds to the intelligent robot control instruction.
The above embodiments are intended to fully disclose, not to limit, the present invention; substitutions of equivalent technical features based on the gist of the present invention that can be obtained without inventive effort should be considered within the scope of this disclosure.

Claims (9)

1. An intelligent robot control instruction generation method, characterized by comprising the following steps:
S1, creating an n-face model for interface display, dividing the total operation space of the intelligent robot into m sub-operation spaces, and presenting the topographic maps of the m sub-operation spaces as operation faces on m faces of the n-face model;
S2, configuring a three-dimensional rotation control and an operation indication information input control for the n-face model, wherein the operation indication information input control is used for inputting operation indication information for the operation face currently facing the user;
S3, rotating the n-face model through the three-dimensional rotation control, and inputting operation indication information for the operation face currently facing the user through the operation indication information input control; the operation indication information comprises delineation information of an operation position and operation control information, the operation position being delineated within the topographic map of the operation face currently facing the user, and the operation control information being input outside that topographic map; the operation control information comprises an operation time, which is set automatically according to the duration of a long-press touch at a first position outside the topographic map of the operation face currently facing the user;
S4, generating an intelligent robot control instruction according to the information of the operation face and the operation indication information input for the corresponding operation face.
2. The intelligent robot control instruction generation method according to claim 1, wherein the three-dimensional rotation control and the operation indication information input control are both controls that respond to touch information sensed by a touch interface.
3. The intelligent robot control instruction generation method according to claim 2, wherein the three-dimensional rotation control rotates the n-face model by sensing press-and-drag touch input at a corner of the n-face model.
4. The intelligent robot control instruction generation method according to claim 2, wherein in S3, the operation position is delineated by touch within the topographic map of the operation face currently facing the user, and the operation control information is input by touch outside that topographic map.
5. The intelligent robot control instruction generation method according to claim 1, wherein the long-press touch triggers an operation time display bar.
6. The intelligent robot control instruction generation method according to claim 4, wherein the operation control information includes operation mode information; in S3, an operation mode setting page is called up by a tap touch at a second position outside the topographic map of the operation face currently facing the user, and the operation mode information is input through the operation mode setting page.
7. The intelligent robot control instruction generation method according to any one of claims 1 to 6, wherein in S3, operation indication information is input for each of at least two operation faces while that face is facing the user; and in S4, the information of the at least two operation faces and the operation indication information input for the corresponding faces are configured in time sequence to generate the intelligent robot control instruction.
8. An intelligent robot control method, comprising:
(1) Generating the intelligent robot control instruction through a user handheld terminal, or through a user handheld terminal in cooperation with a server, based on the intelligent robot control instruction generation method of any one of claims 1 to 7;
(2) The intelligent robot receives and responds to the intelligent robot control instruction.
9. An intelligent robot control system, comprising:
a user handheld terminal, or a user handheld terminal plus a server, that performs the intelligent robot control instruction generation method of any one of claims 1 to 7; and
an intelligent robot that receives and responds to the intelligent robot control instruction.
CN202210912319.6A (filed 2022-07-30, priority 2022-07-30): Intelligent robot control instruction generation method, control method and system. Status: Active. Granted as CN115089078B.

Priority Applications (1)

Application Number: CN202210912319.6A
Priority Date: 2022-07-30
Filing Date: 2022-07-30
Title: Intelligent robot control instruction generation method, control method and system

Publications (2)

Publication Number / Publication Date:
CN115089078A (en): 2022-09-23
CN115089078B (en): 2023-11-24

Family

Family ID: 83300741
Family Applications (1): CN202210912319.6A (Active): Intelligent robot control instruction generation method, control method and system
Country Status (1): CN

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101957719A (en) * 2009-07-14 2011-01-26 Lg电子株式会社 Portable terminal and display control method thereof
CN108214492A (en) * 2017-12-29 2018-06-29 北京视觉世界科技有限公司 Clean clean method, device, computer equipment and the storage medium in region
CN110554615A (en) * 2019-09-06 2019-12-10 珠海格力电器股份有限公司 method and device for centralized control and management of intelligent household equipment
CN111638824A (en) * 2020-05-27 2020-09-08 维沃移动通信(杭州)有限公司 Unread message display method and device and electronic equipment
CN112558832A (en) * 2020-12-16 2021-03-26 珠海格力电器股份有限公司 Scene setting method and device, electronic equipment and storage medium
CN112842149A (en) * 2021-02-03 2021-05-28 追创科技(苏州)有限公司 Control method of intelligent cleaning equipment and intelligent cleaning equipment
CN113995355A (en) * 2021-09-28 2022-02-01 云鲸智能(深圳)有限公司 Robot management method, device, equipment and readable storage medium
CN114158980A (en) * 2020-09-11 2022-03-11 科沃斯机器人股份有限公司 Job method, job mode configuration method, device, and storage medium
CN114468856A (en) * 2022-01-10 2022-05-13 珠海格力电器股份有限公司 Sweeping robot control method and device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9375847B2 (en) * 2013-01-18 2016-06-28 iRobot Corporation Environmental management systems including mobile robots and methods using same

Also Published As

Publication number Publication date
CN115089078A (en) 2022-09-23


Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant