CN117234403A - Interaction method and electronic equipment - Google Patents


Info

Publication number
CN117234403A
Authority
CN
China
Prior art keywords
cleaning
interface
electronic device
sequence
mode
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311281810.4A
Other languages
Chinese (zh)
Inventor
张少华
劳鹏飞
任娟娟
叶力荣
Current Assignee
Shenzhen Silver Star Intelligent Group Co Ltd
Original Assignee
Shenzhen Silver Star Intelligent Group Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Silver Star Intelligent Group Co Ltd filed Critical Shenzhen Silver Star Intelligent Group Co Ltd
Priority to CN202311281810.4A
Publication of CN117234403A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the application relate to the technical field of robots and disclose an interaction method and an electronic device. The electronic device displays a first interface that includes a cleaning map shown in a first display mode, the cleaning map containing at least two cleaning areas. The user may slide a finger across the touch screen of the electronic device to form a swipe gesture; in response, the electronic device displays a second interface that includes the cleaning map shown in a second display mode, which presents the cleaning sequence of the cleaning areas selected in turn by the swipe gesture. In this way, the user can freely choose the cleaning areas to be cleaned by sliding a finger on the touch screen, the touch order of the gesture determines the cleaning sequence, and the second display mode reflects that sequence for the user to view. Compared with tapping cleaning areas one by one, selecting the areas and setting their order with a single swipe gesture makes the interaction intelligent, concise, quick, and efficient.

Description

Interaction method and electronic equipment
Technical Field
The embodiment of the application relates to the technical field of robots, in particular to an interaction method and electronic equipment.
Background
Cleaning robots are an important branch of the modern robotics field. A cleaning robot is a robot capable of autonomously performing cleaning tasks in environments such as home rooms and large venues. Common cleaning robots include sweeping robots, mopping robots, and combined sweeping-and-mopping machines.
Typically, a cleaning robot has an associated application program that the user can operate through a mobile terminal to control and interact with the robot, for example to determine a cleaning sequence or set a cleaning mode. However, the interaction methods of some related applications known to the inventors are complicated to operate and are neither intelligent nor concise.
Disclosure of Invention
In view of this, some embodiments of the present application provide an interaction method, an electronic device, and a storage medium. The method makes the application associated with the cleaning robot more intelligent and concise in interactive control and can effectively improve the intelligent interaction experience.
In a first aspect, some embodiments of the present application provide an interaction method, including:
the electronic equipment displays a first interface, wherein the first interface comprises a cleaning map displayed in a first display mode, and the cleaning map comprises at least two cleaning areas;
The electronic device displays a second interface in response to the swipe gesture, the second interface including a cleaning map displayed in a second display manner, the second display manner including a sequence of displaying cleaning areas sequentially selected in response to the swipe gesture.
In some embodiments, in the first interface, each cleaning zone includes a selection control;
in the second interface, the second display mode includes displaying a cleaning sequence of target selection controls sequentially selected in response to the sliding gesture, the target selection controls being connected in a line in a touch sequence.
In some embodiments, the swipe gesture includes a first swipe segment for indicating a selected cleaning area or a second swipe segment for indicating bypassing the cleaning area;
in the second interface, the second display means includes displaying a cleaning sequence of cleaning areas sequentially selected in response to the first sliding section.
In some embodiments, the method further comprises:
the electronic device updates the second interface in response to the deselected first operation such that the cleaning region deselected by the first operation is removed from the cleaning sequence.
In some embodiments, the method further comprises:
and in the case that the sliding gesture is closed end to end, cleaning the target cleaning area in the cleaning sequence in a first cleaning mode.
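Purely as an illustrative sketch, and not part of the patent's disclosure: one plausible way to decide that a swipe gesture is "closed end to end" is to compare its first and last sampled touch points. The function name and the pixel tolerance below are assumptions.

```python
import math

def is_closed(touch_path, tolerance=30):
    """Treat a swipe as closed end-to-end when its first and last sampled
    points lie within `tolerance` pixels of each other (assumed threshold)."""
    (x0, y0), (x1, y1) = touch_path[0], touch_path[-1]
    return math.hypot(x1 - x0, y1 - y0) <= tolerance

# A loop that returns near its starting point counts as closed and could
# then trigger cleaning the selected areas in the first cleaning mode.
closed = is_closed([(10, 10), (200, 10), (200, 200), (12, 14)])
open_swipe = is_closed([(10, 10), (200, 200)])
```

A tolerance is needed because a finger rarely returns to the exact starting pixel; the value would be tuned to the screen density in practice.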
In some embodiments, before the electronic device displays the first interface, the method further comprises:
the electronic equipment displays a third interface, wherein the third interface comprises a setting window and a preview control for editing the cleaning mode;
the electronic device displays a first interface comprising: after receiving a second operation for editing the cleaning mode in the setting window, the electronic device displays the first interface in response to a third operation acting on the preview control, each cleaning area including a cleaning mode icon.
In some embodiments, the method further comprises:
the electronic device responds to the fourth operation on the target cleaning mode icon, and a cleaning mode setting popup window corresponding to the target cleaning mode icon is displayed;
and the electronic equipment responds to the editing operation of the setting popup window and updates the cleaning mode icon of the cleaning area corresponding to the target cleaning mode icon in the first interface.
In some embodiments, the method further comprises:
the electronic device adaptively adjusts a display ratio of the first interface based on a screen size thereof.
In a second aspect, some embodiments of the present application provide an electronic device, comprising:
at least one processor;
a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the interaction method of the first aspect.
In a third aspect, some embodiments of the present application provide a computer-readable storage medium having stored thereon computer-executable instructions for causing a computer device to perform the interaction method of the first aspect.
According to the interaction method provided by the embodiments of the application, the electronic device displays a first interface that includes a cleaning map shown in a first display mode, the cleaning map containing at least two cleaning areas. While the first interface is displayed, the user may slide a finger across the touch screen of the electronic device to form a swipe gesture; in response, the electronic device displays a second interface that includes the cleaning map shown in a second display mode, which presents the cleaning sequence of the cleaning areas selected in turn by the swipe gesture. In this embodiment, the user can freely select the cleaning areas to be cleaned by sliding a finger on the touch screen, the touch order of the swipe gesture defines the cleaning sequence, and the second display mode reflects that sequence for the user to view. Compared with selecting cleaning areas one by one, selecting the areas and determining their order with a single swipe gesture according to actual needs makes the interaction intelligent, concise, quick, and efficient.
Drawings
One or more embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements, and in which the figures of the drawings are not to be taken in a limiting sense, unless otherwise indicated.
FIG. 1 is a schematic view of an application environment of an interaction method according to some embodiments of the present application;
FIG. 2 is a flow chart of an interaction method in some embodiments of the application;
FIG. 3 is a schematic diagram of a cleaning map in accordance with some embodiments of the application;
FIG. 4 is a schematic diagram of a first interface and a second interface according to some embodiments of the application;
FIG. 5 is a schematic diagram of a first interface and a second interface according to some embodiments of the application;
FIG. 6 is a schematic diagram of a first interface and a second interface in some embodiments of the application;
FIG. 7 is a schematic diagram of a first interface and a second interface in some embodiments of the application;
FIG. 8 is a schematic diagram of a first interface and a second interface in some embodiments of the application;
FIG. 9 is a schematic diagram of a closed swipe gesture in some embodiments of the application;
FIG. 10 is a schematic illustration of a third interface in some embodiments of the application;
FIG. 11 is a schematic diagram of a first interface according to some embodiments of the application;
Fig. 12 is a schematic structural diagram of an electronic device according to some embodiments of the application.
Detailed Description
The present application will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the present application, but are not intended to limit it in any way. It should be noted that variations and modifications can be made by those skilled in the art without departing from the inventive concept; these all fall within the scope of the present application.
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
It should be noted that, provided they do not conflict, the features of the embodiments of the present application may be combined with each other, and such combinations fall within the protection scope of the present application. In addition, although functional modules are divided in the device diagrams and a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed with a different module division or in a different order. Moreover, the words "first," "second," "third," and the like used herein do not limit the data or the order of execution, but merely distinguish identical or similar items that have substantially the same function and effect.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. The term "and/or" as used in this specification includes any and all combinations of one or more of the associated listed items.
In addition, the technical features of the embodiments of the present application described below may be combined with each other as long as they do not collide with each other.
Fig. 1 is a schematic diagram of an application scenario of an interaction method according to an embodiment of the present application. The application scene comprises a cleaning robot 10, a server 20 and a terminal 30, wherein the cleaning robot 10 is in communication connection with the server 20, and the server 20 is in communication connection with the terminal 30.
The server 20 may be a physical server, a cloud server, or the like. The terminal 30 may be a mobile terminal such as a smart phone, tablet computer, etc.
In some embodiments, the terminal 30 downloads an application program (app) related to the cleaning robot 10, whose back-end module is provided in the server 20. The user thus controls and interacts with the cleaning robot 10 through the application on the terminal 30, for example to determine a cleaning sequence or set a cleaning mode.
It will be appreciated that the user clicks on an icon of an application on the touch screen of the terminal 30, triggers the opening of the application, and displays the main interface of the application. The main interface may include a top navigation bar, a content area, or a bottom navigation bar. These areas include one or more functionality controls. When the user clicks on these functionality controls, the corresponding functionality may be triggered to be opened. Illustratively, the content area of the main interface includes functionality controls such as "cleaning mode", "cleaning sequence", "timed cleaning", "map reconstruction" or "cleaning report".
A functionality control is a trigger entry for certain functions of the application. For example, after the "cleaning mode" functionality control is touched, an interface for setting a cleaning mode is entered for the user to input setting instructions; for another example, after the "cleaning sequence" functionality control is touched, an interface for selecting cleaning areas and determining a cleaning sequence is entered for the user to select areas and determine their order.
The terminal 30 transmits information instructions input by the user to the server 20, and the server 20 issues the information instructions to the cleaning robot 10, so that the cleaning robot 10 can perform corresponding tasks according to the information instructions input by the user. In addition, the cleaning robot 10 may transmit the sharing information to the server 20, and the server 20 may issue the sharing information to the terminal 30. Thus, the user can know the shared information fed back by the cleaning robot 10 at the terminal 30.
In some embodiments, the cleaning robot 10 has a laser radar (lidar) and a camera mounted on it. The lidar is provided on the body of the cleaning robot 10, for example on the moving chassis of the body. In some embodiments, the lidar is a pulsed lidar, a continuous-wave lidar, or the like, and the moving chassis is a robotic chassis such as a universal chassis or a vaulted mobile chassis. The camera is mounted on the body, for example on top of the body of the cleaning robot 10.
The lidar scans the environment of the cleaning robot 10 to obtain a laser point cloud, and the camera photographs the environment to acquire images. Both are communicatively connected to the controller and send the point cloud and images to it; the controller invokes a map-building program preset in the memory of the cleaning robot 10 and constructs a map based on the point cloud and/or the images. The map-building program may correspond to a SLAM (Simultaneous Localization and Mapping) algorithm, which is not described in detail here. In some embodiments, the map is a grid map, saved in the memory of the cleaning robot 10. When the robot moves to work, the controller uses the map as the basis for autonomous positioning, path planning, and obstacle avoidance.
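As a hedged illustration only (the patent does not disclose an implementation), rasterizing lidar returns into a grid map could look like the following minimal sketch. All function and variable names, and the 5 cm cell resolution, are assumptions.

```python
import math

def update_grid(grid, robot_xy, scan, resolution=0.05):
    """Mark lidar returns as occupied cells in a sparse 2-D occupancy grid.

    grid:       dict mapping (col, row) cell index -> 1 (occupied)
    robot_xy:   (x, y) robot position in metres
    scan:       list of (angle_rad, range_m) lidar returns
    resolution: grid cell size in metres (assumed 5 cm here)
    """
    rx, ry = robot_xy
    for angle, dist in scan:
        # Convert each polar return to world coordinates, then to a cell index.
        hx = rx + dist * math.cos(angle)
        hy = ry + dist * math.sin(angle)
        grid[(round(hx / resolution), round(hy / resolution))] = 1
    return grid

# A 1 m return straight ahead and a 2 m return 90 degrees to the left.
grid = update_grid({}, (0.0, 0.0), [(0.0, 1.0), (math.pi / 2, 2.0)])
```

A real SLAM pipeline would also clear the free cells along each beam and fuse many scans with pose estimates; this sketch only shows the point-cloud-to-grid step.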
It is understood that the SLAM algorithm provides both positioning and navigation. During positioning, the lidar is controlled to rotate at high speed and emit laser pulses, measuring the distance between the cleaning robot and obstacles; combined with the map, the relative position of the robot and the obstacles is determined, realizing positioning. In some embodiments, the cleaning robot 10 may also be positioned visually based on the camera. During navigation, a cleaning path is planned based on the positioning result and the cleaning task, and a full-coverage cleaning path is planned to complete the corresponding task.
The cleaning robot 10 may be configured in any suitable shape to perform a cleaning operation. The cleaning robot 10 includes, but is not limited to, a sweeping robot, a dust collection robot, a mopping robot, a washing robot, or the like.
In some embodiments, the cleaning robot 10 includes a body, a driving wheel assembly, a camera, sensors, and a controller. The outer shape of the body may be roughly elliptical, triangular, D-shaped, or another shape. The controller is arranged in the body, which is the main structure of the cleaning robot 10; its shape and materials (such as hard plastic, or metals such as aluminum and iron) can be chosen according to the actual needs of the cleaning robot 10, for example the flat cylinder commonly used for cleaning robots.
The driving wheel assembly is mounted on the body and drives the cleaning robot 10 over the surface to be cleaned. In some embodiments, the driving wheel assembly includes a left driving wheel, a right driving wheel, and an omni wheel, the left and right driving wheels being mounted on opposite sides of the body. The omni wheel, mounted at a forward position on the bottom of the body, is a movable caster that can rotate 360 degrees horizontally, so that the cleaning robot 10 can turn flexibly. The left driving wheel, right driving wheel, and omni wheel are arranged in a triangle to improve the walking stability of the cleaning robot 10.
In some embodiments, sensors are used to collect some motion parameters of the cleaning robot 10 and various types of data of the environmental space, including various types of suitable sensors, such as gyroscopes, infrared sensors, odometers, magnetic field meters, accelerometers or speedometers, and so forth.
In some embodiments, the controller is disposed inside the machine body, and is an electronic computing core of the cleaning robot 10, and is configured to perform logic operation steps to implement intelligent control of the cleaning robot 10. The controller is electrically connected with the left driving wheel, the right driving wheel and the omnidirectional wheel respectively. The controller serves as a control core of the cleaning robot 10 for controlling the cleaning robot 10 to travel, to retreat, and to some business logic process. For example: the controller is used for receiving the voice command sent by the microphone and controlling the cleaning robot 10 to complete the corresponding task based on the voice command.
It is to be appreciated that the controller may be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a single-chip, ARM (Acorn RISC Machine) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof. The controller may also be any conventional processor, controller, microcontroller, or state machine. A controller may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP and/or any other such configuration, or one or more of a micro-control unit (Microcontroller Unit, MCU), a Field-programmable gate array (Field-Programmable Gate Array, FPGA), a System on Chip (SoC).
It will be appreciated that the memory of the cleaning robot 10 in embodiments of the present application includes, but is not limited to: FLASH memory, NAND FLASH memory, vertical NAND FLASH memory (VNAND), NOR FLASH memory, resistive random-access memory (RRAM), magnetoresistive random-access memory (MRAM), ferroelectric random-access memory (FRAM), spin-transfer torque random-access memory (STT-RAM), and the like.
It should be noted that, according to the task to be completed, besides the above functional modules, one or more other different functional modules (such as a water storage tank, a cleaning device, etc.) may be mounted on the main body of the cleaning robot 10, and cooperate with each other to perform the corresponding task.
According to some interaction methods for selecting cleaning areas and determining a cleaning sequence known to the inventors, the cleaning mode of each room is set one by one in a custom cleaning mode interface; after confirmation, the user returns to the main interface and clicks room icons one by one to select the rooms to be cleaned. In some arrangements, cleaning sequences are recommended for the selected rooms for the user to choose from. However, this method is cumbersome: it requires the user to select rooms one by one, and it is inefficient, not intelligent, and not concise enough.
In view of the above problems, an embodiment of the present application provides an interaction method: the electronic device displays a first interface that includes a cleaning map shown in a first display mode, the cleaning map containing at least two cleaning areas. While the first interface is displayed, the user may slide a finger across the touch screen of the electronic device to form a swipe gesture; in response, the electronic device displays a second interface that includes the cleaning map shown in a second display mode, which presents the cleaning sequence of the cleaning areas selected in turn by the swipe gesture. The user can thus freely select the cleaning areas to be cleaned by sliding a finger on the touch screen, the touch order of the gesture defines the cleaning sequence, and the second display mode reflects that sequence for the user to view. Compared with selecting cleaning areas one by one, selecting the areas and determining their order with a single swipe gesture according to actual needs makes the interaction intelligent, concise, quick, and efficient.
It can be appreciated from the foregoing that the interaction method provided by the embodiments of the present application may be implemented by an electronic device communicatively connected to the cleaning robot, for example, by a terminal such as a smart phone, a tablet computer, or other devices having computing processing capabilities. The electronic device is provided with a touch screen, which is an interface for displaying and interacting with a user.
The following describes an interaction method provided by the embodiment of the present application in connection with exemplary applications and implementations of an electronic device provided by the embodiment of the present application. Referring to fig. 2, fig. 2 is a flow chart of an interaction method according to an embodiment of the application. It will be appreciated that the execution subject of the interaction method may be one or more processors of the electronic device.
As shown in fig. 2, the method S100 may specifically include the following steps:
s10: the electronic device displays a first interface including a cleaning map displayed in a first display mode, the cleaning map including at least two cleaning areas.
The first interface is a main interface or a sub-interface of an application associated with the cleaning robot. Illustratively, the first interface is a sub-interface of the application, and the main interface of the application includes a top navigation bar, a content area, or a bottom navigation bar. The content area includes functionality controls such as "cleaning mode", "cleaning sequence", "timed cleaning", "map reconstruction", or "cleaning report". In one implementation, the first interface is entered after the user clicks the "cleaning sequence" functionality control. In one implementation, the first interface may also be reached from other sub-interfaces.
As shown in fig. 3, the first interface includes a cleaning map displayed in a first display manner, the cleaning map including at least two cleaning areas. The cleaning map is a map of an environment space where the cleaning robot is located. In some embodiments, the cleaning map may be a grid map. Those skilled in the art will appreciate that the cleaning robot may build a cleaning map based on point cloud data from the lidar scan and SLAM algorithms.
The cleaning areas are the respective areas in the cleaning map. It is understood that the cleaning map may be divided into a plurality of cleaning areas based on the division formed by obstacles. For example, when the cleaning map reflects the floor plan of a home, the home space is divided by walls, so one room unit may be taken as one cleaning area; that is, the cleaning map may be divided by room units, yielding six cleaning areas: room 1, room 2, room 3, room 4, room 5, and room 6. As another example, when the cleaning map reflects a mall floor, which is divided by elevators, exhibits, and the like, each large connected area of the floor can be regarded as a cleaning area.
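As a hedged sketch only (the patent does not specify a partitioning algorithm), dividing a grid map into cleaning areas separated by obstacles can be illustrated with a flood fill over free cells. The character-grid representation and all names here are assumptions.

```python
def label_cleaning_areas(grid):
    """Partition the free cells ('.') of a small character grid into cleaning
    areas by flood fill; '#' cells are walls. Returns ({cell: area_id}, count)."""
    rows, cols = len(grid), len(grid[0])
    labels, next_id = {}, 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == '#' or (r, c) in labels:
                continue
            next_id += 1
            stack = [(r, c)]          # flood-fill one connected free region
            while stack:
                y, x = stack.pop()
                if not (0 <= y < rows and 0 <= x < cols):
                    continue
                if grid[y][x] == '#' or (y, x) in labels:
                    continue
                labels[(y, x)] = next_id
                stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, next_id

# Two rooms separated by a wall column become two cleaning areas.
labels, n = label_cleaning_areas(["..#..",
                                  "..#.."])
```

Real room segmentation is harder (doorways connect rooms), so production systems typically combine such connectivity analysis with door detection or user editing of the map.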
It will be appreciated that before a cleaning area is selected, the cleaning map is displayed in the first display mode. The first display mode is not limited in any way; it only needs to be distinguishable from the display used after a cleaning area is selected. For example, the first display mode may be gray and unfilled, so that each cleaning area in the cleaning map is displayed gray and unfilled before the user selects it.
In some embodiments, a user may select a cleaning area to be cleaned through the touch screen of the electronic device. For example, by clicking, pressing, or long-pressing cleaning area A, cleaning area A is selected as a target cleaning area that requires cleaning.
S20: the electronic device displays a second interface in response to the swipe gesture, the second interface including a cleaning map displayed in a second display manner, the second display manner including displaying a cleaning sequence of cleaning areas sequentially selected in response to the swipe gesture.
The sliding gesture is a gesture formed by a user sliding a finger on a touch screen of the electronic device, and the sliding gesture corresponds to some preset instructions. The touch screen of the electronic device can sense the sliding gesture, so that the electronic device responds to the sliding gesture to execute an instruction corresponding to the sliding gesture, and a second interface is displayed. That is, after the user slides a finger over the touch screen of the electronic device, the touch screen of the electronic device jumps from displaying the first interface to displaying the second interface.
The second interface includes the cleaning map displayed in the second display mode, which includes displaying the cleaning sequence of the cleaning areas sequentially selected in response to the swipe gesture. That is, the second display mode reflects the cleaning order for the user to view: the user can learn the cleaning sequence from the second interface, and the processor of the electronic device can obtain it. It is understood that the cleaning sequence refers to the order in which the user-selected cleaning areas will be cleaned. Here, that order is the order in which the swipe gesture sequentially touches the selected cleaning areas: whether a cleaning area is selected is determined by whether the swipe gesture touches it, and the cleaning sequence is determined by the order in which the gesture touches the areas.
The second display mode is not limited in any way; unlike the first display mode, it only needs to be able to reflect the cleaning sequence.
As shown in fig. 4 (a), the cleaning areas in the cleaning map are initially displayed as unfilled, and the swipe gesture acts on the cleaning areas D, B, E, and C in turn. After the touch screen of the electronic device senses the swipe gesture, it sends the areas touched by the gesture to the processor; the processor obtains the cleaning areas selected by the swipe gesture and the order in which they were selected, and controls the touch screen to display the second interface shown in fig. 4 (b). In the second interface, each cleaning area touched by the swipe gesture is displayed with a shadow fill, and an arrowed line is displayed between the selected cleaning areas. The line follows a path corresponding to the swipe gesture, passing through the cleaning areas D, B, E, and C in turn, and thereby characterizes the cleaning sequence. That is, the connecting line and the selected cleaning areas are displayed such that the second display mode reflects the cleaning order for the user to view; the user can thus learn the cleaning sequence from the second interface.
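A minimal sketch of the behavior described above, assuming rectangular cleaning areas and a sampled touch path (none of these names or shapes come from the patent): a region joins the sequence the first time the swipe touches it, so the touch order is the cleaning order.

```python
def cleaning_sequence(touch_path, regions):
    """Map a swipe gesture's touch points to an ordered cleaning sequence.

    touch_path: list of (x, y) points sampled along the swipe
    regions:    dict name -> (x0, y0, x1, y1) bounding rectangle
    A region enters the sequence the first time the path touches it,
    so the touch order defines the cleaning order.
    """
    sequence = []
    for x, y in touch_path:
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1 and name not in sequence:
                sequence.append(name)
    return sequence

regions = {"A": (0, 0, 10, 10), "B": (12, 0, 22, 10),
           "C": (24, 0, 34, 10), "D": (36, 0, 46, 10)}
# A swipe passing through D, then B, then C yields that cleaning order.
path = [(40, 5), (15, 5), (30, 5)]
order = cleaning_sequence(path, regions)  # ["D", "B", "C"]
```

Deduplicating on first touch also means a gesture that re-crosses an already-selected area does not change the sequence, matching the "sequentially selected" behavior described.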
In some embodiments, in the first interface, each cleaning zone includes a selection control; in the second interface, the second display includes displaying a cleaning sequence of the target selection controls sequentially selected in response to the swipe gesture, the target selection controls being connected in a line in the selected sequence.
Wherein the selection control means a virtual key for selecting a cleaning area. As shown in fig. 5 (a), a selection control is disposed at a middle position of each cleaning area, and a code corresponding to the cleaning area is identified on the selection control, for example, a code "a" is identified on the selection control of the cleaning area a. The selection control directs a user's swipe gesture to touch the selection control to select a corresponding cleaning region. The respective cleaning areas in the first interface are shown as unfilled.
The sliding gesture sequentially touches the selection control D, B, E, C, after the electronic device touch screen senses the sliding gesture, the target selection control touched by the sliding gesture is sent to the processor, and the processor acquires the target selection control selected by the sliding gesture and the selected sequence. Based on the one-to-one correspondence of the selection controls to the cleaning areas, the processor obtains the cleaning areas selected by the swipe gesture and the selected order thereof, and controls the touch screen to display a second interface as shown in (b) of fig. 5. In the second interface, the target selection control selected by the swipe gesture is displayed highlighted and the corresponding cleaning area is displayed as shadow fill. Each selected target selection control is connected in a line according to the touch sequence, for example, a line with an arrow is displayed between the target selection controls, the line has a trend consistent with the sliding gesture, and the line sequentially passes through the target selection control D, B, E, C to represent the cleaning sequence. That is, the display mode of the connection line, the target selection control, and the corresponding cleaning region is such that the second display mode reflects the cleaning sequence. Thus, the user can also know the cleaning sequence from the second interface.
In this embodiment, a selection control is disposed at the middle position of each cleaning area, and the user's swipe gesture touches the selection controls to select the corresponding cleaning areas, which simplifies the user's operation. Because the selected target selection controls are connected in a line in the order they were touched by the swipe gesture, the cleaning sequence is more intuitive.
In some embodiments, the method S100 further comprises:
S30: the electronic device updates the second interface in response to a first operation for deselection, such that the cleaning area deselected by the first operation is removed from the cleaning sequence.
The first operation may be a click or long press on the corresponding area of the second interface on the touch screen. Illustratively, as shown in fig. 6 (a), the sliding track sequentially passes through selection controls D, B, E, and C, so cleaning areas D, B, E, and C are selected. If the user needs to deselect the already selected cleaning area B, the user clicks the selection control of cleaning area B. After the touch screen senses the first operation, it sends the first operation to the processor, and the processor deselects cleaning area B and removes it from the cleaning sequence. As shown in fig. 6 (b), the second interface is automatically updated so that cleaning area B is removed from the cleaning sequence; the removed cleaning area B is displayed in the first display mode (e.g., no fill) and no longer in the second display mode (e.g., shadow fill).
Illustratively, as shown in fig. 7 (a), the sliding track sequentially passes through any positions within cleaning areas A, B, F, and E; after the touch screen of the electronic device senses the sliding track, it jumps to the second interface shown in fig. 7 (b). Here the selected cleaning area F is the aisle from cleaning area B to cleaning area E. If it does not need cleaning, the user clicks any position within cleaning area F to remove it from the cleaning sequence, and the electronic device updates the second interface. As shown in fig. 7 (c), the second interface is automatically updated so that cleaning area F is removed from the cleaning sequence; the removed cleaning area F is displayed in the first display mode (e.g., no fill) and no longer in the second display mode (e.g., shadow fill).
In this embodiment, the user may remove selected cleaning areas from the cleaning sequence in the second interface to adjust the cleaning sequence without starting over, making the interactive operation more flexible.
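The deselection step (S30) above amounts to an order-preserving removal from the sequence. A minimal sketch, with invented names:

```python
# Hypothetical sketch of step S30: tapping an already selected cleaning area
# removes it from the cleaning sequence while preserving the remaining order.

def deselect(sequence, region):
    """Return the cleaning sequence with `region` removed, order preserved."""
    return [r for r in sequence if r != region]

sequence = ["D", "B", "E", "C"]      # order selected by the swipe gesture
sequence = deselect(sequence, "B")   # user taps region B to cancel it
print(sequence)  # ['D', 'E', 'C']
```

The remaining areas keep their relative order, matching the behavior shown in figs. 6 and 7.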
In some embodiments, the swipe gesture includes a first sliding segment for indicating a selected cleaning area or a second sliding segment for indicating a bypassed cleaning area. In the second interface, the second display mode includes displaying a cleaning sequence of cleaning areas sequentially selected in response to the first sliding segment.
The trajectories of the first sliding segment and the second sliding segment differ: illustratively, the first sliding segment is a straight line segment and the second sliding segment is an arc segment. The first sliding segment indicates a selected cleaning area, i.e., a cleaning area touched by the first sliding segment (e.g., a straight segment) is added to the cleaning sequence. The second sliding segment indicates a bypassed cleaning area, i.e., a cleaning area touched only by the second sliding segment (e.g., an arc segment) is not added to the cleaning sequence. Thus, the cleaning sequence includes only the cleaning areas sequentially selected by the first sliding segments.
As shown in fig. 8 (a) and (b), the swipe gesture includes 2 straight segments and 1 arc segment: straight segment 1# passes through cleaning areas A and B, straight segment 2# passes through cleaning area E, and the arc segment passes through cleaning area F and parts of cleaning areas B and E. Since the cleaning areas crossed by straight segments are A, B, and E, cleaning areas A, B, and E are selected into the cleaning sequence. Cleaning area F is crossed only by the arc segment and by no straight segment, indicating that it is bypassed and not selected into the cleaning sequence. Here, the cleaning sequence is the order in which the straight segments sequentially touch the selected cleaning areas A, B, and E.
In this embodiment, assigning different instructions to different sliding segments within the swipe gesture diversifies gesture control, making the interactive control more intelligent and concise.
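One way to tell a straight segment from an arc segment is to measure how far the segment's sampled points deviate from the chord between its endpoints. This is only a sketch of that idea; the tolerance value is an assumed tuning parameter, not something specified by the embodiment.

```python
# Hypothetical sketch: classify a sliding segment as a straight line segment
# (selects regions) or an arc segment (bypasses regions) by the perpendicular
# deviation of its points from the chord joining its endpoints.

import math

def is_straight(segment, tolerance=10.0):
    """True if every sampled point lies within `tolerance` px of the chord."""
    (x0, y0), (x1, y1) = segment[0], segment[-1]
    chord = math.hypot(x1 - x0, y1 - y0)
    if chord == 0:
        return False  # degenerate segment: start and end coincide
    for (x, y) in segment:
        # perpendicular distance from (x, y) to the chord line
        dist = abs((x1 - x0) * (y0 - y) - (x0 - x) * (y1 - y0)) / chord
        if dist > tolerance:
            return False
    return True

straight = [(0, 0), (50, 1), (100, 2)]   # nearly collinear points
arc = [(0, 0), (50, 40), (100, 0)]       # bulges 40 px off the chord
print(is_straight(straight), is_straight(arc))  # True False
```

Regions crossed by segments classified as straight would then be appended to the cleaning sequence, while regions crossed only by arc segments would be skipped.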
In some embodiments, the method S100 further comprises:
S40: in the case that the swipe gesture is closed end to end, cleaning the cleaning areas in the cleaning sequence in a first cleaning mode.
The first cleaning mode is one of the available cleaning modes, where a cleaning mode is the manner in which a cleaning area is cleaned. In some embodiments, the cleaning mode may include parameters such as the sweeping mode, the sweeping suction, the water absorption amount, and the number of cleaning times. The cleaning mode may be, for example, a sweep-and-mop mode, a sweep-only mode, or the like. Illustratively, the first cleaning mode is a default of cleaning each area 2 times.
The swipe gesture being closed end to end means that its start and end positions are the same, so that the gesture's track forms a closed loop. Referring to fig. 9 (a), the swipe gesture is closed end to end and its track sequentially passes through rooms 1, 3, 5, and 2, so rooms 1, 3, 5, and 2 are cleaned 2 times each in the cleaning sequence. Referring to fig. 9 (b), the swipe gesture is closed end to end and its track sequentially passes through rooms 2, 3, and 1, so rooms 2, 3, and 1 are cleaned 2 times each in the cleaning sequence.
In this embodiment, the closed form of the swipe gesture is associated with the cleaning mode, so that the user can set both the cleaning sequence and the cleaning mode just by sliding a finger, making the interactive control simple, convenient, and more intelligent.
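Detecting that a gesture is "closed end to end" reduces to checking whether the first and last sampled points (nearly) coincide. A minimal sketch under assumed parameters — the distance threshold and the cleaning-mode structure are invented for illustration:

```python
# Hypothetical sketch of step S40: a swipe whose start and end points are
# within a small threshold is treated as closed end to end, which triggers
# cleaning the selected sequence in the first cleaning mode.

import math

def is_closed(track, threshold=20.0):
    """True if the gesture's start and end points are within `threshold` px."""
    (x0, y0), (x1, y1) = track[0], track[-1]
    return math.hypot(x1 - x0, y1 - y0) <= threshold

# A roughly rectangular swipe that returns near its starting point:
track = [(10, 10), (200, 10), (200, 200), (10, 200), (12, 12)]
if is_closed(track):
    # assumed default first cleaning mode: clean each selected area 2 times
    print("start cleaning sequence in first cleaning mode")
```

An open-ended swipe (one that stops far from where it began) would fail this check and simply leave the sequence selected without starting cleaning.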
In some embodiments, prior to step S10, the method further comprises:
s50: the electronic device displays a third interface including a setup window for editing the cleaning mode and a preview control.
The third interface is a main interface or a sub-interface of an application associated with the cleaning robot. Illustratively, the third interface is a sub-interface of the application, and the main interface of the application includes a top navigation bar, a content area, and a bottom navigation bar. The content area includes functional controls such as "cleaning mode", "cleaning sequence", "timed cleaning", "reconstruct map", or "cleaning report". In one implementation, the third interface is entered after the user clicks the "cleaning mode" functional control.
As shown in fig. 10 (a), the third interface includes a setting window for editing the cleaning mode. The setting window contains a room option and an option to set the number of cleaning times, so the user can select a room and set its number of cleaning times. Sliding the setting window upward, as shown in fig. 10 (b), reveals options for the sweeping mode and the sweeping suction, so the user can set these as well.
In some embodiments, the third interface further includes a preview control, a functional control that triggers a preview of the cleaning modes of the respective cleaning areas. In some embodiments, the preview control may also be a "next" control. Illustratively, after the cleaning mode is set, clicking the "next" control displays the cleaning map, with each cleaning area in the map displaying its corresponding cleaning mode.
Step S10 specifically includes: after receiving a second operation for editing the cleaning mode in the setting window, the electronic device displays the first interface in response to a third operation acting on the preview control, each cleaning area including a cleaning mode icon.
It will be appreciated that the second operation refers to clicking the cleaning mode options in the setting window, and the third operation refers to clicking or long pressing the preview control after the cleaning mode has been set. After the user finishes setting the cleaning mode and clicks the preview control, the electronic device displays the first interface shown in fig. 11. Each cleaning area in the first interface includes a cleaning mode icon that reflects its cleaning mode; different cleaning modes correspond to different icons. The user can learn the cleaning mode from the icons and, after checking it, select the cleaning areas to be cleaned and determine the cleaning sequence in the first interface.
In this embodiment, the first interface serves as the next sub-interface of the third interface. After setting the cleaning mode, the user enters the first interface, where the cleaning mode can be previewed and the cleaning sequence determined by sliding a finger, without first returning to the main interface, which simplifies the operation flow.
In some embodiments, the method S100 further comprises:
s60: the electronic device responds to the fourth operation on the target cleaning mode icon, and a cleaning mode setting popup window corresponding to the target cleaning mode icon is displayed;
s70: and the electronic equipment responds to the editing operation of the setting popup window and updates the cleaning mode icon of the cleaning area corresponding to the target cleaning mode icon in the first interface.
The fourth operation may be clicking or long pressing the target cleaning mode icon; the target cleaning mode icon is any cleaning mode icon in the first interface. After the user clicks a target cleaning mode icon a, the electronic device displays the cleaning mode setting pop-up window corresponding to icon a, where the user can re-edit the cleaning mode.
After the user re-edits the cleaning mode, the electronic device updates the target cleaning mode icon a in the first interface; the updated icon a' matches the re-edited cleaning mode. For example, if the user increased the sweeping suction when re-editing the cleaning mode, the suction portion of icon a' changes to indicate the increase.
In this embodiment, the cleaning modes of the respective cleaning areas can be previewed and edited directly in the cleaning map; when the user wants to change a cleaning mode, there is no need to return to the third interface, making the interaction more streamlined.
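The icon update described above can be modeled as deriving each icon from the area's current cleaning-mode parameters, so that re-editing the mode automatically changes the icon. This is purely illustrative: the mode fields and the icon naming scheme are invented, not specified by the embodiment.

```python
# Hypothetical sketch: derive a cleaning area's icon from its cleaning mode,
# so editing the mode in the pop-up window changes the displayed icon.

def mode_icon(mode):
    """Map a cleaning-mode dict to an icon identifier string (invented scheme)."""
    return "icon_{}_suction{}_x{}".format(
        mode.get("type", "sweep"), mode.get("suction", 1), mode.get("times", 1))

area_modes = {"A": {"type": "sweep", "suction": 1, "times": 2}}
print(mode_icon(area_modes["A"]))   # icon_sweep_suction1_x2

area_modes["A"]["suction"] = 3      # user raises suction in the pop-up window
print(mode_icon(area_modes["A"]))   # icon_sweep_suction3_x2
```

Because the icon is a pure function of the mode, no separate bookkeeping is needed to keep icons and modes consistent.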
In some embodiments, the method S100 further comprises:
S80: the electronic device adaptively adjusts the display ratio of the first interface based on its screen size.
It will be appreciated that if the cleaning map is displayed at too small a scale, performing the swipe gesture becomes difficult. Therefore, when the first interface is entered, the electronic device adaptively adjusts the display ratio of the first interface based on its screen size, for example displaying the whole first interface at a ratio greater than 100%. The cleaning map is thus displayed enlarged, making it convenient for the user to slide a finger over it to select cleaning areas and determine the cleaning sequence.
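Step S80 can be sketched as computing a scale factor from the screen and map widths. The minimum comfortable map width and the scale cap below are assumed values for illustration, not figures from the embodiment:

```python
# Hypothetical sketch of step S80: enlarge the first interface so the cleaning
# map stays wide enough for comfortable sliding. Thresholds are assumptions.

def display_scale(screen_width_px, map_width_px, min_map_px=600, max_scale=2.0):
    """Return a display ratio >= 1.0 that makes the map at least min_map_px wide."""
    if map_width_px >= min_map_px:
        return 1.0  # already large enough; display at 100%
    scale = min_map_px / map_width_px
    # never exceed the cap or scale past the screen width
    return min(scale, max_scale, screen_width_px / map_width_px)

print(display_scale(screen_width_px=1080, map_width_px=400))  # 1.5
```

A 400 px map on a 1080 px screen is scaled to 150%, i.e., displayed at a ratio greater than 100% as the embodiment describes.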
In some embodiments, the cleaning map may be displayed in an enlarged manner alone to facilitate user viewing and sliding of the finger and to enhance the interactive experience.
In summary, the embodiments of the application provide an interaction method in which an electronic device displays a first interface comprising a cleaning map displayed in a first display mode, the cleaning map comprising at least two cleaning areas. While the first interface is displayed, the user may slide a finger over the touch screen of the electronic device to form a swipe gesture; the electronic device displays a second interface in response to the swipe gesture, the second interface comprising the cleaning map displayed in a second display mode that shows the cleaning sequence of the cleaning areas sequentially selected by the swipe gesture. The user can thus freely select the cleaning areas to be cleaned by sliding a finger on the touch screen, the cleaning sequence is defined by the touch order of the swipe gesture, and the second display mode reflects that sequence for the user to review. Compared with clicking cleaning areas one by one, freely selecting cleaning areas and determining the cleaning sequence with a swipe gesture makes the interaction intelligent, concise, fast, and efficient.
The embodiment of the application also provides an electronic device, please refer to fig. 12, fig. 12 is a schematic hardware structure of the electronic device according to the embodiment of the application. In some embodiments, the electronic device is a smartphone, tablet, or the like, in communicative connection with the cleaning robot.
As shown in fig. 12, the electronic device 300 comprises at least one processor 301 and a memory 302 in communicative connection (connected by a bus; one processor is taken as an example in fig. 12).
Wherein the processor 301 is configured to provide computing and control capabilities for controlling the electronic device 300 to perform corresponding tasks, e.g. for controlling the electronic device 300 to perform the interaction method of any of the method embodiments described above.
The processor 301 may be a general purpose processor including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), a hardware chip, or any combination thereof; it may also be a digital signal processor (Digital Signal Processing, DSP), application specific integrated circuit (Application Specific Integrated Circuit, ASIC), programmable logic device (programmable logic device, PLD), or a combination thereof. The PLD may be a complex programmable logic device (complex programmable logic device, CPLD), a field-programmable gate array (field-programmable gate array, FPGA), general-purpose array logic (generic array logic, GAL), or any combination thereof.
The memory 302 serves as a non-transitory computer readable storage medium, and may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to the interaction methods in embodiments of the present application. The processor 301 may implement the interaction method in any of the above method embodiments by running non-transitory software programs, instructions and modules stored in the memory 302, and will not be described herein again to avoid repetition.
In particular, the memory 302 may include Volatile Memory (VM), such as random access memory (random access memory, RAM); the memory 302 may also include a non-volatile memory (NVM), such as read-only memory (ROM), flash memory (flash memory), hard disk (HDD) or Solid State Drive (SSD), or other non-transitory solid state storage devices; memory 302 may also include a combination of the types of memory described above.
In an embodiment of the application, the memory 302 may also include memory located remotely from the processor, which may be connected to the processor via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
In the embodiment of the present application, the electronic device 300 may further have a wired or wireless network interface, a keyboard, an input/output interface, and other components for implementing the functions of the device, which are not described herein.
Embodiments of the present application also provide a computer readable storage medium, such as a memory, including program code executable by a processor to perform the interaction method of the above embodiments. For example, the computer readable storage medium may be Read-Only Memory (ROM), random-access Memory (Random Access Memory, RAM), compact disc Read-Only Memory (CDROM), magnetic tape, floppy disk, optical data storage device, etc.
Embodiments of the present application also provide a computer program product comprising one or more program codes stored in a computer-readable storage medium. The program code is read from the computer readable storage medium by a processor of the electronic device, which executes the program code to carry out the method steps of the interaction method provided in the above-described embodiments.
It should be noted that the above-described apparatus embodiments are merely illustrative, and the units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
From the above description of embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus a general purpose hardware platform, or may be implemented by hardware. Those skilled in the art will appreciate that all or part of the processes implementing the methods of the above embodiments may be implemented by a computer program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and where the program may include processes implementing the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), or the like.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solution of the present application, not to limit it. Within the idea of the application, the technical features of the above or of different embodiments may be combined, and the steps may be implemented in any order; many other variations of the different aspects of the application exist that are not provided in detail for the sake of brevity. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical schemes described in the foregoing embodiments can still be modified, or some of their technical features replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.

Claims (10)

1. An interaction method, comprising:
the electronic equipment displays a first interface, wherein the first interface comprises a cleaning map displayed in a first display mode, and the cleaning map comprises at least two cleaning areas;
the electronic device displays a second interface in response to a swipe gesture, the second interface comprising the cleaning map displayed in a second display mode, the second display mode comprising displaying a cleaning sequence of cleaning areas sequentially selected in response to the swipe gesture.
2. The method of claim 1, wherein in the first interface, each of the cleaning areas comprises a selection control;
in the second interface, the second display mode includes displaying a cleaning sequence of target selection controls sequentially selected in response to the sliding gesture, the target selection controls being connected in a line in the selected sequence.
3. The method of claim 1, wherein the swipe gesture comprises a first sliding segment for indicating a selected cleaning area or a second sliding segment for indicating a bypassed cleaning area;
in the second interface, the second display mode includes displaying a cleaning sequence of cleaning areas sequentially selected in response to the first sliding segment.
4. A method according to any one of claims 1-3, characterized in that the method further comprises:
the electronic device updates the second interface in response to a deselected first operation such that the cleaning region deselected by the first operation is removed from the cleaning sequence.
5. A method according to any one of claims 1-3, characterized in that the method further comprises:
under the condition that the swipe gesture is closed end to end, cleaning the cleaning areas in the cleaning sequence in a first cleaning mode.
6. The method of claim 5, wherein prior to the electronic device displaying the first interface, the method further comprises:
the electronic equipment displays a third interface, wherein the third interface comprises a setting window for editing a cleaning mode and a preview control;
the electronic device displays a first interface comprising: after receiving a second operation for editing the cleaning mode in the setting window, the electronic device responds to a third operation acting on the preview control to display the first interface, each cleaning area comprising a cleaning mode icon.
7. The method of claim 6, wherein the method further comprises:
the electronic equipment responds to a fourth operation on a target cleaning mode icon, and a cleaning mode setting popup window corresponding to the target cleaning mode icon is displayed;
and the electronic equipment responds to the editing operation of the setting popup window and updates the cleaning mode icon of the cleaning area corresponding to the target cleaning mode icon in the first interface.
8. A method according to any one of claims 1-3, characterized in that the method further comprises:
the electronic device adaptively adjusts the display proportion of the first interface based on the screen size of the electronic device.
9. An electronic device, comprising:
at least one processor;
a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the interaction method of any of claims 1-8.
10. A computer readable storage medium storing computer executable instructions for causing a computer device to perform the interaction method of any of claims 1-8.
CN202311281810.4A 2023-09-27 2023-09-27 Interaction method and electronic equipment Pending CN117234403A (en)

Publication Number: CN117234403A
Publication Date: 2023-12-15
Family ID: 89092755


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination