CN109965785B - Mobile terminal with display and information processing method - Google Patents

Mobile terminal with display and information processing method

Info

Publication number
CN109965785B
Authority
CN
China
Prior art keywords
mobile terminal
user
mobile robot
restriction
identification information
Prior art date
Legal status
Active
Application number
CN201910161383.3A
Other languages
Chinese (zh)
Other versions
CN109965785A (en)
Inventor
陈六根
周锋
Current Assignee
Shenzhen Silver Star Intelligent Group Co Ltd
Original Assignee
Shenzhen Silver Star Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Silver Star Intelligent Technology Co Ltd
Priority to CN201910161383.3A
Publication of CN109965785A
Application granted
Publication of CN109965785B


Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/28: Floor-scrubbing machines, motor-driven
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011: Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4063: Driving means; Transmission means therefor
    • A47L11/4066: Propulsion of the whole machine
    • A47L11/4072: Arrangement of castors or wheels
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/02: Constructional features of telephone sets
    • H04M1/0202: Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026: Details of the structure or mounting of specific components
    • H04M1/0264: Details of the structure or mounting of specific components for a camera module assembly
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W76/00: Connection management
    • H04W76/10: Connection setup

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a mobile terminal with a display and an information processing method. The information processing method comprises the following steps: establishing a wireless communication connection with a mobile robot; acquiring a restriction feature defined by a user; generating preset identification information from the user's interactive operation with an interactive graphical user interface object on the display; and binding the restriction feature with the preset identification information and transmitting both to the mobile robot, so that when an environmental feature captured by the camera module of the mobile robot contains the restriction feature received from the mobile terminal, the mobile robot executes a preset behavior according to the preset identification information. The user can thus freely define the marking, number, shape, size, and the like of the restriction feature, which meets user-customization requirements and solves the poor reliability of schemes that set a virtual boundary line in map information displayed on the mobile terminal.

Description

Mobile terminal with display and information processing method
Technical Field
The invention relates to the technical field of robots, in particular to a mobile terminal with a display and an information processing method.
Background
With technological progress and the development of robotics, mobile robots that replace manual labor have appeared in many fields. In the prior art, out of consideration for safety or work efficiency, it is sometimes necessary to restrict a mobile robot to working in a specific area, such as a home mobile robot or a security patrol robot. It is therefore necessary to provide an information processing method for restricting a mobile robot to work within a certain area.
In this regard, Chinese patent application No. CN201310436173.3 discloses a cleaning robot that transmits acquired map information to a smartphone; the user adds a virtual boundary line to the map information to restrict the cleaning robot's operation to the area within the virtual boundary line.
However, in the course of implementing the present invention, the inventors found the following problem: in a scheme that sets a virtual boundary line in map information, once the map information contains a large error or is simply wrong, the robot cannot identify the virtual boundary line. Reliability is therefore poor, and user-customization requirements cannot be met.
Disclosure of Invention
The invention aims to solve the problems of the prior art, namely that a scheme which sets a virtual boundary line in map information displayed on a mobile terminal has poor reliability and cannot meet user-customization requirements, and accordingly provides a mobile terminal with a display and an information processing method.
In order to solve the technical problem, the invention adopts the following technical scheme: an information processing method based on a mobile terminal with a display comprises the following steps:
establishing wireless communication connection with the mobile robot;
acquiring a restriction characteristic defined by a user;
performing interactive operation according to the interactive operation type graphical user interface object on the display by the user to generate preset identification information;
and binding the restriction feature with the preset identification information and transmitting them to the mobile robot, so that when an environmental feature captured by a camera module of the mobile robot contains the restriction feature transmitted from the mobile terminal, the mobile robot executes a preset behavior according to the preset identification information.
Optionally, the display includes a first interactive graphical user interface object, and first identification information used for representing inaccessibility is generated according to an interactive operation performed by a user with the first interactive graphical user interface object; and binding the restriction feature with the first identification information and transmitting the restriction feature to the mobile robot, so that when the environmental feature shot by the camera module of the mobile robot contains the restriction feature transmitted from the mobile terminal, the mobile robot executes avoidance operation according to the first identification information.
Optionally, the display includes a second interactive operation type graphical user interface object, and second identification information for representing the fall warning is generated according to the interactive operation between the user and the second interactive operation type graphical user interface object; and binding and transmitting the restriction feature and the second identification information to the mobile robot, so that when the environmental feature shot by the camera module of the mobile robot contains the restriction feature transmitted from the mobile terminal, the mobile robot performs deceleration or steering operation according to the second identification information.
Optionally, the display includes a third interactive operation type graphical user interface object, and third identification information representing a recharging dock proximity area is generated according to an interactive operation performed between the user and the third interactive operation type graphical user interface object; and the restriction feature and the third identification information are bound and transmitted to the mobile robot, so that when the environmental feature captured by the camera module of the mobile robot contains the restriction feature transmitted from the mobile terminal, the mobile robot avoids the restriction feature according to the third identification information in the working mode and crosses the restriction feature in the recharging mode.
Optionally, the restriction feature includes a user-customized specific pattern photographed by the user with a camera on the mobile terminal.
Optionally, the restriction feature comprises a user-customized specific pattern that the user loads from a local gallery on the mobile terminal and/or downloads from the internet.
An embodiment of the invention provides an information processing method based on a mobile terminal with a display, comprising: establishing a wireless communication connection with the mobile robot; acquiring a restriction feature defined by the user; generating preset identification information from the user's interactive operation with an interactive graphical user interface object on the display; and binding the restriction feature with the preset identification information and transmitting both to the mobile robot, so that when an environmental feature captured by the camera module of the mobile robot contains the restriction feature received from the mobile terminal, the mobile robot executes a preset behavior according to the preset identification information. The user can freely define the marking, number, shape, size, and the like of the restriction feature, which meets user-customization requirements and solves the poor reliability of schemes that set a virtual boundary line in map information displayed on the mobile terminal.
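The four steps just summarized can be sketched as a small orchestration function. This is a hypothetical illustration only: the function names, the stub callables, and the payload shape are assumptions, and a real terminal would use an actual wireless link rather than in-memory stubs.

```python
def process(connect, get_feature, get_identification, transmit):
    """Sketch of steps S110-S140: connect, acquire the user-defined
    restriction feature, generate preset identification information,
    then bind the pair and hand it to the transport."""
    link = connect()                       # S110: wireless connection
    feature = get_feature()                # S120: user-defined restriction feature
    ident = get_identification()           # S130: GUI interaction -> preset id info
    transmit(link, {"feature": feature, "id": ident})  # S140: bind and send
    return ident

# Exercise the sketch with in-memory stubs standing in for the real
# wireless link, camera/gallery, and touch display.
log = []
result = process(
    connect=lambda: "link-0",
    get_feature=lambda: b"pattern",
    get_identification=lambda: "no_entry",
    transmit=lambda link, payload: log.append((link, payload)),
)
```

The stubs make the data flow explicit: whatever the feature and identification sources are, step S140 always ships them together over the connection from step S110.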
In order to solve the technical problem, the invention also adopts the following technical scheme: a mobile terminal having a display, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform any one of the information processing methods described above.
Optionally, the mobile terminal is equipped with a camera, and the restriction feature includes a user-customized specific pattern photographed by the user with the camera on the mobile terminal.
Optionally, the restriction feature comprises a user-customized specific pattern that the user loads from a local gallery on the mobile terminal and/or downloads from the internet.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic bottom structure diagram of a mobile robot according to an embodiment of the present invention;
FIG. 2 is a front view of the mobile robot of FIG. 1;
FIG. 3 is a simplified schematic diagram of a mobile robot located in an application scenario;
FIG. 4 is a flowchart illustrating an information processing method according to an embodiment of the present invention;
Fig. 5 is a functional block diagram of a mobile robot establishing a wireless communication connection with a mobile terminal;
FIG. 6 is a simplified schematic diagram of a mobile robot and mobile terminal in a first application scenario;
FIG. 7 is a simplified schematic diagram of a mobile robot and mobile terminal in a second application scenario;
FIG. 8 is a simplified schematic diagram of a mobile robot and mobile terminal in a third application scenario;
fig. 9 is a hardware circuit diagram of a mobile terminal with a touch-sensitive display for executing an information processing method according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The mobile robot may be any one of a home service robot, a dish delivery robot, a reception service robot, a cleaning robot, and the like, and is not limited herein. The following description will be given taking a mobile robot as an example of a cleaning robot.
Fig. 1 is a schematic bottom structure diagram of a mobile robot 100, and fig. 2 is a front view of the mobile robot 100 in fig. 1. The mobile robot 100 includes: machine body 10, drive wheel mechanism, cleaning mechanism 30, camera module 40, wireless communication module 50, and control module 60.
A driving wheel mechanism is connected to the machine body 10 and can be controlled by the control module 60 so as to drive the mobile robot 100 to move on the ground. In the embodiment of the present invention, the driving wheel mechanism is a roller-type structure; in other embodiments, it may also be a crawler-type structure. The driving wheel mechanism includes a left driving wheel mechanism 210 and a right driving wheel mechanism 220, which are respectively disposed on the left and right sides of the bottom of the machine body 10. The left driving wheel mechanism 210 and the right driving wheel mechanism 220 each contain a motor for driving the respective roller.
In the embodiment of the present invention, the mobile robot 100 further includes a universal wheel mechanism 70, and the universal wheel mechanism 70 is provided on the front side of the bottom of the robot body 10; in other embodiments, for example, the universal wheel mechanism 70 may be provided on the rear side of the bottom of the machine body 10; for another example, when the mobile robot 100 includes two universal wheel mechanisms 70, the two universal wheel mechanisms 70 may be respectively provided on the front side and the rear side of the bottom of the machine body 10.
The cleaning mechanism 30 can clean garbage off the ground. In the embodiment of the present invention, the cleaning mechanism 30 includes a roller brush driven by a motor; when rotating, the roller brush sweeps garbage on the ground into the garbage box 80 of the mobile robot 100. To improve cleaning efficiency, the mobile robot 100 may further include a fan assembly for sucking fine garbage particles on the floor into the garbage box 80. In other embodiments, the cleaning mechanism 30 may include a mopping assembly for dry-mopping or wet-mopping the floor.
The mobile robot 100 may further include a cliff detection sensor 90 provided at the front of the mobile robot 100 for detecting drops in the ground; for example, the robot turns or retreats after detecting a staircase, thereby preventing a fall.
As shown in fig. 3, a camera module 40 may be provided at a front portion of the machine body 10, and the camera module 40 may be capable of photographing environmental characteristics around the mobile robot 100 when the mobile robot 100 moves on the ground, for example, photographing objects on the ground and/or the ground. In other embodiments, the camera module 40 may be provided on the top of the machine body 10, for example, to capture a door frame and/or lintel.
The control module 60 may include a Micro Controller Unit (MCU), and may also include any one or more of a CPU, a PLC, a DSP, an SoC, an FPGA, and the like.
As shown in fig. 4, an information processing method provided in an embodiment of the present invention is suitable for a mobile terminal with a touch-sensitive display. The information processing method comprises the following steps: step S110, step S120, step S130, and step S140, wherein:
step S110 includes: and the mobile terminal establishes wireless communication connection with the mobile robot. Referring to fig. 5, the mobile terminal can establish a wireless communication connection with the wireless communication module 50 of the mobile robot 100, and the wireless communication module 50 may be any one or more of a wifi communication module, a ZigBee communication module, a Bluetooth communication module, and the like. The mobile terminal can be a smart phone, a tablet computer, an intelligent remote controller and the like.
Step S120 includes: the mobile terminal obtains the restriction feature customized by the user. In the embodiment of the present invention, user-customized restriction features include, but are not limited to, the following three types. First, a specific pattern photographed by the user with a camera on the mobile terminal; for example, the user produces a restriction feature bearing a specific pattern by drawing, carving, spraying, printing, or similar means, and the carrier of the restriction feature can be paper, cloth, the floor, a wall surface, and so on. Second, a restriction feature loaded by the user from a local gallery on the mobile terminal; for example, the user stores restriction features with specific patterns in the local gallery in advance. Third, a restriction feature downloaded from the internet with the mobile terminal; for example, the user downloads a picture containing a specific pattern from the internet, prints it, and attaches it to the floor, a wall surface, a door frame, a lintel, or any other place where it needs to take effect. It should be noted that user customization as mentioned herein includes, but is not limited to, the user making a restriction feature with a specific pattern, the user selecting a restriction feature with a specific pattern, and so on, so that the user can freely define the marking, number, shape, size, and the like of the restriction feature.
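The patent leaves the internal representation of a restriction feature unspecified; any descriptor that lets the robot compare the stored pattern against camera frames would do. As a purely illustrative stand-in, a normalized grayscale histogram:

```python
def histogram_descriptor(pixels, bins=8):
    """Toy descriptor for a restriction pattern: an n-bin grayscale
    histogram, normalized so images of different sizes are comparable.
    `pixels` is a flat sequence of 0-255 intensity values."""
    hist = [0] * bins
    step = 256 // bins
    for p in pixels:
        hist[min(p // step, bins - 1)] += 1
    total = len(pixels) or 1
    return [h / total for h in hist]

# A 4-pixel "image": two dark pixels, two bright ones.
desc = histogram_descriptor([0, 10, 200, 255])
```

Whatever descriptor is actually used, the same computation would be applied on the terminal (to the user's pattern) and on the robot (to each camera frame), so the two sides compare like with like.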
The step S130 includes: and the mobile terminal carries out interactive operation according to the interactive operation type graphical user interface object on the display and the user to generate preset identification information.
The step S140 includes: binding the restriction feature with the preset identification information and transmitting them to the mobile robot, so that when the environmental feature captured by the camera module of the mobile robot contains the restriction feature transmitted from the mobile terminal, the mobile robot executes a preset behavior according to the preset identification information.
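Step S140's bind-and-transmit can be pictured as packaging the feature and the identification information into one message for the wireless link. The wire format below (JSON with hex-encoded image bytes) and all names are assumptions for illustration; the patent does not prescribe any particular encoding.

```python
import json

def bind_and_send(send_fn, feature_image: bytes, identification: str) -> dict:
    """Bind a restriction feature to preset identification information
    and pass the bundle to a transport callable (step S140)."""
    message = {
        "type": "restriction_binding",
        "identification": identification,   # e.g. "no_entry"
        "feature": feature_image.hex(),     # pattern image bytes, hex-encoded
    }
    send_fn(json.dumps(message))
    return message

# Simulated transport: collect what would go over the wireless link.
sent = []
msg = bind_and_send(sent.append, b"\x89PNG", "no_entry")
```

Because the feature and the identification travel in one message, the robot can never receive a pattern without also knowing which preset behavior it stands for.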
In the embodiment of the present invention, the control module 60 is configured to determine whether the environmental feature photographed by the camera module 40 includes the restriction feature transferred from the mobile terminal, and control the mobile robot 100 to perform the preset behavior according to the preset identification information if the environmental feature photographed by the camera module 40 includes the restriction feature transferred from the mobile terminal. Specifically, the control module 60 may be configured to determine similarity between the environmental feature captured by the camera module 40 and the constraint feature acquired from the mobile terminal, and control the mobile robot 100 to perform the preset behavior according to the preset identification information when the similarity exceeds a preset similarity threshold.
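The similarity comparison in the control module 60 could be any image-matching measure; as a minimal, hedged sketch, cosine similarity between two precomputed feature vectors against the preset threshold:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def matches_restriction(env_feature, restriction_feature, threshold=0.9):
    """The decision described in the text: trigger the preset behavior
    only when similarity exceeds the preset similarity threshold."""
    return cosine_similarity(env_feature, restriction_feature) > threshold
```

The threshold value 0.9 is an arbitrary placeholder; the patent only says the threshold is preset.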
For the mobile robot 100, different restriction features may represent different functions, and the user often wants different restriction features to trigger different behaviors. To distinguish restriction features with different functions, preset identification information representing each function needs to be bound to the corresponding restriction feature in advance, so that the control module 60 can control the mobile robot to execute different behaviors according to the preset identification information.
The manner of binding the preset identification information with the restriction feature includes, but is not limited to: first, binding the preset identification information with the restriction feature selected by the user at the moment the preset identification information is generated; second, binding the already-generated preset identification information with the restriction feature selected by the user when the pair is about to be transmitted to the mobile robot 100.
For convenience of reference, the three different restriction features are named the first restriction feature, the second restriction feature, and the third restriction feature. The three restriction features are bound one-to-one with three different kinds of identification information, named the first identification information, the second identification information, and the third identification information. Three different interactive operation type graphical user interface objects correspond one-to-one with the three kinds of identification information and are named the first, second, and third interactive graphical user interface objects.
As shown in fig. 6, the first identification information is used to indicate inaccessibility and can function as a "virtual wall". In a usage scenario, if the user does not want the mobile robot 100 to enter a certain room, a first restriction feature may be set at the doorway; for example, it may be attached to the floor at the entrance, to the lintel of the door, or to both sides of the door frame. If the user does not want the mobile robot 100 to enter a certain area, the area may be enclosed with first restriction features. The user photographs the first restriction feature with a camera on the mobile terminal, or loads it from a local gallery, or downloads it from the internet; the user then performs an interactive operation with the first interactive operation type graphical user interface object 5A on the touch-sensitive display to generate the first identification information, and the first restriction feature and the first identification information are bound and transmitted to the mobile robot 100.
While the mobile robot 100 is moving, when the environmental features captured by the camera module 40 contain the first restriction feature transmitted from the mobile terminal, the control module 60 controls the mobile robot 100 to perform an avoidance operation based on the bound first identification information, so that the mobile robot 100 does not cross the first restriction feature into the room or area.
As shown in fig. 7, the second identification information is used to represent a fall warning and can perform a fall-warning function. In a use scenario, if the user does not want the mobile robot 100 to approach too closely, or to travel quickly toward, a drop in the floor such as a staircase, a second restriction feature may be provided at the top of the stairs. The user photographs the second restriction feature with the camera on the mobile terminal, or loads it from the local gallery, or downloads it from the internet; the user then performs an interactive operation with the second interactive operation type graphical user interface object 5B on the touch-sensitive display to generate the second identification information, and the second restriction feature and the second identification information are bound and transmitted to the mobile robot 100.
If, while the mobile robot 100 is moving, the environmental features captured by the camera module 40 contain the second restriction feature transmitted from the mobile terminal, the control module 60 controls the mobile robot 100 to decelerate or steer according to the bound second identification information, thereby preventing the mobile robot 100 from falling down a staircase because the cliff detection sensor 90 fails or the robot approaches the edge too fast.
As shown in fig. 8, the third identification information is used to represent the recharging dock proximity area and can perform the function of a proximity guard signal (refer to the function of the proximity guard signal 131 in Chinese patent application publication No. CN 108390441A). In a use scenario, if the user does not want the mobile robot 100 to bump into the recharging dock in a sweeping or mopping mode, which could shift the dock, but does want the mobile robot 100 to dock with it for charging in the recharging mode, a third restriction feature may be provided on the floor around the recharging dock. The user photographs the third restriction feature with the camera on the mobile terminal, or loads it from the local gallery, or downloads it from the internet; the user then performs an interactive operation with the third interactive operation type graphical user interface object 5C on the touch-sensitive display to generate the third identification information, and the third restriction feature and the third identification information are bound and transmitted to the mobile robot 100.
During movement of the mobile robot 100, if the environmental features captured by the camera module 40 contain the third restriction feature transmitted from the mobile terminal, then in the working mode the control module 60 controls the mobile robot 100 to avoid the third restriction feature according to the third identification information, so as not to touch the recharging dock; in the recharging mode, the control module 60 controls the mobile robot 100 to cross the third restriction feature and dock with the recharging dock for charging.
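Taken together, the three kinds of identification information plus the working/recharging mode distinction amount to a small dispatch table inside the control module. A minimal sketch; the constant names, string labels, and function name are illustrative assumptions rather than terminology from the patent:

```python
NO_ENTRY = "first"        # virtual wall: never cross
FALL_WARNING = "second"   # staircase edge: slow down or steer away
DOCK_GUARD = "third"      # recharging dock proximity area

def behavior_for(identification, mode="working"):
    """Map bound identification information to the behaviors described
    in the embodiments of figs. 6-8."""
    if identification == NO_ENTRY:
        return "avoid"
    if identification == FALL_WARNING:
        return "decelerate_or_steer"
    if identification == DOCK_GUARD:
        # Avoid the dock area while cleaning, cross the marker when
        # returning to dock and charge.
        return "cross" if mode == "recharging" else "avoid"
    return "continue"
```

The third case is the only mode-dependent one, which is exactly why the third identification information exists as a separate kind rather than reusing the "virtual wall" binding.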
An embodiment of the invention provides an information processing method based on a mobile terminal with a display, comprising: establishing a wireless communication connection with the mobile robot; acquiring a restriction feature defined by the user; generating preset identification information from the user's interactive operation with an interactive graphical user interface object on the display; and binding the restriction feature with the preset identification information and transmitting both to the mobile robot, so that when an environmental feature captured by the camera module of the mobile robot contains the restriction feature received from the mobile terminal, the mobile robot executes a preset behavior according to the preset identification information. The user can freely define the marking, number, shape, size, and the like of the restriction feature, which meets user-customization requirements and solves the poor reliability of schemes that set a virtual boundary line in map information displayed on the mobile terminal.
Fig. 9 is a hardware circuit diagram of a mobile terminal 300 with a display for executing the above information processing method according to an embodiment of the present invention. The mobile terminal includes at least one processor and a memory communicatively connected to the at least one processor; the memory stores a program of instructions executable by the at least one processor, and the program, when executed, enables the at least one processor to execute the method shown in fig. 4. Fig. 9 takes one processor 310 and one memory 320 as an example; in this embodiment, the processor 310 and the memory 320 may be connected by a bus or by other means, with a bus connection shown in fig. 9. The processor 310 may take the form of an integrated circuit such as a single-chip microcomputer, FPGA, ASIC, or DSP, or the form of a circuit comprising an integrated circuit plus peripheral circuits.
The memory 320 is a nonvolatile computer-readable storage medium and can be used to store nonvolatile software programs, nonvolatile computer-executable programs, and program instructions corresponding to the information processing method in the embodiment of the present invention. The processor 310 executes the various functional applications and data processing of the mobile terminal 300 by running the nonvolatile software programs and instructions stored in the memory 320, that is, it implements the information processing method of the above method embodiment.
The memory 320 may include a storage program area and a storage data area, wherein the storage program area may store an operating system and an application program required for at least one function, and the storage data area may store data created according to the use of the mobile terminal 300, and the like. Further, the memory 320 may include high-speed random access memory, and may also include nonvolatile memory, such as at least one magnetic disk storage device, flash memory device, or other nonvolatile solid-state storage device. In some embodiments, the memory 320 may optionally include memory located remotely from the processor 310, which may be connected to the mobile terminal 300 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example" or "an alternative embodiment," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above-described embodiments do not limit the scope of the present invention. Any modification, equivalent replacement, and improvement made within the spirit and principle of the above-described embodiments should be included in the protection scope of the technical solution.

Claims (9)

1. An information processing method based on a mobile terminal with a display comprises the following steps:
establishing a wireless communication connection with a mobile robot;
acquiring a restriction characteristic defined by a user;
generating preset identification information according to an interactive operation performed by the user with an interactive graphical user interface object on the display;
binding the restriction feature with the preset identification information and transmitting them to the mobile robot, so that when the environmental features captured by a camera module of the mobile robot contain the restriction feature transmitted from the mobile terminal, the mobile robot executes a preset behavior according to the preset identification information;
the constraint feature includes: the user utilizes at least one of the specific identification which is shot by the camera on the mobile terminal and is defined by the user, the restriction characteristic which is loaded from the body gallery by the user utilizing the terminal equipment and the restriction characteristic which is downloaded from the internet by the user utilizing the mobile terminal.
2. The information processing method according to claim 1, wherein a first interactive graphical user interface object is included on the display, and first identification information for characterizing inaccessibility is generated according to a user's interaction with the first interactive graphical user interface object; and binding the restriction feature with the first identification information and transmitting the restriction feature to the mobile robot, so that when the environmental feature shot by the camera module of the mobile robot contains the restriction feature transmitted from the mobile terminal, the mobile robot executes avoidance operation according to the first identification information.
3. The information processing method according to claim 1, wherein a second interactive graphical user interface object is included on the display, and second identification information for characterizing a fall warning is generated according to an interactive operation performed by a user with the second interactive graphical user interface object; and binding and transmitting the restriction feature and the second identification information to the mobile robot, so that when the environmental feature shot by the camera module of the mobile robot contains the restriction feature transmitted from the mobile terminal, the mobile robot performs deceleration or steering operation according to the second identification information.
4. The information processing method according to claim 1, wherein a third interactive graphical user interface object is included on the display, and third identification information for characterizing a charging-dock approach area is generated according to user interaction with the third interactive graphical user interface object; and binding and transmitting the restriction feature and the third identification information to the mobile robot, so that when the environmental features captured by the camera module of the mobile robot contain the restriction feature transmitted from the mobile terminal, the mobile robot avoids the restriction feature according to the third identification information in an operating mode, and crosses the restriction feature in a recharging mode.
5. The information processing method according to any one of claims 1 to 4, wherein the restriction feature includes a specific pattern that a user takes with a camera on the mobile terminal and is customized by the user.
6. The information processing method according to any one of claims 1 to 4, wherein the restriction feature includes a specific pattern that is loaded from a local gallery by a user using a mobile terminal and/or downloaded from the internet and customized by the user.
7. A mobile terminal having a display, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the information processing method of any one of claims 1 to 4.
8. The mobile terminal of claim 7, wherein the mobile terminal is equipped with a camera, and the restriction feature comprises a user-customized specific pattern captured by the user with the camera on the mobile terminal.
9. The mobile terminal of claim 7, wherein the restriction feature comprises a user-customized specific pattern loaded by the user from a local gallery and/or downloaded from the internet using the mobile terminal.
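The robot-side behavior implied by claims 2 to 4 can be summarized in a small dispatch function. This is a sketch under assumptions: the function name, mode strings, and numeric identification values are illustrative, not taken from the patent.

```python
# Illustrative mapping from recognized identification information to a
# robot behavior, following claims 2-4. All names are assumed.

ID_NO_ENTRY = 1       # claim 2: inaccessible area
ID_FALL_WARNING = 2   # claim 3: fall-warning area
ID_DOCK_APPROACH = 3  # claim 4: charging-dock approach area


def decide_behavior(identification, mode):
    """Choose a behavior for a recognized restriction feature.

    mode is "operating" (normal cleaning) or "recharging"
    (returning to the charging dock).
    """
    if identification == ID_NO_ENTRY:
        return "avoid"                    # claim 2: avoidance operation
    if identification == ID_FALL_WARNING:
        return "decelerate_or_steer"      # claim 3: slow down or turn
    if identification == ID_DOCK_APPROACH:
        # claim 4: avoid the marked area while operating, but cross it
        # when returning to the charging dock.
        return "avoid" if mode == "operating" else "cross"
    raise ValueError(f"unknown identification: {identification}")
```

For example, the same dock-approach marker yields `"avoid"` during cleaning and `"cross"` during recharging, which is the mode-dependent behavior claim 4 describes.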
CN201910161383.3A 2019-03-04 2019-03-04 Mobile terminal with display and information processing method Active CN109965785B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910161383.3A CN109965785B (en) 2019-03-04 2019-03-04 Mobile terminal with display and information processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910161383.3A CN109965785B (en) 2019-03-04 2019-03-04 Mobile terminal with display and information processing method

Publications (2)

Publication Number Publication Date
CN109965785A CN109965785A (en) 2019-07-05
CN109965785B true CN109965785B (en) 2021-05-28

Family

ID=67077819

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910161383.3A Active CN109965785B (en) 2019-03-04 2019-03-04 Mobile terminal with display and information processing method

Country Status (1)

Country Link
CN (1) CN109965785B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111214173A (en) * 2019-11-21 2020-06-02 三峡大学 Crawler-type stair-climbing dust collection device and method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102866706A (en) * 2012-09-13 2013-01-09 深圳市银星智能科技股份有限公司 Cleaning robot adopting smart phone navigation and navigation cleaning method thereof
CN204287962U (en) * 2014-10-31 2015-04-22 深圳市大疆创新科技有限公司 A kind of removable machine and robot

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100485696B1 (en) * 2003-02-07 2005-04-28 삼성광주전자 주식회사 Location mark detecting method for a robot cleaner and a robot cleaner using the same method
CN104460663A (en) * 2013-09-23 2015-03-25 科沃斯机器人科技(苏州)有限公司 Method for controlling cleaning robot through smart phone
CN105796002B (en) * 2016-03-31 2018-09-18 北京小米移动软件有限公司 Clean robot indoor cleaning processing method, clean robot and mobile terminal
CN108247655A (en) * 2016-12-29 2018-07-06 广州映博智能科技有限公司 A kind of monitoring service robot control system
CN106990779A (en) * 2017-03-24 2017-07-28 上海思岚科技有限公司 Make the implementation method of mobile robot progress virtual wall avoidance by computer client
CN106983460B (en) * 2017-04-07 2019-08-27 小狗电器互联网科技(北京)股份有限公司 A kind of sweeping robot region cleaning display control method
CN107803837B (en) * 2017-12-01 2020-02-07 深圳市无限动力发展有限公司 Restricting device, visual floor sweeping robot and control method of visual floor sweeping robot

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102866706A (en) * 2012-09-13 2013-01-09 深圳市银星智能科技股份有限公司 Cleaning robot adopting smart phone navigation and navigation cleaning method thereof
CN204287962U (en) * 2014-10-31 2015-04-22 深圳市大疆创新科技有限公司 A kind of removable machine and robot

Also Published As

Publication number Publication date
CN109965785A (en) 2019-07-05

Similar Documents

Publication Publication Date Title
JP7484015B2 (en) Obstacle detection method and device, self-propelled robot, and storage medium
CN109589050A (en) A kind of method and clean robot controlling cleaning mode
CN106200645B (en) Autonomous robot, control device, and control method
WO2018112495A2 (en) Autonomous mobile robot and method for controlling an autonomous mobile robot
TWI654961B (en) Robotics device, terminal device, system for remotely controlling robotics device, and program
CN111427357A (en) Robot obstacle avoidance method and device and storage medium
KR102206201B1 (en) Cleaning robot and controlling method thereof
CN111265151B (en) Robot control method, device and storage medium
CN109965785B (en) Mobile terminal with display and information processing method
AU2022350408B2 (en) Cleaning control method and device, cleaning robot and storage medium
CN108852174A (en) Autonomous mobile robot and its seek piling method, control device and intelligent cleaning system
CN114451814B (en) Automatic walking device and control method for automatic walking device
CN109875463A (en) Clean robot and its clean method
CN114557633B (en) Cleaning parameter configuration method, device, equipment and medium for automatic cleaning equipment
CN114587189A (en) Cleaning robot, control method and device thereof, electronic equipment and storage medium
CN107390552B (en) Control method, system and the dust catcher of dust catcher
CN111401574A (en) Household appliance, accessory management method and readable medium
WO2023104118A1 (en) Cleaning path determination method and system, and device and storage medium
CN110000776B (en) Mobile robot and control method thereof
CN112716392A (en) Control method of cleaning equipment and cleaning equipment
CN113693498B (en) Child lock control method and device, robot, storage medium and electronic equipment
CN114879691A (en) Control method for self-propelled robot, storage medium, and self-propelled robot
CN110712204B (en) Robot working method and robot
KR20210036160A (en) Robot cleaner and control method thereof
JP6619055B2 (en) Running body device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 518110 1701, building 2, Yinxing Zhijie, No. 1301-72, sightseeing Road, Xinlan community, Guanlan street, Longhua District, Shenzhen, Guangdong Province

Patentee after: Shenzhen Yinxing Intelligent Group Co.,Ltd.

Address before: 518110 building A1, Yinxing hi tech Industrial Park, Guanlan street, Longhua District, Shenzhen City, Guangdong Province

Patentee before: Shenzhen Silver Star Intelligent Technology Co.,Ltd.