CN111390930A - Intelligent protection method and system based on artificial intelligence and clothes climbing robot - Google Patents

Intelligent protection method and system based on artificial intelligence and clothes climbing robot

Info

Publication number
CN111390930A
CN111390930A (application CN202010306032.XA)
Authority
CN
China
Prior art keywords
clothes
climbing robot
user
controlling
climbing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010306032.XA
Other languages
Chinese (zh)
Inventor
李弇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Zhaoxuan Digital Technology Co ltd
Original Assignee
Suzhou Zhaoxuan Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Zhaoxuan Digital Technology Co ltd
Priority to CN202010306032.XA
Publication of CN111390930A
Legal status: Pending

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00: Manipulators mounted on wheels or on carriages
    • B25J5/007: Manipulators mounted on wheels or on carriages mounted on wheels
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1679: Programme controls characterised by the tasks executed
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697: Vision controlled systems
    • B25J11/00: Manipulators not otherwise provided for

Abstract

An intelligent protection method and system based on artificial intelligence and a clothes climbing robot. When a clothes protection instruction is received, the clothes climbing robots are controlled to start and their micro cameras are controlled to capture high-definition images in real time. The wrinkle areas of the user's clothes are analyzed in real time, and a first preset number of clothes climbing robots are controlled to move to the wrinkle areas and move there cyclically. The cuff areas of the user's clothes are analyzed in real time, and a second preset number of clothes climbing robots are controlled to move to the cuff areas and park there in a fixed position. Whether the user's clothes pockets have zippers or buttons is analyzed in real time; if not, a third preset number of clothes climbing robots are controlled to move to the pocket positions and enter a moving pocket-sealing state, and the fingerprint sensors are controlled to start acquiring fingerprint information in real time. Whether a hand is located on the outer surface of a clothes climbing robot and carries the user's fingerprint is analyzed in real time; if so, the clothes climbing robot is controlled to move obliquely upward from the pocket position to remove the pocket seal.

Description

Intelligent protection method and system based on artificial intelligence and clothes climbing robot
Technical Field
The invention relates to the field of clothing protection, in particular to an intelligent protection method and system based on artificial intelligence and a clothing climbing robot.
Background
Artificial Intelligence (AI) is a new technical science that researches and develops theories, methods, techniques and application systems for simulating, extending and expanding human intelligence.
Artificial intelligence is a branch of computer science that attempts to understand the essence of intelligence and to produce new intelligent machines that can react in a manner similar to human intelligence. Research in this field includes robotics, speech recognition, image recognition, natural language processing and expert systems, among others. Since the birth of artificial intelligence, its theories and technologies have matured steadily and its application fields have expanded continuously; it can be assumed that the technological products brought by artificial intelligence in the future will be "containers" of human intelligence. Artificial intelligence can simulate the information processes of human consciousness and thinking. Artificial intelligence is not human intelligence, but it can think like a human and may even exceed human intelligence.
Therefore, how to combine artificial intelligence, miniature robots and clothing protection, so that miniature robots move over the surface of the user's clothes on magnetic clamping wheels according to the user's needs and provide a wrinkle-area tidying function, a cuff-fixing anti-ride-up function, a pocket-sealing anti-theft function, a meal splash-protection function, a lower-hem anti-ride-up function and a heating function, is a problem that urgently needs to be solved at present.
Disclosure of Invention
The purpose of the invention is as follows: in order to overcome the defects in the background art, the embodiment of the invention provides an intelligent protection method and system based on artificial intelligence and a clothes climbing robot, which can effectively solve the problems in the background art.
The technical scheme is as follows:
an intelligent protection method based on artificial intelligence and a clothes climbing robot comprises the following steps:
S1, if a clothes protection instruction sent by a user terminal that maintains a connection is received, controlling the clothes climbing robots stored in a clothes pocket to start and controlling the micro cameras arranged on the exterior of the clothes climbing robots to start capturing high-definition images in real time;
S2, analyzing the wrinkle areas of the user's clothes in real time according to the high-definition images, and controlling a first preset number of clothes climbing robots to move over the clothes surface to the wrinkle areas and move there cyclically according to the wrinkle areas and the high-definition images;
S3, analyzing the cuff areas of the user's clothes in real time according to the high-definition images, and controlling a second preset number of clothes climbing robots to move over the clothes surface to the cuff areas and park there in a fixed position according to the cuff areas and the high-definition images;
S4, analyzing in real time, according to the high-definition images, whether the user's clothes pockets have zippers or buttons;
S5, if not, controlling a third preset number of clothes climbing robots to move over the clothes surface to the positions of the user's clothes pockets and enter a moving pocket-sealing state according to the high-definition images, and controlling the fingerprint sensors arranged on the outer surfaces of the clothes climbing robots to start acquiring fingerprint information in real time;
S6, analyzing in real time, according to the acquired fingerprint information and the high-definition images, whether a hand is located on the outer surface of a clothes climbing robot and carries the user's fingerprint;
and S7, if so, controlling the clothes climbing robot to move obliquely upward from the position of the user's clothes pocket according to the high-definition images to remove the pocket seal.
As a preferred mode of the present invention, after S1, the method further includes the steps of:
s10, analyzing whether a user sits at a dining table or not in real time according to the high-definition images;
s11, if yes, controlling a fourth preset number of clothes climbing robots to move to the chest area of a user on the clothes surface according to the high-definition images and controlling electromagnetic layers arranged on the sides of the clothes climbing robots to start to enter an electromagnetic adsorption state;
s12, controlling the clothes climbing robot to perform magnetic attraction splicing in the chest area of the user according to the high-definition images, and analyzing in real time, according to the high-definition images, whether the user has finished the meal;
and S13, if yes, controlling the electromagnetic layer of the clothes climbing robot to be closed and controlling the clothes climbing robot to move and reset on the surface of clothes according to the high-definition image.
As a preferred mode of the present invention, after S1, the method further includes the steps of:
s14, analyzing the type information of the clothes where the climbing robot is located in real time according to the high-definition image, and analyzing whether the clothes worn by a user are of a preset type according to the type information of the clothes;
and S15, if yes, controlling a fifth preset number of clothes climbing robots to move to the lower hem positions of the clothes on the surfaces of the clothes according to the high-definition images, and controlling the moved clothes climbing robots to be fixed in sequence.
As a preferred mode of the present invention, after S1, the method further includes the steps of:
s16, controlling a temperature sensor arranged on the outer surface of the clothes climbing robot to start to acquire first temperature information in real time and acquiring second temperature information of an urban area where the climbing robot is located in real time;
s17, analyzing whether the temperature value is lower than a first preset temperature or not according to the first temperature information and the second temperature information;
and S18, if so, controlling a heating layer arranged at the position below the climbing robot to start heating to a second preset temperature.
As a preferable mode of the present invention, in S6, the method further includes the steps of:
s60, analyzing whether a human hand is located at a position of a clothes pocket or not in real time according to the high-definition image;
s61, if yes, analyzing whether the human hand touches a fingerprint sensor on the outer surface of the clothes climbing robot or not in real time according to the high-definition image;
and S62, if not, extracting the high-definition image containing the surrounding environment of the clothes pocket and sending warning information and the high-definition image containing the surrounding environment of the clothes pocket to the user terminal keeping the connection relation.
An intelligent protection system based on artificial intelligence and a clothes climbing robot, which uses the intelligent protection method based on artificial intelligence and the clothes climbing robot as claimed in any one of claims 1 to 5, and comprises a protection device, a microprocessor and a master controller;
the protective device comprises a clothes climbing robot, a miniature camera, a fingerprint sensor, an electromagnetic layer, a temperature sensor and a heating layer, wherein the clothes climbing robot is located at an initial position, set by the user, on the outer-layer clothes being worn, and is used for moving on the clothes surface and providing the corresponding service functions; the miniature camera is arranged on the exterior of the clothes climbing robot and is used for shooting images of the environment around the clothes climbing robot; the fingerprint sensor is arranged on the outer surface of the clothes climbing robot and is used for acquiring fingerprint information; the electromagnetic layer is arranged on the side of the clothes climbing robot and is used for docking with other clothes climbing robots by electromagnetic adsorption; the temperature sensor is arranged on the exterior of the clothes climbing robot and is used for acquiring temperature information of the environment where the clothes climbing robot is located; the heating layer is arranged on the underside of the clothes climbing robot, facing the user's body, and is used for providing a heating function;
the microprocessor is arranged at the inner position of the clothes climbing robot and is respectively connected with the clothes climbing robot, the miniature camera, the fingerprint sensor, the electromagnetic layer, the temperature sensor and the heating layer;
the general controller is fixed at a planned position on the clothes worn by the user, and the general controller includes:
the wireless module is used for being respectively in wireless connection with the microprocessor, the user terminal, the alarm center and the network;
the information receiving module is used for receiving information and/or instructions and/or requests;
the robot control module is used for controlling the clothes climbing robot to execute set operation according to set steps through the microprocessor;
the micro shooting module is used for controlling the micro camera to start or close through the microprocessor;
the information analysis module is used for processing and analyzing the information according to the specified information;
and the fingerprint identification module is used for controlling the start or the stop of the fingerprint sensor through the microprocessor.
As a preferable mode of the present invention, the general controller further includes:
and the magnetic attraction control module is used for controlling the electromagnetic layer to be started or closed through the microprocessor.
As a preferable mode of the present invention, the general controller further includes:
the temperature identification module is used for controlling the temperature sensor to be started or closed through the microprocessor;
the temperature acquisition module is used for acquiring temperature information of an urban area where the clothes climbing robot is located;
and the intelligent heating module is used for controlling the heating layer to be heated to a set temperature value through the microprocessor.
As a preferable mode of the present invention, the general controller further includes:
the information extraction module is used for extracting information and/or instructions and/or requests contained in specified information and/or instructions and/or requests;
and the information sending module is used for sending the specified information and/or instruction and/or request to the specified object.
The invention realizes the following beneficial effects:
1. After the intelligent protection system is started, a first set number of clothes climbing robots are controlled to patrol the clothes surface and tidy wrinkled areas; meanwhile, a second set number of clothes climbing robots are controlled to move to the cuff areas and fix themselves there to keep the cuffs from riding up; meanwhile, the sealing state of the user's clothes pockets is identified, a third set number of clothes climbing robots are assigned to pockets without zippers or buttons to seal and hold them, and the fingerprint sensors of the robots at those pockets are then started to identify the user's fingerprints in real time; if the clothes worn by the user are identified as being of the set type, a fifth set number of clothes climbing robots are controlled to move to the lower hem and fix themselves there to keep the lower hem from riding up; and if a human hand is detected in the pocket area without touching the fingerprint sensor of the clothes climbing robot at that pocket, the user is warned.
2. When the user is identified as dining, a fourth set number of clothes climbing robots are controlled to move to the chest area of the user's clothes, their electromagnetic layers are started, and the robots are magnetically spliced together over the chest area to form a splash-protection region.
3. When the current temperature is lower than the set temperature value, the heating layers of the clothes climbing robots are controlled to start heating in real time.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart of a smart safeguard method provided by one example of the present invention;
FIG. 2 is a flow chart of a meal splash control method provided by one example of the present invention;
FIG. 3 is a flow chart of a method for preventing the lower hem of a garment from riding up according to an exemplary embodiment of the present invention;
FIG. 4 is a flow chart of a heating control method provided by one example of the present invention;
FIG. 5 is a flow chart of a pocket anti-theft method according to an exemplary embodiment of the present invention;
FIG. 6 is a connection diagram of an intelligent protection system according to an exemplary embodiment of the present invention;
fig. 7 is a side schematic view of a clothes climbing robot according to an example of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
Example one
Referring to fig. 1, 3, and 5-7.
Specifically, the embodiment provides an intelligent protection method based on artificial intelligence and a clothes climbing robot 10, and the method includes the following steps:
S1, if a clothes protection instruction sent by the user terminal that maintains a connection is received, controlling the clothes climbing robots 10 stored in a clothes pocket to start, and controlling the micro cameras 11 arranged on the exterior of the clothes climbing robots 10 to start capturing high-definition images in real time.
In S1, specifically, after the information receiving module 31 included in the general controller 3 receives the clothes protection instruction sent by the user terminal that maintains the connection, the robot control module 32 included in the general controller 3 controls the clothes climbing robots 10 stored in the clothes pocket and their microprocessors 2 to start, and the micro shooting module 33 included in the general controller 3 controls, through the started microprocessor 2, the micro camera 11 arranged on the exterior of each started clothes climbing robot 10 to start capturing high-definition images in real time, where the high-definition images are images of the environment outside the clothes climbing robot captured by the micro camera.
S2, analyzing the wrinkle areas of the user's clothes in real time according to the high-definition images, and controlling a first preset number of clothes climbing robots 10 to move over the clothes surface to the wrinkle areas and move there cyclically according to the wrinkle areas and the high-definition images.
In S2, specifically, after the micro cameras are started, the robot control module 32 controls, through the microprocessor 2 and according to the high-definition images, a first preset number of clothes climbing robots 10 to patrol the surface of the clothes; the clothes climbing robots 10 move over the clothes surface on magnetic clamping wheels, and the first preset number is set by the user terminal, preferably 3 in this embodiment. While the clothes climbing robots 10 patrol, the information analysis module 34 included in the master controller 3 analyzes the wrinkle areas of the user's clothes in real time according to the high-definition images, and after the information analysis module 34 has identified a wrinkle area, the robot control module 32 controls, through the microprocessor 2 and according to the high-definition images, the first preset number of clothes climbing robots 10 to move over the clothes surface to the wrinkle area and move there cyclically, so that the wrinkle area is tidied and smoothed and the wrinkled area of the user's clothes is reduced.
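As an illustration of how the wrinkle-area analysis in S2 could be realized, the following is a minimal Python/OpenCV sketch that scores each grid cell of a captured image by its edge density and returns the cells that exceed a threshold. The function name, grid size and threshold are assumptions made for illustration only, not the claimed implementation.

```python
import cv2
import numpy as np

def find_wrinkle_cells(image_bgr, grid=(8, 8), edge_density_threshold=0.08):
    """Return (row, col) grid cells whose edge density suggests wrinkled fabric.

    A crude proxy: wrinkled cloth produces many short shading edges, so cells
    with a high fraction of Canny edge pixels are flagged as wrinkle regions.
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)          # suppress sensor noise
    edges = cv2.Canny(gray, 40, 120)                  # binary edge map (0 or 255)

    h, w = edges.shape
    rows, cols = grid
    wrinkle_cells = []
    for r in range(rows):
        for c in range(cols):
            cell = edges[r * h // rows:(r + 1) * h // rows,
                         c * w // cols:(c + 1) * w // cols]
            density = np.count_nonzero(cell) / cell.size
            if density > edge_density_threshold:
                wrinkle_cells.append((r, c))
    return wrinkle_cells
```

The master controller could then translate the flagged cells into garment-surface coordinates and dispatch the first preset number of robots to sweep those cells cyclically.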
S3, analyzing the cuff areas of the user's clothes in real time according to the high-definition images, and controlling a second preset number of clothes climbing robots 10 to move over the clothes surface to the cuff areas and park there in a fixed position according to the cuff areas and the high-definition images.
In S3, specifically, while the clothes climbing robots 10 patrol the surface of the clothes, the information analysis module 34 analyzes the cuff areas of the user's clothes in real time according to the high-definition images, and after the information analysis module 34 has identified the cuff areas, the robot control module 32 controls, through the microprocessor 2 and according to the cuff areas and the high-definition images, a second preset number of clothes climbing robots 10 to move over the clothes surface to the cuff areas and park there in a fixed position, where the second preset number is set by the user terminal, preferably 8 in this embodiment, 4 for each cuff; the parked robots fix and block the cuff position, preventing the clothes in the cuff areas from riding up and keeping the sleeves from sliding up the arms when the user raises a hand.
S4, analyzing in real time, according to the high-definition images, whether the user's clothes pockets have zippers or buttons.
In S4, specifically, while the clothes climbing robots 10 patrol the surface of the clothes, the information analysis module 34 analyzes in real time, according to the high-definition images, whether the user's clothes pockets have a zipper or a button, that is, whether the pockets can already be closed on their own.
S5, if not, controlling a third preset number of clothes climbing robots 10 to move over the clothes surface to the positions of the user's clothes pockets and enter a moving pocket-sealing state according to the high-definition images, and controlling the fingerprint sensors 12 arranged on the outer surfaces of the clothes climbing robots 10 to start acquiring fingerprint information in real time.
In S5, specifically, after the information analysis module 34 determines that a user's clothes pocket has no zipper or button, the robot control module 32 controls, through the microprocessor 2 and according to the high-definition images, a third preset number of clothes climbing robots 10 to move over the clothes surface to the position of that pocket and enter the moving pocket-sealing state, where the third preset number is set by the user terminal, preferably 3 in this embodiment; the robots move from obliquely above the user's clothes pocket to the pocket opening area and park there in a fixed position, so that the user's pocket is sealed. Meanwhile, the fingerprint identification module 35 of the master controller 3 controls, through the microprocessor 2, the fingerprint sensors 12 arranged on the outer surfaces of the clothes climbing robots 10 that have entered the moving pocket-sealing state to start acquiring fingerprint information in real time.
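A minimal sketch of the S4/S5 decision logic is given below, assuming the vision analysis has already produced, for each detected pocket, a flag indicating whether a zipper or button was found. The data structures, field names and robot-assignment strategy are illustrative assumptions rather than the patented method.

```python
from dataclasses import dataclass

@dataclass
class Pocket:
    pocket_id: int
    has_zipper_or_button: bool
    position: tuple              # (x, y) location on the garment surface

def dispatch_pocket_sealers(pockets, idle_robot_ids, robots_per_pocket=3):
    """Assign idle climbing robots to seal every pocket without a fastener.

    Returns a mapping {pocket_id: [robot ids]} so the controller can move each
    assigned robot to the pocket opening, switch it to the pocket-sealing state
    and enable its fingerprint sensor.
    """
    assignments = {}
    free = list(idle_robot_ids)
    for pocket in pockets:
        if pocket.has_zipper_or_button:
            continue                      # pocket can already be closed by itself
        if len(free) < robots_per_pocket:
            break                         # not enough robots left to seal it
        assignments[pocket.pocket_id] = [free.pop() for _ in range(robots_per_pocket)]
    return assignments
```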
S6, analyzing in real time, according to the acquired fingerprint information and the high-definition images, whether a hand is located on the outer surface of a clothes climbing robot 10 and carries the user's fingerprint.
In S6, specifically, after the fingerprint sensor 12 acquires fingerprint information, the information analysis module 34 analyzes in real time, according to the fingerprint information acquired by the fingerprint sensor 12 and the high-definition images, whether a hand is on the outer surface of the clothes climbing robot 10 and whether its fingerprint matches the user's, that is, whether a hand is touching the fingerprint sensor 12 and whether that hand carries the fingerprint information enrolled by the user.
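The check in S6 combines two conditions: the vision system must see a hand on the robot's outer surface, and the fingerprint read by the sensor must match the enrolled user. A hedged sketch of that combination follows; the similarity function and threshold are placeholders, since real fingerprint matching would use the sensor vendor's own matcher.

```python
import numpy as np

MATCH_THRESHOLD = 0.85   # illustrative similarity threshold, not taken from the patent

def fingerprint_similarity(template_a, template_b):
    """Placeholder matcher: cosine similarity between two feature vectors."""
    a = np.asarray(template_a, dtype=float)
    b = np.asarray(template_b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def is_authorized_touch(hand_on_robot_surface, sensed_template, enrolled_templates):
    """True only if a hand is on the robot AND its print matches the enrolled user."""
    if not hand_on_robot_surface:
        return False
    return any(fingerprint_similarity(sensed_template, enrolled) >= MATCH_THRESHOLD
               for enrolled in enrolled_templates)
```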
S7, if so, controlling the clothes climbing robot 10 to move obliquely upward from the position of the user's clothes pocket according to the high-definition images to remove the pocket seal.
In S7, specifically, after the information analysis module 34 determines that the hand on the outer surface of the clothes climbing robot 10 carries the user's fingerprint, the robot control module 32 controls, through the microprocessor 2 and according to the high-definition images, the clothes climbing robots 10 assigned to that pocket area to move obliquely upward from the position of the user's clothes pocket to remove the pocket seal, so that the user can take the articles in the pocket; after the user's hand leaves the pocket, the clothes climbing robots 10 automatically reset to continue sealing the pocket.
As a preferred mode of the present invention, after S1, the method further includes the steps of:
S14, analyzing in real time, according to the high-definition images, the type information of the clothes on which the climbing robots are located, and analyzing, according to the type information, whether the clothes worn by the user are of a preset type.
Specifically, while the micro cameras are started and the clothes climbing robots 10 patrol the surface of the clothes, the information analysis module 34 analyzes in real time, according to the high-definition images, the type information of the clothes on which the climbing robots are located, and after the information analysis module 34 has determined the type information, it analyzes, according to that type information, whether the clothes worn by the user are of the preset type; the preset type is a type of clothes whose lower hem rides up easily, including but not limited to hoodies and sweatshirts.
S15, if so, controlling a fifth preset number of clothes climbing robots 10 to move over the clothes surface to the lower-hem positions of the clothes according to the high-definition images, and controlling the moved clothes climbing robots 10 to fix themselves in sequence.
Specifically, after the information analysis module 34 analyzes that the clothes worn by the user are of the preset type, the robot control module 32 controls, by using the microprocessor 2, a fifth preset number of clothes climbing robots 10 to move to the positions of the lower hem of the clothes on the surface of the clothes according to the high-definition image, where the fifth preset number is set by the user, and is preferably 16 in this embodiment; after the movement of the clothes climbing robot 10 is completed, the robot control module 32 sequentially stops and fixes the clothes climbing robot at a position of a lower hem of the clothes through the microprocessor 2.
When the user needs to take off the clothes, the clothes climbing robots 10 on the user's clothes return in sequence to the initial storage pocket.
As a preferable mode of the present invention, in S6, the method further includes the steps of:
and S60, analyzing whether a human hand is positioned at the position of the clothes pocket or not in real time according to the high-definition image.
Specifically, after the fingerprint sensor 12 acquires fingerprint information, the analysis module analyzes whether a human hand is located at a position of a clothes pocket according to the high-definition image in real time.
And S61, if yes, analyzing whether the human hand touches the fingerprint sensor 12 on the outer surface of the clothes climbing robot 10 or not in real time according to the high-definition image.
Specifically, after the information analysis module 34 analyzes that the human hand is located at the position of the clothes pocket, the information analysis module 34 analyzes whether the human hand touches the fingerprint sensor 12 on the outer surface of the climbing robot at the position of the clothes pocket in real time according to the high-definition image.
And S62, if not, extracting the high-definition image containing the surrounding environment of the clothes pocket and sending warning information and the high-definition image containing the surrounding environment of the clothes pocket to the user terminal keeping the connection relation.
Specifically, after the information analysis module 34 determines that the human hand has not touched the fingerprint sensor 12 on the outer surface of the clothes climbing robot 10, the information extraction module 40 included in the master controller 3 extracts the high-definition image containing the surroundings of the clothes pocket, and after the information extraction module 40 finishes the extraction, the information sending module 41 included in the master controller 3 sends warning information, together with the high-definition image of the pocket surroundings, to the user terminal that maintains the connection, so as to remind the user that a hand that has not passed fingerprint identification is in the pocket area.
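A minimal sketch of this anti-theft branch (S60 to S62): if a hand is seen at the pocket but no authorized touch is registered on the sealing robot's fingerprint sensor, the surrounding-environment image is extracted and a warning is pushed to the connected user terminal. The messaging callback used here is an assumed placeholder for the wireless module's transmission call.

```python
def pocket_theft_guard(hand_at_pocket, sensor_touched_by_user,
                       surrounding_image, send_to_terminal):
    """Send a warning plus the pocket-surroundings image when an unidentified
    hand is detected at the pocket.

    `send_to_terminal(message, image)` is a placeholder for the master
    controller's information sending module; returns True when a warning was sent.
    """
    if hand_at_pocket and not sensor_touched_by_user:
        send_to_terminal("Warning: a hand without fingerprint identification "
                         "is at the clothes pocket.", surrounding_image)
        return True
    return False
```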
Example two
Referring to fig. 2, fig. 6-7.
Specifically, this embodiment is substantially the same as the first embodiment, except that in this embodiment, after S1, the method further includes the following steps:
and S10, analyzing whether the user sits at the dining table or not in real time according to the high-definition images.
Specifically, when the high definition camera is started and the clothes climbing robot 10 patrols and walks on the surface of the clothes, the information analysis module 34 analyzes whether the user sits on the dining table according to the high definition image in real time.
And S11, if yes, controlling a fourth preset number of clothes climbing robots 10 to move to the chest areas of the users on the clothes surfaces according to the high-definition images and controlling the electromagnetic layers 13 arranged on the sides of the clothes climbing robots 10 to start to enter an electromagnetic adsorption state.
Specifically, after the information analysis module 34 determines that the user is seated at the dining table, the robot control module 32 controls, through the microprocessor 2 and according to the high-definition images, a fourth preset number of clothes climbing robots 10 to move over the clothes surface to the user's chest area and form a square area, where the fourth preset number is set by the user terminal, preferably 12 in this embodiment; after the clothes climbing robots 10 finish moving, the magnetic attraction control module 36 included in the master controller 3 controls, through the microprocessor 2, the electromagnetic layers 13 arranged on the sides of the clothes climbing robots 10 to start and enter an electromagnetic adsorption state.
S12, controlling the clothes climbing robots 10 to magnetically attract and splice together in the user's chest area according to the high-definition images, and analyzing in real time, according to the high-definition images, whether the user has finished the meal.
Specifically, after the electromagnetic layers 13 on the sides of the clothes climbing robots 10 are started, the robot control module 32 controls, through the microprocessor 2 and according to the high-definition images, the clothes climbing robots 10 to magnetically attract, splice and combine in the user's chest area to form a square splash-protection region; meanwhile, the information analysis module 34 analyzes in real time, according to the high-definition images, whether the user has finished the meal, where finishing the meal means that the user has stopped eating and has wiped the outer surfaces of the climbing robots with a wet tissue.
And S13, if yes, controlling the electromagnetic layer 13 of the clothes climbing robot 10 to be closed and controlling the clothes climbing robot 10 to move and reset on the surface of clothes according to the high-definition image.
Specifically, after the information analysis module 34 analyzes that the user has finished having a meal, the magnetic attraction control module 36 controls the electromagnetic layer 13 to close through the microprocessor 2, and after the electromagnetic layer 13 is closed, the robot control module 32 controls the clothes climbing robot 10 to move and reset on the surface of the clothes through the microprocessor 2 according to a high-definition image.
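To illustrate the meal-protection sequence of S10 to S13, the following sketch strings the steps together around two placeholder vision checks (`user_seated_at_table`, `meal_finished`) and placeholder robot commands; every method name on `controller` is an assumption, not the patent's API.

```python
import time

def meal_splash_protection(controller, chest_robot_ids, poll_seconds=2.0):
    """Form a magnetically joined splash shield on the chest while the user
    eats, then dissolve it and send the robots back to their home positions."""
    if not controller.user_seated_at_table():        # S10: dining detection
        return
    for robot_id in chest_robot_ids:                 # S11: move to chest, enable magnets
        controller.move_to_chest_area(robot_id)
        controller.enable_electromagnet(robot_id)
    controller.splice_into_square(chest_robot_ids)   # S12: magnetic splicing
    while not controller.meal_finished():            # wait until the meal ends
        time.sleep(poll_seconds)
    for robot_id in chest_robot_ids:                 # S13: release and reset
        controller.disable_electromagnet(robot_id)
        controller.return_to_home(robot_id)
```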
Example three
Referring to fig. 4 and figs. 6-7.
Specifically, this embodiment is substantially the same as the first embodiment, except that in this embodiment, after S1, the method further includes the following steps:
and S16, controlling the temperature sensor 14 arranged on the outer surface of the climbing robot to start to acquire first temperature information in real time and acquiring second temperature information of the urban area where the clothes climbing robot 10 is located in real time.
Specifically, when the clothes climbing robot 10 patrols and walks on the surface of clothes, the temperature recognition module 37 included in the main controller 3 controls the temperature sensor 14 arranged on the outer surface of the climbing robot to start to acquire first temperature information in real time through the microprocessor 2, and meanwhile, the temperature acquisition module 38 included in the main controller 3 acquires second temperature information of an urban area where the clothes climbing robot 10 is located in real time.
And S17, analyzing whether the temperature value is lower than a first preset temperature according to the first temperature information and the second temperature information.
Specifically, after the temperature sensor 14 is started and the temperature obtaining module 38 obtains the second temperature information, the information analyzing module 34 analyzes whether the temperature value is lower than a first preset temperature according to the first temperature information and the second temperature information, where the first preset temperature is set by the user terminal, and is preferably 10 ℃ or lower in this embodiment.
And S18, if yes, controlling the heating layer 15 arranged at the position below the climbing robot to start heating to a second preset temperature.
Specifically, after the information analysis module 34 analyzes that the temperature is lower than the first preset temperature, the intelligent heating module 39 included in the general controller 3 controls the heating layer 15 disposed at the position below the climbing robot to start heating to a second preset temperature, wherein the second preset temperature is set by the user terminal, and is preferably 50 ℃ in this embodiment.
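A sketch of the temperature branch (S16 to S18) is given below, assuming the on-robot sensor reading and the city temperature are already available as floats. The 10 °C and 50 °C values follow this embodiment's preferred settings; treating "lower than the first preset temperature" as the minimum of the two readings, and the heater interface itself, are assumptions for illustration.

```python
FIRST_PRESET_C = 10.0    # start heating below this temperature (embodiment's preferred value)
SECOND_PRESET_C = 50.0   # heating-layer target temperature (embodiment's preferred value)

def heating_control(robot_surface_temp_c, city_temp_c, heater):
    """Turn the heating layer on (toward the target temperature) when either
    measured temperature drops below the first preset; otherwise turn it off.

    `heater.heat_to(x)` and `heater.off()` are placeholder commands for the
    intelligent heating module acting through the microprocessor.
    """
    if min(robot_surface_temp_c, city_temp_c) < FIRST_PRESET_C:
        heater.heat_to(SECOND_PRESET_C)
    else:
        heater.off()
```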
Example four
As shown with reference to fig. 6-7.
Specifically, this embodiment provides an intelligent protection system based on artificial intelligence and a clothes climbing robot 10, which uses the intelligent protection method based on artificial intelligence and a clothes climbing robot 10 described above, and comprises a protection device 1, a microprocessor 2 and a master controller 3;
the protection device 1 comprises a clothes climbing robot 10, a miniature camera 11, a fingerprint sensor 12, an electromagnetic layer 13, a temperature sensor 14 and a heating layer 15, wherein the clothes climbing robot 10 is located at an initial position, set by the user, on the outer-layer clothes being worn, and is used for moving on the clothes surface and providing the corresponding service functions; the miniature camera 11 is arranged on the exterior of the clothes climbing robot 10 and is used for shooting images of the environment around the clothes climbing robot 10; the fingerprint sensor 12 is arranged on the outer surface of the clothes climbing robot 10 and is used for acquiring fingerprint information; the electromagnetic layer 13 is arranged on the side of the clothes climbing robot 10 and is used for docking with other clothes climbing robots 10 by electromagnetic adsorption; the temperature sensor 14 is arranged on the exterior of the clothes climbing robot 10 and is used for acquiring temperature information of the environment where the clothes climbing robot 10 is located; the heating layer 15 is arranged on the underside of the clothes climbing robot 10, facing the user's body, and is used for providing a heating function;
the microprocessor 2 is arranged at the inner position of the clothes climbing robot 10 and is respectively connected with the clothes climbing robot 10, the miniature camera 11, the fingerprint sensor 12, the electromagnetic layer 13, the temperature sensor 14 and the heating layer 15;
the general controller 3 is fixed at a planned position on the clothes worn by the user, and the general controller 3 includes:
the wireless module 30 is used for being respectively in wireless connection with the microprocessor 2, the user terminal, the alarm center and the network;
an information receiving module 31, configured to receive information and/or instructions and/or requests;
a robot control module 32 for controlling the laundry climbing robot 10 to perform a set operation according to the set steps through the microprocessor 2;
the micro shooting module 33 is used for controlling the micro camera 11 to start or close through the microprocessor 2;
an information analysis module 34 for processing and analyzing information according to the specified information;
and the fingerprint identification module 35 is used for controlling the fingerprint sensor 12 to be started or closed through the microprocessor 2.
As a preferred mode of the present invention, the general controller 3 further includes:
and the magnetic attraction control module 36 is used for controlling the electromagnetic layer 13 to be started or closed through the microprocessor 2.
As a preferred mode of the present invention, the general controller 3 further includes:
the temperature identification module 37 is used for controlling the temperature sensor 14 to be started or closed through the microprocessor 2;
the temperature obtaining module 38 is used for obtaining temperature information of an urban area where the clothes climbing robot 10 is located;
and the intelligent heating module 39 is used for controlling the heating layer 15 to be heated to a set temperature value through the microprocessor 2.
As a preferred mode of the present invention, the general controller 3 further includes:
an information extraction module 40, configured to extract information and/or instructions and/or requests included in the specific information and/or instructions and/or requests;
an information sending module 41, configured to send the specified information and/or instruction and/or request to the specified object.
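For orientation, the following is a compact sketch of how the master controller's modules listed above might be composed in software; every class and method name is an illustrative assumption rather than the patent's actual structure.

```python
class MasterController:
    """Illustrative composition of the modules listed above: wireless module,
    information receiving, robot control, micro shooting, fingerprint
    identification, magnetic attraction, heating, and information sending."""

    def __init__(self, wireless, microprocessor):
        self.wireless = wireless      # assumed link to terminal, alarm center and network
        self.mcu = microprocessor     # assumed proxy for the on-robot microprocessor 2

    def receive(self):
        return self.wireless.poll()                        # information receiving module

    def control_robot(self, robot_id, command):
        self.mcu.send(robot_id, command)                   # robot control module

    def set_camera(self, robot_id, on):
        self.mcu.send(robot_id, ("camera", on))            # micro shooting module

    def set_fingerprint_sensor(self, robot_id, on):
        self.mcu.send(robot_id, ("fingerprint", on))       # fingerprint identification module

    def set_electromagnet(self, robot_id, on):
        self.mcu.send(robot_id, ("electromagnet", on))     # magnetic attraction control module

    def set_heater(self, robot_id, target_c):
        self.mcu.send(robot_id, ("heater", target_c))      # intelligent heating module

    def send_warning(self, message, image=None):
        self.wireless.send_to_terminal(message, image)     # information sending module
```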
It should be understood that, in the fourth embodiment, the specific implementation process of each module described above may correspond to the description of the above method embodiments (the first to fourth embodiments), and is not described in detail here.
The system provided in the fourth embodiment is only illustrated by dividing the functional modules, and in practical applications, the above-mentioned functions may be distributed by different functional modules according to needs, that is, the internal structure of the system is divided into different functional modules to complete all or part of the functions described above.
The above embodiments are merely illustrative of the technical ideas and features of the present invention, and are intended to enable those skilled in the art to understand the contents of the present invention and implement the present invention, and not to limit the scope of the present invention. All equivalent changes or modifications made according to the spirit of the present invention should be covered within the protection scope of the present invention.

Claims (9)

1. An intelligent protection method based on artificial intelligence and a clothes climbing robot is characterized by comprising the following steps:
s1, if a clothes protection instruction sent by a user terminal keeping the connection relation is received, controlling the clothes climbing robot stored in the clothes pocket to start and controlling a micro camera arranged at the external position of the clothes climbing robot to start to shoot a high-definition image in real time;
s2, analyzing the wrinkle area of the clothes of the user in real time according to the high-definition images, and controlling a first preset number of clothes climbing robots to move to the wrinkle area on the surface of the clothes to move circularly according to the wrinkle area and the high-definition images;
s3, analyzing the cuff areas of the clothes of the user in real time according to the high-definition images, and controlling the clothes climbing robots with a second preset number to move to the cuff areas on the surfaces of the clothes to be parked and fixed according to the cuff areas and the high-definition images;
s4, analyzing whether zippers or buttons exist in clothes pockets of the user in real time according to the high-definition images;
s5, if not, controlling a third preset number of clothes climbing robots to move to the positions of the clothes pockets of the user on the surfaces of the clothes to enter a moving pocket sealing state according to the high-definition images and controlling fingerprint sensors arranged at the outer surface positions of the clothes climbing robots to start to acquire fingerprint information in real time;
s6, analyzing whether a hand is positioned on the outer surface of the clothes climbing robot and is a user fingerprint in real time according to the acquired fingerprint information and the high-definition image;
and S7, if yes, controlling the clothes climbing robot to move to the side upper part at the position of the clothes pocket of the user according to the high-definition image to remove the pocket seal.
2. The intelligent protection method based on artificial intelligence and the clothes climbing robot as claimed in claim 1, wherein after S1, the method further comprises the following steps:
s10, analyzing whether a user sits at a dining table or not in real time according to the high-definition images;
s11, if yes, controlling a fourth preset number of clothes climbing robots to move to the chest area of a user on the clothes surface according to the high-definition images and controlling electromagnetic layers arranged on the sides of the clothes climbing robots to start to enter an electromagnetic adsorption state;
s12, controlling the clothes climbing robot to perform magnetic attraction splicing in the chest area of the user according to the high-definition image, and analyzing whether the user has a meal or not in real time according to the high-definition image;
and S13, if yes, controlling the electromagnetic layer of the clothes climbing robot to be closed and controlling the clothes climbing robot to move and reset on the surface of clothes according to the high-definition image.
3. The intelligent protection method based on artificial intelligence and the clothes climbing robot as claimed in claim 1, wherein after S1, the method further comprises the following steps:
s14, analyzing the type information of the clothes where the climbing robot is located in real time according to the high-definition image, and analyzing whether the clothes worn by a user are of a preset type according to the type information of the clothes;
and S15, if yes, controlling a fifth preset number of clothes climbing robots to move to the lower hem positions of the clothes on the surfaces of the clothes according to the high-definition images, and controlling the moved clothes climbing robots to be fixed in sequence.
4. The intelligent protection method based on artificial intelligence and the clothes climbing robot as claimed in claim 1, wherein after S1, the method further comprises the following steps:
s16, controlling a temperature sensor arranged on the outer surface of the clothes climbing robot to start to acquire first temperature information in real time and acquiring second temperature information of an urban area where the climbing robot is located in real time;
s17, analyzing whether the temperature value is lower than a first preset temperature or not according to the first temperature information and the second temperature information;
and S18, if so, controlling a heating layer arranged at the position below the climbing robot to start heating to a second preset temperature.
5. The intelligent protection method based on artificial intelligence and the clothes climbing robot as claimed in claim 1, wherein in S6, the method further comprises the following steps:
s60, analyzing whether a human hand is located at a position of a clothes pocket or not in real time according to the high-definition image;
s61, if yes, analyzing whether the human hand touches a fingerprint sensor on the outer surface of the clothes climbing robot or not in real time according to the high-definition image;
and S62, if not, extracting the high-definition image containing the surrounding environment of the clothes pocket and sending warning information and the high-definition image containing the surrounding environment of the clothes pocket to the user terminal keeping the connection relation.
6. An intelligent protection system based on artificial intelligence and a clothes climbing robot, which uses the intelligent protection method based on artificial intelligence and the clothes climbing robot as claimed in any one of claims 1 to 5, and comprises a protection device, a microprocessor and a master controller, and is characterized in that:
the protective device comprises a clothes climbing robot, a miniature camera, a fingerprint sensor, an electromagnetic layer, a temperature sensor and a heating layer, wherein the clothes climbing robot is located at an initial position set by a user wearing outer-layer clothes and used for moving on the surface of the clothes and providing a corresponding service function; the miniature camera is arranged at the external position of the clothes climbing robot and is used for shooting an environment image of the external area of the clothes climbing robot; the fingerprint sensor is arranged at the outer surface of the clothes climbing robot and used for acquiring fingerprint information; the electromagnetic layer is arranged at the side position of the clothes climbing robot and used for colliding with other clothes climbing robots to perform electromagnetic adsorption; the temperature sensor is arranged at an external position of the clothes climbing robot and used for acquiring the temperature information of the environment where the clothes climbing robot is located; the heating layer is arranged below the clothes climbing robot and corresponds to the body of a user, and is used for providing a heating function;
the microprocessor is arranged at the inner position of the clothes climbing robot and is respectively connected with the clothes climbing robot, the miniature camera, the fingerprint sensor, the electromagnetic layer, the temperature sensor and the heating layer;
the general controller is fixed at a planned position on the clothes worn by the user, and the general controller includes:
the wireless module is used for being respectively in wireless connection with the microprocessor, the user terminal, the alarm center and the network;
the information receiving module is used for receiving information and/or instructions and/or requests;
the robot control module is used for controlling the clothes climbing robot to execute set operation according to set steps through the microprocessor;
the micro shooting module is used for controlling the micro camera to start or close through the microprocessor;
the information analysis module is used for processing and analyzing the information according to the specified information;
and the fingerprint identification module is used for controlling the start or the stop of the fingerprint sensor through the microprocessor.
7. The intelligent protection system based on artificial intelligence and the clothes climbing robot as claimed in claim 6, wherein the general controller further comprises:
and the magnetic attraction control module is used for controlling the magnetic attraction layer to be started or closed through the microprocessor.
8. The intelligent protection system based on artificial intelligence and the clothes climbing robot as claimed in claim 6, wherein the general controller further comprises:
the temperature identification module is used for controlling the temperature sensor to be started or closed through the microprocessor;
the temperature acquisition module is used for acquiring temperature information of an urban area where the clothes climbing robot is located;
and the intelligent heating module is used for controlling the heating layer to be heated to a set temperature value through the microprocessor.
9. The intelligent protection system based on artificial intelligence and the clothes climbing robot as claimed in claim 6, wherein the general controller further comprises:
the information extraction module is used for extracting information and/or instructions and/or requests contained in specified information and/or instructions and/or requests;
and the information sending module is used for sending the specified information and/or instruction and/or request to the specified object.
CN202010306032.XA 2020-04-17 2020-04-17 Intelligent protection method and system based on artificial intelligence and clothes climbing robot Pending CN111390930A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010306032.XA CN111390930A (en) 2020-04-17 2020-04-17 Intelligent protection method and system based on artificial intelligence and clothes climbing robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010306032.XA CN111390930A (en) 2020-04-17 2020-04-17 Intelligent protection method and system based on artificial intelligence and clothes climbing robot

Publications (1)

Publication Number Publication Date
CN111390930A true CN111390930A (en) 2020-07-10

Family

ID=71417667

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010306032.XA Pending CN111390930A (en) 2020-04-17 2020-04-17 Intelligent protection method and system based on artificial intelligence and clothes climbing robot

Country Status (1)

Country Link
CN (1) CN111390930A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114723831A (en) * 2022-03-25 2022-07-08 山东大学 Heuristic-based robot flexible fabric flattening method and system

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3996929A (en) * 1974-11-30 1976-12-14 Mabuchi Motor Co. Ltd. Massaging machine
US4412535A (en) * 1981-08-17 1983-11-01 Teren Dorothy R Remotely controlled massaging apparatus
WO2009013742A1 (en) * 2007-07-23 2009-01-29 Eyal Avramovich Massager
CN202987324U (en) * 2012-07-03 2013-06-12 中国科学院合肥物质科学研究院 Wall climbing robot with bionic claws
CN203380887U (en) * 2013-07-11 2014-01-08 深圳中科智酷机器人科技有限公司 Multiple-joint bionic machine insect
CN105128972A (en) * 2015-09-28 2015-12-09 哈尔滨工业大学深圳研究生院 Mobile robot capable of walking on ground and crawling on flexible surface
CN110051516A (en) * 2019-04-23 2019-07-26 成都信息工程大学 A kind of massage robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Artem Dementyev et al., "Rovables: Miniature On-Body Robots as Mobile Wearables", 29th Annual Symposium on User Interface Software and Technology (UIST) *

Similar Documents

Publication Publication Date Title
Mazhar et al. Towards real-time physical human-robot interaction using skeleton information and hand gestures
Jiang et al. Enhanced control of a wheelchair-mounted robotic manipulator using 3-D vision and multimodal interaction
Jiang et al. Integrated vision-based robotic arm interface for operators with upper limb mobility impairments
McColl et al. Human body pose interpretation and classification for social human-robot interaction
CN104331158A (en) Gesture-controlled human-computer interaction method and device
CN105388820A (en) Intelligent monitoring device and monitoring method thereof, and monitoring system
CN111390930A (en) Intelligent protection method and system based on artificial intelligence and clothes climbing robot
CN106030610A (en) Real-time 3D gesture recognition and tracking system for mobile devices
CN103295011A (en) Information processing apparatus, information processing method and computer program
WO2022161111A1 (en) Indoor and outdoor mobile robot and vehicle body dual-purpose apparatus, and management system and method
CN103513906B (en) A kind of command identifying method, device and electronic equipment
Puthuveetil et al. Bodies uncovered: Learning to manipulate real blankets around people via physics simulations
JP2020198053A (en) Information processing device, information processing method, person search system, and person search method
Castro et al. Continuous semi-autonomous prosthesis control using a depth sensor on the hand
CN106054829B (en) Family send the method for operating of water service robot system
Song et al. Visual servoing for a user's mouth with effective intention reading in a wheelchair-based robotic arm
CN106325306B (en) A kind of camera assembly apparatus of robot and its shooting and tracking
Moriya et al. A method of picking up a folded fabric product by a single-armed robot
Jiang et al. Integrated vision-based system for efficient, semi-automated control of a robotic manipulator
Diaz et al. To veer or not to veer: Learning from experts how to stay within the crosswalk
Carrasco et al. Exploiting eye–hand coordination to detect grasping movements
Shimizu et al. A gesture recognition system using stereo vision and arm model fitting
Cheng et al. Application exploring of ubiquitous pressure sensitive matrix as input resource for home-service robots
Proenca et al. A gestural recognition interface for intelligent wheelchair users
Phyo et al. A human-robot interaction system based on calling hand gestures

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200710)