CN113171472A - Disinfection robot - Google Patents

Disinfection robot

Info

Publication number
CN113171472A
CN113171472A (application CN202010454119.1A)
Authority
CN
China
Prior art keywords
robot
disinfection
user
gesture
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010454119.1A
Other languages
Chinese (zh)
Other versions
CN113171472B (en)
Inventor
吴骁伟
郭成凯
孙广石
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongke Wangfu Beijing Technology Co ltd
Original Assignee
Zhongke Wangfu Beijing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhongke Wangfu Beijing Technology Co ltd filed Critical Zhongke Wangfu Beijing Technology Co ltd
Priority to CN202010454119.1A priority Critical patent/CN113171472B/en
Publication of CN113171472A publication Critical patent/CN113171472A/en
Application granted granted Critical
Publication of CN113171472B publication Critical patent/CN113171472B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A - HUMAN NECESSITIES
      • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61L - METHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
          • A61L 2/00 - Methods or apparatus for disinfecting or sterilising materials or objects other than foodstuffs or contact lenses; Accessories therefor
            • A61L 2/02 - using physical phenomena
              • A61L 2/08 - Radiation
                • A61L 2/10 - Ultra-violet radiation
            • A61L 2/24 - Apparatus using programmed or automatic operation
          • A61L 9/00 - Disinfection, sterilisation or deodorisation of air
            • A61L 9/16 - using physical phenomena
              • A61L 9/18 - Radiation
                • A61L 9/20 - Ultra-violet radiation
          • A61L 2202/00 - Aspects relating to methods or apparatus for disinfecting or sterilising materials or objects
            • A61L 2202/10 - Apparatus features
              • A61L 2202/14 - Means for controlling sterilisation processes, data processing, presentation and storage means, e.g. sensors, controllers, programs
          • A61L 2209/00 - Aspects relating to disinfection, sterilisation or deodorisation of air
            • A61L 2209/10 - Apparatus features
              • A61L 2209/11 - Apparatus for controlling air treatment
                • A61L 2209/111 - Sensor means, e.g. motion, brightness, scent, contaminant sensors
    • B - PERFORMING OPERATIONS; TRANSPORTING
      • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
        • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
          • B25J 5/00 - Manipulators mounted on wheels or on carriages
            • B25J 5/007 - mounted on wheels
          • B25J 11/00 - Manipulators not otherwise provided for
          • B25J 19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • G - PHYSICS
      • G05 - CONTROLLING; REGULATING
        • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
          • G05D 1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
            • G05D 1/0088 - characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
            • G05D 1/02 - Control of position or course in two dimensions
              • G05D 1/021 - specially adapted to land vehicles
                • G05D 1/0231 - using optical position detecting means
                  • G05D 1/0242 - using non-visible light signals, e.g. IR or UV signals
                  • G05D 1/0246 - using a video camera in combination with image processing means
                    • G05D 1/0248 - in combination with a laser
                • G05D 1/0257 - using a radar
                • G05D 1/0276 - using signals provided by a source external to the vehicle
                  • G05D 1/0278 - using satellite positioning signals, e.g. GPS
      • G06 - COMPUTING; CALCULATING OR COUNTING
        • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
            • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
              • G06V 40/107 - Static hand or arm
                • G06V 40/113 - Recognition of static hand signs

Abstract

A disinfection robot is disclosed. The disinfection robot comprises an electronic control unit, a first communication unit and an image acquisition device, wherein the electronic control unit is configured to: if a living body is present around the disinfection robot, determine whether the living body is an adult and, if so, communicate with the living body's nameplate through the first communication unit to acquire identification information of the user; determine, based on the identification information, whether the user has gesture instruction authority and, if so, acquire a gesture image of the user with the image acquisition device and analyze it to obtain a gesture; and execute the operation corresponding to the gesture, the operation comprising at least one of stopping, turning left, turning right, reversing, going upstairs and going downstairs. The disinfection robot can thus recognize a user's authority and carry out the corresponding operation according to the gestures of an authorized user, completing disinfection tasks efficiently without interference from unauthorized users.

Description

Disinfection robot
Technical Field
The invention relates to a disinfection robot.
Background
When an existing disinfection robot carries out disinfection work in places such as hospitals, it can generally recognize typical obstacles and re-plan its route according to their positions, achieving obstacle avoidance. However, when living bodies appear around the robot, its ultraviolet disinfection can cause them physiological harm, and the robot can only receive a re-planned route from the background server; living bodies may still appear on that route and interfere with the disinfection work, which consumes computing resources and fails to avoid the living bodies accurately.
Further, the sterility and safety requirements of the disinfection robot itself place higher demands on the way users interact with it. Existing disinfection robots cannot accurately and rapidly identify living bodies and take evasive action, nor can they recognize operating instructions conveyed by a living body's body language, such as gestures. In addition, existing disinfection robots do not respond differently to different kinds of surrounding living bodies: in a shopping mall, a kindergarten or a children's hospital, for example, children are curious about the robot, and their frequent body language interferes with the commands the robot receives; children, or other living bodies without authority, greatly disturb its normal disinfection work. At present the robot's responses to a child's body language are indistinguishable from its responses to other living bodies, so the robot spends significant time and resources handling these invalid body-language inputs and may even make erroneous decisions and maneuvers.
Disclosure of Invention
The present invention has been made in view of the above problems. Its object is to provide a disinfection robot that can distinguish between surrounding living bodies, recognize a user's authority, and perform operations according to the gestures of an authorized user, while investing no computing resources in the distracting gestures of children and remaining free from interference by unauthorized users, thereby significantly reducing the computing load and completing disinfection tasks efficiently.
According to an aspect of the present disclosure, there is provided a disinfection robot including an electronic control unit, a first communication unit, and an image acquisition device, wherein the electronic control unit is configured to: if a living body is present around the disinfection robot, determine whether the living body is an adult and, if so, communicate with the living body's nameplate via the first communication unit to acquire identification information of the user; determine, based on the identification information, whether the user has gesture instruction authority and, if so, acquire a gesture image of the user with the image acquisition device and analyze it to obtain a gesture; and execute the operation corresponding to the gesture, the operation comprising at least one of stopping, turning left, turning right, reversing, going upstairs and going downstairs.
In some embodiments, the image acquisition device is further configured to detect whether an object is present around the disinfection robot and to send a first indication signal indicating whether an object is present. The disinfection robot further includes: a motion base configured with wheels; an ultraviolet emitter disposed on the motion base and configured to emit ultraviolet rays to disinfect the surrounding area; an infrared sensor configured to receive the first indication signal, detect whether the detected object is a living body, and send a second indication signal indicating that a living body is present; and an alarm configured to receive the second indication signal and issue a warning via at least one of voice and light, or turn off the ultraviolet emitter.
In some embodiments, the nameplate stores identification information including authority information indicating whether the user has gesture instruction authority, as well as body type information of the user; the electronic control unit is further configured to analyze the body type in the image of the user captured by the image acquisition device and determine whether it is consistent with the body type information pre-stored in the identification information, so as to prevent the nameplate from being used fraudulently.
In some embodiments, the image acquisition device comprises an area array lidar, and the distance between the object and the disinfection robot is measured with the area array lidar; the user's image is acquired for analysis only when the distance is within a predetermined range.
In some embodiments, the disinfection robot further comprises a navigation unit that locally stores a map of the hospital. The electronic control unit presets a disinfection area for each disinfection robot and communicates with a background server to update each robot's travel route in real time. When a disinfection robot is about to enter a disinfection area, it determines whether that area has already been covered by the travel routes of other disinfection robots and, if so, suspends disinfection and exits the area by the shortest route.
In some embodiments, the electronic control unit is further configured to, when at least one undisinfected area exists, determine the shortest travel route covering the at least one undisinfected area and send that route to the background server.
In some embodiments, when the disinfection robot has entered a disinfection area, the segment of its travel route that has already been disinfected is identified and its length is determined; if the length is greater than a threshold, the robot turns off the ultraviolet emitter within that segment, and if it is not, the ultraviolet emitter is kept on.
In some embodiments, when the disinfection robot changes floors, the navigation unit determines the nearest elevator; after entering the elevator, the robot issues a prompt via the alarm to keep a safe distance and determines, via the image acquisition device, that the target floor has been reached.
In this way, the disinfection robot receives operating instructions by identifying authorized users, adjusts its disinfection route in time, and avoids interference from unauthorized users.
Drawings
In the drawings, which are not necessarily drawn to scale, like reference numerals may describe similar components in different views. Like reference numerals having letter suffixes or different letter suffixes may represent different instances of similar components. The drawings illustrate various embodiments generally by way of example and not by way of limitation, and together with the description and claims serve to explain the disclosed embodiments. The same reference numbers will be used throughout the drawings to refer to the same or like parts, where appropriate. Such embodiments are illustrative, and are not intended to be exhaustive or exclusive embodiments of the present apparatus or method.
Fig. 1 is a schematic structural view of a disinfection robot according to an embodiment of the present invention;
Fig. 2 is a flow diagram of one example of operation of a disinfection robot according to an embodiment of the present invention;
Fig. 3 is a block diagram schematically showing one example of components provided in the disinfection robot.
Detailed Description
For a better understanding of the technical aspects of the present disclosure, reference is made to the following detailed description taken in conjunction with the accompanying drawings. Embodiments of the present disclosure are described in further detail below with reference to the figures, but the present disclosure is not limited thereto. Where steps are described in a particular order without any stated dependence on one another, that order should not be construed as limiting; one skilled in the art will appreciate that the order may be adjusted as long as the logical relationships among the steps are preserved and the overall process remains practicable.
Fig. 1 is a schematic view of a disinfection robot according to an embodiment of the present invention. This is only an example; the present disclosure may be implemented as an autonomously moving robot with a disinfecting function in any configuration.
As shown in fig. 1, the disinfection robot at least includes: an infrared sensor 1, an ultraviolet emitter 2, a navigation unit 3, a motion base 4, and a battery and data analysis main board 5. The battery may be mounted directly on the data analysis main board 5, which may include at least a first communication unit 501 and an electronic control unit 502. The motion base 4 is configured with wheels (and in some embodiments may further include a crane mounted on the motion base 4), and the image acquisition device 6 may be mounted on the motion base 4 (or, for example, on the crane). The image acquisition device 6 is configured to detect the presence of objects around the robot and to send a first indication signal indicating that an object is present. In some embodiments, the image acquisition device 6 may be a camera or a lidar. In some embodiments, the image acquisition device 6 is configured as an area array lidar 601, so as to detect more accurately the distance and activity (such as, but not limited to, gestures and movements) of surrounding objects, and also to monitor accurately how a living body responds after interacting with the disinfection robot (for example, whether it has moved away to a safe range).
In some embodiments, the navigation unit 3 may be configured to locate the disinfection robot, for example using GPS or similar technology, so as to determine its real-time position in the operation site (e.g., a hospital or shopping venue): which floor, which room, which shop, and so on.
As shown in fig. 1, the ultraviolet emitter 2 is disposed on the motion base 4 and is configured to emit ultraviolet rays to disinfect the surrounding area. The infrared sensor 1 is configured to receive the first indication signal, detect whether the detected object is a living body, and transmit a second indication signal indicating that a living body is present. The robot further comprises an alarm 7, which may be mounted on any structure suited to issuing warnings and is not specifically limited here; it is configured to receive the second indication signal and to issue a warning via at least one of voice and light, or to turn off the ultraviolet emitter 2. The alarm 7 may cooperate with the area array lidar and the infrared sensor 1: for example, when the area array lidar detects a nearby object and that object is a living body, the robot determines whether it is within the safe distance range; if not, the alarm 7 reminds it to leave, and if the area array lidar does not detect that the living body has moved into the safe distance range within a subsequent preset time period, the ultraviolet emitter 2 is turned off to avoid harming the living body.
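A minimal sketch of how this cooperation between the area array lidar, the infrared sensor 1, the alarm 7 and the ultraviolet emitter 2 could be coordinated is shown below; the safe distance, grace period, and all object and method names are illustrative assumptions, not part of the disclosed implementation.

```python
import time

SAFE_DISTANCE_M = 2.0   # assumed safe range for UV exposure
GRACE_PERIOD_S = 10.0   # assumed time allowed for the living body to move away

def handle_detected_object(lidar, infrared, alarm, uv_emitter):
    """Warn a nearby living body and shut off UV if it does not retreat."""
    distance = lidar.distance_to_nearest_object()
    if distance is None or distance >= SAFE_DISTANCE_M:
        return                              # nothing within the unsafe range
    if not infrared.is_living_body():       # inanimate obstacle: navigation handles it
        return
    alarm.warn(voice=True, light=True)      # remind the living body to leave
    deadline = time.monotonic() + GRACE_PERIOD_S
    while time.monotonic() < deadline:
        d = lidar.distance_to_nearest_object()
        if d is None or d >= SAFE_DISTANCE_M:
            return                          # it moved out of the unsafe range
        time.sleep(0.2)
    uv_emitter.turn_off()                   # still too close: protect it from UV
```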
Fig. 2 is a flow chart of an example of operation of a disinfection robot according to an embodiment of the invention, wherein the disinfection robot comprises a first communication unit 501, an electronic control unit 502 and an image acquisition device 6, and the electronic control unit 502 is configured to perform the following steps. In step S201, when a living body is present around the disinfection robot, it is determined whether the living body is an adult.
If it is not an adult, for example if the living body is a child or a pet, the robot does not by default proceed to further identification or higher-level gesture analysis, but only uses at least one of voice and light to remind the living body to leave. In some embodiments, when it is determined that the living body is not an adult, the ultraviolet emitter 2 may also be turned off while the alarm 7 reminds it to leave, so as to avoid ultraviolet injury to children or pets that are unlikely to comply with the warning.
If it is an adult, the robot communicates with the living body's nameplate via the first communication unit 501 to acquire the identification information of the user. In step S202, it is determined, based on the identification information, whether the user has gesture instruction authority; if so, a gesture image of the user is acquired with the image acquisition device 6 and analyzed to obtain a gesture, and the operation corresponding to the gesture is executed, the operation comprising at least one of stopping, turning left, turning right, reversing, going upstairs and going downstairs. In this way, by having the staff at the robot's operation site wear specific nameplates, the disinfection robot can accurately distinguish them from ordinary adult users, and can also distinguish staff members with different authority levels from one another, thereby limiting the population that can interact with it via gesture instructions and ensuring its safety in use.
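A compact sketch of the decision flow in steps S201 and S202 could look like the following; the helper functions and attribute names are assumptions introduced only for illustration.

```python
ALLOWED_OPERATIONS = {"stop", "turn_left", "turn_right", "reverse",
                      "go_upstairs", "go_downstairs"}

def on_living_body_detected(robot, living_body):
    # Step S201: children and pets are only warned away, never analysed further.
    if not robot.classify_as_adult(living_body):
        robot.alarm.warn(voice=True, light=True)
        return
    # Read the user's identification information from the worn nameplate.
    identification = robot.first_communication_unit.read_nameplate(living_body)
    if identification is None or not identification.has_gesture_authority:
        return                                  # unauthorized adults are ignored
    # Step S202: capture a gesture image and map it to an operation.
    gesture_image = robot.image_acquisition_device.capture(living_body)
    gesture = robot.recognize_gesture(gesture_image)
    operation = robot.gesture_to_operation.get(gesture)
    if operation in ALLOWED_OPERATIONS:
        robot.execute(operation)
```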
In some embodiments, the image acquisition device 6 detects whether an object, such as a step, a medical instrument, a handrail, a seat or a living body, is present around the robot, and transmits a first indication signal indicating whether an object is present to the infrared sensor 1. The infrared sensor 1 receives the first indication signal, detects on that basis whether the detected object is a living body, and transmits a second indication signal indicating that a living body is present to the alarm 7. The alarm 7 receives the second indication signal and reminds the surrounding living bodies to move away by means of voice and/or light, or turns off the ultraviolet emitter 2 during disinfection, so that the living bodies are not physiologically harmed by ultraviolet rays.
In step S201, the second indication signal indicating the presence of a living body is also transmitted to the electronic control unit 502 of the disinfection robot. When a living body is present nearby, the electronic control unit 502 further determines whether it is an adult and, if so, communicates with the living body's nameplate via the first communication unit 501 to acquire the identification information of the user. In some embodiments, the electronic control unit 502 is configured to turn off the ultraviolet emitter 2 during disinfection when the area array lidar 601 detects that a surrounding living body has not moved away; this may be because the living body is a child or a pet, or because it is a user with gesture instruction authority who needs to come close enough to give a gesture instruction, and in either case turning off the emitter avoids physiological harm.
In some embodiments, the nameplate stores identification information comprising authority information indicating whether the user has gesture instruction authority, as well as body type information of the user; only gesture instructions issued by a user with gesture instruction authority are received and recognized by the disinfection robot, which then executes the corresponding operation. The electronic control unit 502 is further configured to perform body type analysis on the image of the user captured by the image acquisition device 6 to determine whether it is consistent with the body type information pre-stored in the identification information, so as to prevent the nameplate from being used fraudulently. In identifying an authorized user, the robot scans the user's nameplate directly to obtain the corresponding authority and then further determines, from the user's body type features, whether the person wearing the nameplate is the person it was issued to, avoiding misuse of the nameplate by other users and further improving the safety of control over the disinfection robot.
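A sketch of this anti-impersonation check is given below, assuming the body type is summarized by a few coarse features such as estimated height and shoulder width; the feature names, estimator interface and tolerance are illustrative assumptions only.

```python
def nameplate_matches_wearer(identification, user_image, estimator, tolerance=0.10):
    """Compare body type features stored in the nameplate with the observed wearer."""
    stored = identification.body_type          # e.g. {"height_m": 1.75, "shoulder_m": 0.45}
    observed = estimator.measure(user_image)   # same keys, estimated from the captured image
    for key, expected in stored.items():
        if abs(observed[key] - expected) > tolerance * expected:
            return False                       # mismatch: treat the nameplate as misused
    return True
```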
In step S202, whether the user has gesture instruction authority is determined based on the identification information on the nameplate; if so, a gesture image of the user is acquired with the image acquisition device 6 and analyzed to obtain a gesture, and the operation corresponding to the gesture is executed, the operation comprising at least one of stopping, turning left, turning right, reversing, going upstairs and going downstairs. By recognizing living bodies with gesture instruction authority (on-site medical staff and the like) and accepting the operating instructions they convey through body language such as gestures, the robot can avoid unexpected obstacles and living bodies flexibly and in good time, rather than waiting for a re-planned route from the background server.
In some embodiments, the image acquisition device 6 may further include an area array lidar 601, with which the distance between the object and the disinfection robot is measured; the user's image is acquired for analysis only when the distance is within a predetermined range. That is, when the object is too far from the robot, the ranging function of the area array lidar 601 is not triggered, avoiding unnecessary waste of computing resources. Moreover, because the images used for analysis, such as the user's gesture instructions, are all acquired within a preset distance range, distortions of the image features caused by perspective are eliminated, simplifying the analysis and making the recognition algorithm simpler and more accurate. Specifically, when communicating with the living body's nameplate via the first communication unit 501 to acquire the user's identification information, the user's image is acquired for scanning and analysis (for example, using an image sensor or a scanner) only when the distance is within the predetermined range; the vertical and horizontal scanning range covering the nameplate can be set in advance for the first communication unit 501, and only this preset range is scanned to read the identification information. This noticeably speeds up nameplate scanning and processing and shortens the robot's response time to nearby emergencies.
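The distance gating described above can be expressed as a simple guard around image acquisition; the numeric range and the lidar and camera interfaces below are assumed values for illustration.

```python
MIN_RANGE_M, MAX_RANGE_M = 0.8, 3.0   # assumed predetermined distance range

def maybe_capture_gesture_image(lidar, camera, target):
    """Only trigger image capture when the target is close enough for reliable analysis."""
    distance = lidar.distance_to(target)
    if MIN_RANGE_M <= distance <= MAX_RANGE_M:
        return camera.capture(target)
    return None                        # out of range: skip to save computation
```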
In some embodiments, the first communication unit 501 and the nameplate may communicate via various wireless modes, including but not limited to code scanning with an image sensor, NFC (near field communication), and Bluetooth. In some embodiments, an image sensor may be used for code scanning, so that the nameplate can be passive (requiring no transmit power), further reducing cost.
In some embodiments, the disinfection robot may also comprise a navigation unit 3, which locally stores a map of the hospital. In some application scenarios, disinfection robots may be deployed in groups. The electronic control unit 502 may preset a disinfection area for each disinfection robot and communicate with the background server to update each robot's travel route in real time, the background server handling the scheduling of the robots. When a disinfection robot is about to enter a disinfection area, it determines whether that area has already been covered by the travel routes of other robots and, if so, suspends disinfection and exits the area by the shortest route. In this way, overlapping routes of multiple disinfection robots, which would cause the same area to be disinfected repeatedly, are avoided, improving the efficiency of the group. Updating the robots' travel routes in real time allows the routes of the other robots in the group to be adjusted adaptively based on each robot's actual route, thereby optimizing the disinfection efficiency of the whole group.
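One way this coverage check could be organized, with the backend server keeping track of which areas are already covered by other robots' routes, is sketched below; the data shapes and method names are assumptions for illustration.

```python
def on_entering_area(robot, area_id, backend):
    """Skip areas already covered by other robots' real-time travel routes."""
    covered_by_others = backend.areas_covered_by_routes(exclude_robot=robot.robot_id)
    if area_id in covered_by_others:
        robot.uv_emitter.turn_off()                       # suspend disinfection
        exit_route = robot.navigation.shortest_route_out_of(area_id)
        robot.follow(exit_route)
    else:
        backend.report_route(robot.robot_id, robot.navigation.current_route())
```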
In some embodiments, the electronic control unit 502 is further configured to determine, when at least one undisinfected area exists, the shortest travel route covering the at least one undisinfected area and send it to the background server. When the disinfection robot travels to a given area during disinfection work, it determines whether that area has already been disinfected; if not, the route planning function of the electronic control unit 502 is triggered, the shortest travel route covering the undisinfected area(s) is computed, and that route is sent to the background server so that the server's real-time scheduling of the robots' disinfection routes stays up to date.
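This route planning step could be approximated with a greedy nearest-neighbour ordering over the undisinfected areas; the following is only a sketch of one possible heuristic under that assumption, not the claimed planning method.

```python
def shortest_covering_route(current_position, undisinfected_areas, distance_fn):
    """Order undisinfected areas greedily by distance and return the visit sequence."""
    route, position = [], current_position
    remaining = set(undisinfected_areas)
    while remaining:
        nearest = min(remaining, key=lambda area: distance_fn(position, area))
        route.append(nearest)
        remaining.remove(nearest)
        position = nearest
    return route          # sent to the backend server for real-time schedule updates
```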
In some embodiments, when the disinfection robot has entered a disinfection area, the segment of its travel route that has already been disinfected is identified and its length determined; if the length is greater than a threshold, the robot turns off the ultraviolet emitter 2 within that segment, and if it is not, the ultraviolet emitter 2 is kept on. Although the ultraviolet emitter 2 could simply be turned off over any segment that has been disinfected, frequent switching shortens its service life. When the robot travels along its current route, if the already-disinfected segment is longer than the threshold, the power saved by turning the emitter off is considered to outweigh the wear caused by an off-on cycle; when the segment is not longer than the threshold, the wear is considered to outweigh the saving, and the ultraviolet emitter 2 is kept on for continuous disinfection.
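The trade-off between saved power and switching wear reduces to a single threshold comparison; in the sketch below, the threshold value and the emitter interface are assumptions.

```python
SEGMENT_THRESHOLD_M = 15.0   # assumed length above which switching off pays for itself

def set_uv_state_for_segment(uv_emitter, already_disinfected_length_m):
    """Turn the UV emitter off only when the already-disinfected segment is long enough."""
    if already_disinfected_length_m > SEGMENT_THRESHOLD_M:
        uv_emitter.turn_off()    # power saving outweighs switching wear
    else:
        uv_emitter.keep_on()     # short segment: avoid an off-on cycle
```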
In some embodiments, when the disinfection robot switches floors, the navigation unit 3 determines the nearest elevator; after entering the elevator, the robot issues a prompt via the alarm 7 to keep a safe distance and determines, via the image acquisition device 6, that the target floor has been reached. In some embodiments, the robot may turn off its ultraviolet emission directly upon entering the elevator and reduce the safe distance range applied to surrounding living bodies.
Fig. 3 is a block diagram schematically showing one example of components provided in the disinfection robot. The disinfection robot system comprises the infrared sensor 1, the ultraviolet emitter 2, the navigation unit 3, the motion base 4, a battery (not shown), the data analysis main board 5 (comprising the first communication unit 501 and the electronic control unit 502), the image acquisition device 6 (which may include, but is not limited to, the area array lidar 601) and the alarm 7. These components are connected via a bus to form the disinfection robot system and keep the robot working normally. The disinfection robot can also connect remotely to a background server through the first communication unit 501, so that the server can schedule the robot and exchange data with it.
In some embodiments, the electronic control unit may be implemented with various processors, and the steps it performs may be carried out by a processor executing computer-executable instructions stored in memory. In some embodiments, the processor may include, but is not limited to, any one or more of a microprocessor, an ASIC (application-specific integrated circuit), an FPGA (field-programmable gate array), an SoC (system on a chip), or a DSP (digital signal processing) chip.
In summary, the present invention discloses a disinfection robot that can recognize the living bodies around it and issue an alarm prompting them to move away, so as to avoid the physiological harm ultraviolet rays would cause. The robot can also recognize operating instructions conveyed by a living body's body language (such as gestures), and, compared with a robot that can only receive a re-planned route from the background server on which living bodies may still appear, it can avoid living bodies promptly and accurately. Further, by reading nameplates and issuing reminders, the robot prevents children or other unauthorized living bodies from disturbing its normal disinfection work.
Although exemplary embodiments have been described herein, the scope thereof includes any and all embodiments based on the present disclosure with equivalent elements, modifications, omissions, combinations (e.g., across various embodiments), adaptations or alterations. The elements of the claims are to be interpreted broadly based on the language employed in the claims and are not limited to the examples described in the present specification or during prosecution of the application, which examples are to be construed as non-exclusive. It is intended, therefore, that the specification and examples be considered as exemplary only, with the true scope and spirit being indicated by the following claims and their full scope of equivalents.
The above description is intended to be illustrative and not restrictive. For example, the above-described examples (or one or more versions thereof) may be used in combination with each other. For example, other embodiments may be used by those of ordinary skill in the art upon reading the above description. In addition, in the foregoing detailed description, various features may be grouped together to streamline the disclosure. This should not be interpreted as an intention that a disclosed feature not claimed is essential to any claim. Rather, inventive subject matter may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that these embodiments may be combined with each other in various combinations or permutations. The scope of the invention should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

Claims (8)

1. A disinfection robot comprising an electronic control unit, a first communication unit and an image acquisition device, wherein the electronic control unit is configured to:
if a living body is present around the disinfection robot, determine whether the living body is an adult and, if so, communicate with the living body's nameplate through the first communication unit to acquire identification information of a user; and
determine, based on the identification information, whether the user has gesture instruction authority and, if so, acquire a gesture image of the user with the image acquisition device and analyze it to obtain a gesture; and execute the operation corresponding to the gesture, wherein the operation comprises at least one of stopping, turning left, turning right, reversing, going upstairs and going downstairs.
2. The disinfection robot according to claim 1, wherein the image acquisition device is further configured to: detect whether an object is present around the disinfection robot and send a first indication signal indicating whether an object is present;
the disinfection robot further comprising:
a motion base configured with wheels;
an ultraviolet emitter disposed on the motion base and configured to emit ultraviolet rays to disinfect a surrounding area;
an infrared sensor configured to: receive the first indication signal, detect whether the detected object is a living body, and send a second indication signal indicating that a living body is present; and
an alarm configured to: receive the second indication signal and issue a reminder via at least one of voice and light, or turn off the ultraviolet emitter.
3. The disinfection robot according to claim 1, wherein identification information is stored in the nameplate, the identification information including authority information indicating whether the user has the gesture instruction authority and body type information of the user; and the electronic control unit is further configured to analyze the body type in the image of the user captured by the image acquisition device to determine whether it is consistent with the body type information pre-stored in the identification information, so as to prevent the nameplate from being used fraudulently.
4. The disinfection robot according to claim 2, wherein the image acquisition device includes an area array lidar with which the distance between the object and the disinfection robot is measured; and an image of the user is acquired for analysis only when the distance is within a predetermined range.
5. The disinfection robot according to claim 2, further comprising a navigation unit that locally stores a map of a hospital; wherein the electronic control unit presets a disinfection area for each disinfection robot and communicates with a background server to update each disinfection robot's travel route in real time; and when the disinfection robot is about to enter a disinfection area, it determines whether the disinfection area has already been covered by the travel routes of other disinfection robots and, if so, suspends disinfection and exits the disinfection area by the shortest route.
6. The disinfection robot according to claim 5, wherein the electronic control unit is further configured to, when at least one undisinfected area exists, determine a shortest travel route covering the at least one undisinfected area and send the shortest travel route to the background server.
7. The disinfection robot according to claim 5, wherein when the disinfection robot has entered one of the disinfection areas, the segment of the travel route that has already been disinfected is identified and the length of the segment is determined; if the length is greater than a threshold, the disinfection robot turns off the ultraviolet emitter within the segment, and if the length is not greater than the threshold, the ultraviolet emitter is kept on.
8. The disinfection robot according to claim 2, wherein when the disinfection robot switches floors, the navigation unit determines a nearest elevator, and after entering the elevator the disinfection robot issues a prompt via the alarm to keep a safe distance and determines via the image acquisition device that a target floor has been reached.
CN202010454119.1A 2020-05-26 2020-05-26 Disinfection robot Active CN113171472B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010454119.1A CN113171472B (en) 2020-05-26 2020-05-26 Disinfection robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010454119.1A CN113171472B (en) 2020-05-26 2020-05-26 Disinfection robot

Publications (2)

Publication Number   Publication Date
CN113171472A (en)    2021-07-27
CN113171472B (en)    2023-05-02

Family

ID=76921411

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010454119.1A Active CN113171472B (en) 2020-05-26 2020-05-26 Disinfection robot

Country Status (1)

Country Link
CN (1) CN113171472B (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110001813A1 (en) * 2009-07-03 2011-01-06 Electronics And Telecommunications Research Institute Gesture recognition apparatus, robot system including the same and gesture recognition method using the same
US20140152557A1 (en) * 2011-09-15 2014-06-05 Omron Corporation Gesture recognition device, electronic apparatus, gesture recognition device control method, control program, and recording medium
US20140005486A1 (en) * 2012-06-27 2014-01-02 CamPlex LLC Surgical visualization system with camera tracking
CN105334851A (en) * 2014-08-12 2016-02-17 深圳市银星智能科技股份有限公司 Mobile device capable of sensing gesture
JP2016106294A (en) * 2015-12-28 2016-06-16 墫野 和夫 Fully automatic robot household electric system appliance
CN106622728A (en) * 2017-02-28 2017-05-10 北京兆维电子(集团)有限责任公司 Fog gun type belt anti-epidemic robot
CN107765855A (en) * 2017-10-25 2018-03-06 电子科技大学 A kind of method and system based on gesture identification control machine people motion
CN107894836A (en) * 2017-11-22 2018-04-10 河南大学 Remote sensing image processing and the man-machine interaction method of displaying based on gesture and speech recognition
CN108776473A (en) * 2018-05-23 2018-11-09 上海圭目机器人有限公司 A kind of working method of intelligent disinfecting robot
CN108958490A (en) * 2018-07-24 2018-12-07 Oppo(重庆)智能科技有限公司 Electronic device and its gesture identification method, computer readable storage medium

Also Published As

Publication number Publication date
CN113171472B (en) 2023-05-02

Similar Documents

Publication Publication Date Title
US11684526B2 (en) Patient support apparatuses with navigation and guidance systems
CN105730935B (en) A kind of system and method for Biohazard Waste Internet of Things management
EP3016815B1 (en) Operator drowsiness detection in surface mines
CN111459167B (en) Spraying disinfection method and robot
EP1728601A1 (en) An industrial robot system with a teaching portable unit and a detecting unit for detecting when the TPU leaves the robot cell
KR101941527B1 (en) Smart Livestock Vehicle Prevention System for Livestock Disease Prevention
JP2005066745A (en) Guide robot system
CN212341735U (en) Disinfection robot
CN113257415A (en) Health data collection device and system
CN113171472A (en) Disinfection robot
KR101647831B1 (en) Hand sanitizer monitoring system and monitoring method
JP2006289565A (en) Robot operation device and robot
CN112562290A (en) Medical care sanitary hand disinfection monitoring system based on intelligent wearable equipment
CN114586077A (en) Monitoring system and program
US10021657B2 (en) Radio frequency identification modes in patient monitoring
JP2004251816A (en) Apparatus and system for managing position of moving body
KR102487350B1 (en) Prevention and sterilization device using smart robot
US20210089043A1 (en) System for cooperative movement control and/or movement supervision of mobile medical components
Jeyaseelan WR et al. Efficient Intelligent Smart Ambulance Transportation System using Internet of Things
CN212520409U (en) Guide dog auxiliary device
CN113311839A (en) Intelligent robot control method and system for public area disinfection
KR101718064B1 (en) Passenger Checking system and method of mass transit
CN111554389A (en) Hospital service management system and hospital service robot control method
WO2012115558A1 (en) Apparatus and method for tracking a stabled animal
CN114200938B (en) Voice reminding method and device for leading surrounding obstacle of robot and robot

Legal Events

Code  Description
PB01  Publication
SE01  Entry into force of request for substantive examination
GR01  Patent grant