CN112256038A - Intelligent space service method and system - Google Patents

Intelligent space service method and system

Info

Publication number
CN112256038A
CN112256038A
Authority
CN
China
Prior art keywords
service robot
information
environment
real-time monitoring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011211200.3A
Other languages
Chinese (zh)
Inventor
沈岗
陈养彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yinghe Shenzhen Robot and Automation Technology Co Ltd
Original Assignee
Yinghe Shenzhen Robot and Automation Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yinghe Shenzhen Robot and Automation Technology Co Ltd filed Critical Yinghe Shenzhen Robot and Automation Technology Co Ltd
Priority to CN202011211200.3A priority Critical patent/CN112256038A/en
Publication of CN112256038A publication Critical patent/CN112256038A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0223Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0242Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0255Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • G05D1/0257Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Manipulator (AREA)

Abstract

The invention provides an intelligent space service method and system, applied between a space sensing device and a service robot, comprising the following steps: the service robot collects environment information and sends it to the space sensing device; the space sensing device collects real-time monitoring information of the environment where the service robot is located, identifies, locates and tracks target objects in the real-time monitoring information according to the real-time monitoring information and the environment information, and generates a building holographic map that includes the target objects; and the space sensing device sends the building holographic map to the service robot, so that the service robot can actively perceive the environment and execute work tasks according to the building holographic map. The intelligent space service method and system can provide the service robot with capabilities such as positioning assistance, obstacle-avoidance assistance and safety assistance, so that the service robot achieves active environment perception and accurate service execution.

Description

Intelligent space service method and system
Technical Field
The invention relates to the technical field of robots, in particular to an intelligent space service method and system.
Background
As service robots become increasingly intelligent, their capacity for handling tasks has greatly increased. Service robots now play a role in many aspects of daily life, such as disinfection, epidemic prevention, cleaning, security patrol, guest reception and logistics distribution.
When a service robot provides a service, the service environment (place) typically contains many types of articles, distributed irregularly, with dynamically changing obstacles. Existing service robots perceive the environment with their own onboard sensors before executing a service, which makes environment perception difficult and fails to meet the accuracy and robustness requirements of service execution.
Therefore, an intelligent space service method and system are needed to solve the above problems.
Disclosure of Invention
The technical problem addressed by the invention is solved by providing an intelligent space service method and system, which can provide the service robot with capabilities such as positioning assistance, obstacle-avoidance assistance and safety assistance, so that the service robot achieves active environment perception and accurate service execution.
The technical problem to be solved by the invention is addressed by the following technical scheme:
An intelligent space service method is applied between a space sensing device and a service robot, and comprises the following steps: the service robot collects environment information and sends it to the space sensing device; the space sensing device collects real-time monitoring information of the environment where the service robot is located, identifies, locates and tracks target objects in the real-time monitoring information according to the real-time monitoring information and the environment information, and generates a building holographic map that includes the target objects; and the space sensing device sends the building holographic map to the service robot, so that the service robot can actively perceive the environment and execute work tasks according to the building holographic map.
In a preferred embodiment of the present invention, the step in which the service robot collects the environment information and sends it to the space sensing device includes: the service robot establishes a surrounding-environment map and confirms its own position in that map; and the service robot sends environment information to the space sensing device, the environment information comprising the environment map and the position of the service robot in the surrounding-environment map.
In a preferred embodiment of the present invention, the step in which the space sensing device collects real-time monitoring information of the environment where the service robot is located, identifies, locates and tracks target objects in the real-time monitoring information according to the real-time monitoring information and the environment information, and generates a building holographic map including the target objects includes: the space sensing device collects real-time monitoring information through at least one camera device; acquires service robot information, including the service robot position, according to the real-time monitoring information; acquires the position of the camera device according to the environment information and the service robot information; and acquires the spatial position of the service robot according to the position of the camera device.
In a preferred embodiment of the present invention, this step further includes: generating a corresponding building holographic map according to the environment information, the service robot information and the spatial position of the service robot, wherein the building holographic map comprises the building environment as well as the global spatial distribution information and dynamic change information of each target object.
In a preferred embodiment of the present invention, this step further includes: tracking and locating the service robot and other target objects using a particle-filter target-tracking algorithm.
In a preferred embodiment of the present invention, this step is followed by: acquiring real-time position information of the service robot according to the real-time monitoring information, and updating the real-time position information into the building holographic map.
In a preferred embodiment of the present invention, the step in which the space sensing device sends the building holographic map to the service robot, so that the service robot actively perceives the environment and executes work tasks according to the building holographic map, includes: the service robot acquires position information of a target object; and the service robot plans a travel path according to the position information of the target object.
In a preferred embodiment of the present invention, this step further includes: acquiring position information of a moving target object; and when an abnormal speed of the service robot or an impending collision with a target object is detected, reminding the service robot to decelerate or take avoiding action.
A smart space service system comprises: a space sensing device and a service robot. The service robot collects environment information and sends it to the space sensing device; the space sensing device collects real-time monitoring information of the environment where the service robot is located, identifies, locates and tracks target objects according to the real-time monitoring information and the environment information, and generates a building holographic map including the target objects; and the space sensing device sends the building holographic map to the service robot, so that the service robot can actively perceive the environment and execute work tasks according to the building holographic map.
In a preferred embodiment of the present invention, the smart space service system further includes an elevator signal sensor and a positioning sensor; the elevator signal sensor sends elevator running-state information to the space sensing device, and the positioning sensor acquires position information to assist in confirming the position of a target object.
The technical effects achieved by the above technical scheme are as follows: the service robot and the space sensing device share data, so that the service robot can actively perceive the environment and work safely in areas traditionally considered challenging, such as long corridors and glass-walled rooms; and through the positioning, identification, tracking and other services provided by the space sensing device, the states of the service robot and of various fixed and moving targets in the current environment are identified and prejudged, for example overspeed or an impending collision with another target, so that the service robot can decelerate or take avoiding action in advance, improving its safety.
The foregoing is only an overview of the technical solution of the invention. To make the technical means of the invention clearer and implementable according to the description, and to make the above and other objects, features and advantages of the invention more comprehensible, preferred embodiments are described in detail below with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention and the technical solutions in the prior art, the drawings used in their description are briefly introduced below; those skilled in the art can obtain other drawings from these drawings without inventive effort.
Fig. 1 is a flowchart illustrating a smart space service method according to a first embodiment of the present invention.
Fig. 2 is a flowchart illustrating a smart space service method according to a second embodiment of the present invention.
Fig. 3 is a schematic structural diagram of an intelligent space service system according to a third embodiment of the present invention.
Fig. 4 is a schematic structural diagram of an intelligent space service system according to a fourth embodiment of the present invention.
Detailed Description
To further illustrate the technical measures and effects taken by the present invention to achieve the intended objects, embodiments of the present invention will be described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below are only a part of the embodiments of the present invention, and not all of them. All other embodiments that can be obtained by a person skilled in the art based on the embodiments of the present invention without any inventive step belong to the scope of the embodiments of the present invention. While the present invention has been described in connection with the preferred embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, but is intended to cover various modifications, equivalent arrangements, and specific embodiments thereof.
When current service robots provide services, the service environment (place) contains many types of articles, distributed irregularly, with dynamically changing obstacles. Existing service robots perceive the environment with their own onboard sensors before executing a service, which makes environment perception difficult and fails to meet the accuracy and robustness requirements of service execution.
Fig. 1 is a flowchart illustrating a smart space service method according to a first embodiment of the present invention. The intelligent space service method described in this embodiment is applied between a space sensing device and a service robot, please refer to fig. 1.
Step S11: the service robot collects environment information and sends it to the space sensing device.
In one embodiment, step S11, in which the service robot collects environment information and sends it to the space sensing device, includes: the service robot establishes a surrounding-environment map and confirms its own position in that map; and the service robot sends environment information to the space sensing device, the environment information comprising the environment map and the position of the service robot in the surrounding-environment map.
Specifically, the service robot scans the building environment with its own onboard sensors (including but not limited to lidar, ultrasonic, infrared-ranging and vision sensors), builds an environment map, and calculates its own position in that map. Environment information containing the map and the position is then sent to the space sensing device.
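The mapping and localization step above can be pictured with a minimal Python sketch. This is an illustrative assumption, not the patent's implementation: the dictionary-based occupancy grid, the function names and the 0.05 m cell resolution are all invented for demonstration.

```python
import math

def update_occupancy_grid(grid, robot_pose, scan, resolution=0.05):
    """Mark lidar returns as occupied cells in a 2-D occupancy grid.

    grid: dict mapping (ix, iy) cell indices -> hit count
    robot_pose: (x, y, heading) of the robot in map coordinates
    scan: list of (bearing, range) pairs from an onboard range sensor
    """
    x, y, heading = robot_pose
    for bearing, rng in scan:
        # Project each range return into map coordinates.
        ox = x + rng * math.cos(heading + bearing)
        oy = y + rng * math.sin(heading + bearing)
        cell = (round(ox / resolution), round(oy / resolution))
        grid[cell] = grid.get(cell, 0) + 1
    return grid

def make_environment_message(grid, robot_pose):
    """Package the map and the robot's pose for the space sensing device."""
    return {"map": grid, "robot_pose": robot_pose}
```

A real system would fuse several sensor types and run full SLAM; the sketch only shows the shape of the data the robot shares with the space sensing device.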
Step S12: the space sensing device collects real-time monitoring information of the environment where the service robot is located, identifies, locates and tracks target objects in the real-time monitoring information according to the real-time monitoring information and the environment information, and generates a building holographic map including the target objects.
Specifically, the space sensing device can construct a building holographic map for the service robot from the environment information shared by the robot, the position information of the robot, and the spatial position information of the corresponding camera devices.
In one embodiment, step S12 includes: the space sensing device collects real-time monitoring information through at least one camera device; acquires service robot information, including the service robot position, from the real-time monitoring information; acquires the position of the camera device from the environment information and the service robot information; and acquires the spatial position of the service robot from the position of the camera device.
In one embodiment, step S12 further includes: generating a corresponding building holographic map according to the environment information, the service robot information and the spatial position of the service robot, wherein the building holographic map comprises the building environment as well as the global spatial distribution information and dynamic change information of each target object.
Specifically, the building holographic map records the global spatial distribution of the building environment and of the various targets; it also records dynamic change information of targets in the environment, such as their positions and action postures.
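One way to picture the building holographic map described above, a static global layout plus per-target dynamic state, is the following hypothetical Python sketch; the class name, fields and API are invented for illustration and are not taken from the patent.

```python
import time

class HolographicMap:
    """Illustrative model of a building holographic map: a fixed building
    layout plus the latest dynamic state of every tracked target."""

    def __init__(self, static_layout):
        self.static_layout = static_layout  # e.g. walls, doors, fixed furniture
        self.targets = {}                   # target_id -> latest dynamic state

    def update_target(self, target_id, position, posture):
        """Record a new observation of a target (robot, person, door...)."""
        self.targets[target_id] = {
            "position": position,           # (x, y) in map coordinates
            "posture": posture,             # e.g. "moving", "standing"
            "stamp": time.time(),           # when the observation was made
        }

    def snapshot(self):
        """What the space sensing device would push to the service robot."""
        return {"layout": self.static_layout, "targets": dict(self.targets)}
```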
In one embodiment, step S12 further includes: tracking and locating the service robot and other target objects using a particle-filter target-tracking algorithm.
In one embodiment, step S12 is followed by: acquiring real-time position information of the service robot from the real-time monitoring information, and updating that information into the building holographic map.
Specifically, tracking and locating of the service robot and related targets against a complex dynamic background is achieved by a particle-filter tracking algorithm running on the intelligent-space camera-group nodes; the spatial position of a changed target is calculated by combining the observations of multiple cameras, and the corresponding information in the holographic map is updated according to the tracking and locating results.
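A minimal particle-filter cycle of the kind referenced above (predict, weight against a camera measurement, resample) could look like the following Python sketch. The Gaussian motion and measurement noise models and all parameter values are illustrative assumptions, not the patent's algorithm.

```python
import math
import random

def particle_filter_step(particles, control, measurement, noise=0.1):
    """One predict/weight/resample cycle of a basic particle-filter tracker.

    particles: list of (x, y) position hypotheses for the target
    control: (dx, dy) assumed motion of the target since the last frame
    measurement: (x, y) position observed by the camera network
    """
    # 1. Predict: move every particle by the control plus process noise.
    moved = [(x + control[0] + random.gauss(0, noise),
              y + control[1] + random.gauss(0, noise)) for x, y in particles]
    # 2. Weight: particles close to the camera measurement score higher.
    weights = [math.exp(-((x - measurement[0]) ** 2 +
                          (y - measurement[1]) ** 2) / (2 * noise ** 2))
               for x, y in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # 3. Resample: draw a new particle set proportional to the weights.
    return random.choices(moved, weights=weights, k=len(moved))

def estimate(particles):
    """Point estimate of the target position: the particle mean."""
    n = len(particles)
    return (sum(p[0] for p in particles) / n,
            sum(p[1] for p in particles) / n)
```

In a multi-camera setup the weighting step would combine likelihoods from each camera's observation rather than a single measurement.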
Step S13: the space sensing device sends the building holographic map to the service robot, so that the service robot can actively perceive the environment and execute work tasks according to the building holographic map.
In one embodiment, the current service robot and its location are identified using the obtained real-time environment-monitoring video data and vision-based object-recognition techniques. This information is updated into the building holographic map to assist the service robot with positioning, allowing it to work safely in areas traditionally considered challenging, such as long corridors and glass-walled rooms.
In one embodiment, step S13 includes: the service robot acquires position information of a target object; and the service robot plans a travel path according to the position information of the target object.
Specifically, various fixed and moving targets in the current environment, in particular moving targets such as moving persons, seats and closed/opened doors, are identified using the real-time environment-monitoring video data obtained by at least one camera device, the data obtained by the service robot's own onboard sensors, vision-based object-recognition techniques and sensor feature-recognition techniques. With this dynamic information, the service robot can avoid obstacles in advance and prevent collisions.
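Planning a travel path around the known target positions, as described above, is commonly done with grid search. The patent does not name a planner, so the A* sketch below, with a 4-connected grid and unit step costs, is a generic stand-in.

```python
from heapq import heappush, heappop

def plan_path(start, goal, obstacles):
    """A* on a 4-connected grid; obstacles is a set of blocked (x, y) cells."""
    def h(c):  # Manhattan-distance heuristic (admissible on a 4-grid)
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])

    frontier = [(h(start), start)]
    came_from = {start: None}
    cost = {start: 0}
    while frontier:
        _, cur = heappop(frontier)
        if cur == goal:
            path = []
            while cur is not None:          # walk back to the start
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dx, cur[1] + dy)
            if nxt in obstacles:
                continue                    # skip cells occupied by targets
            new_cost = cost[cur] + 1
            if nxt not in cost or new_cost < cost[nxt]:
                cost[nxt] = new_cost
                came_from[nxt] = cur
                heappush(frontier, (new_cost + h(nxt), nxt))
    return None                             # no route around the obstacles
```

The obstacle set would be populated from the target positions in the holographic map, inflated by the robot's footprint.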
In one embodiment, step S13 further includes: acquiring position information of a moving target object; and when an abnormal speed of the service robot or an impending collision with a target object is detected, reminding the service robot to decelerate or take avoiding action.
Specifically, using the real-time environment-monitoring video data obtained by at least one camera device and the data obtained by the service robot's own onboard sensors, vision-based target recognition and sensor feature recognition identify the states of the service robot and of various fixed and moving targets in the current environment and make prejudgments, such as overspeed or an impending collision with another target, so that the service robot can decelerate or take avoiding action in advance, improving its safety.
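The overspeed and collision prejudgment described above can be illustrated with a simple kinematic check; the thresholds and the constant-velocity extrapolation in the sketch below are invented for illustration and are not values from the patent.

```python
def safety_advice(robot_pos, robot_vel, target_pos, target_vel,
                  speed_limit=1.5, horizon=2.0, clearance=0.5):
    """Return 'decelerate', 'avoid', or 'ok' for one robot/target pair.

    Positions and velocities are 2-D tuples in map coordinates.
    speed_limit (m/s), horizon (s) and clearance (m) are illustrative.
    """
    speed = (robot_vel[0] ** 2 + robot_vel[1] ** 2) ** 0.5
    if speed > speed_limit:                 # overspeed prejudgment
        return "decelerate"
    # Extrapolate both tracks at constant velocity over the look-ahead
    # horizon and compare the predicted separation with the clearance.
    steps = 20
    for i in range(steps + 1):
        t = horizon * i / steps
        rx = robot_pos[0] + robot_vel[0] * t
        ry = robot_pos[1] + robot_vel[1] * t
        tx = target_pos[0] + target_vel[0] * t
        ty = target_pos[1] + target_vel[1] * t
        if ((rx - tx) ** 2 + (ry - ty) ** 2) ** 0.5 < clearance:
            return "avoid"                  # predicted collision
    return "ok"
```

In practice the velocities would come from the tracker's state estimates, and the advice would be pushed to the robot along with the holographic-map update.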
In one embodiment, the real-time environment-monitoring video data obtained by at least one camera device and the data obtained by the service robot's own onboard sensors are used, together with vision-based target recognition and sensor feature recognition, to identify fault states of active targets such as the service robot in the current environment, for example shutdown, overturning or jamming, so that an early warning can be issued.
In the intelligent space service method described above, the service robot and the space sensing device share data, so that the service robot can actively perceive the environment and work safely in areas traditionally considered challenging, such as long corridors and glass-walled rooms. Through the positioning, identification, tracking and other services provided by the space sensing device, the states of the service robot and of various fixed and moving targets in the current environment are identified and prejudged, for example overspeed or an impending collision with another target, so that the service robot can decelerate or take avoiding action in advance, improving its safety. The space sensing device can also identify fault states of the service robot in the current environment, such as shutdown, overturning or jamming, so that an early warning can be issued.
Fig. 3 is a schematic structural diagram of an intelligent space service system according to a third embodiment of the present invention, and fig. 4 is a schematic structural diagram of an intelligent space service system according to a fourth embodiment of the present invention. Referring to fig. 3 and 4, the smart space services system at least includes at least one service robot 10 and a space sensing device 20. The service robot 10 includes a sensor 101, and the space sensing device 20 includes a server (not shown) and/or at least one camera 201. In one embodiment, the smart space services system also includes other auxiliary sensors 203.
Specifically, the smart space service system may comprise one camera device 201 and one service robot 10, one camera device 201 and a plurality of service robots 10, a plurality of camera devices 201 and one service robot 10, or a plurality of camera devices 201 and a plurality of service robots 10.
The space sensing device 20 identifies, positions, and tracks targets according to the real-time monitoring information of the building environment collected by the space sensing device itself and the environment information collected by the service robot 10, and generates a building holographic map; the space sensing device 20 then sends the building holographic map to the service robot 10, so that the service robot 10 can actively sense the environment and accurately execute work tasks according to the building holographic map.
Specifically, the smart space service system includes at least the following two components: a service robot 10 and a space sensing device 20. The services provided by the space sensing device 20 include, but are not limited to, building a holographic map of the building and providing vision-based target recognition, positioning, and tracking services for the service robot 10, namely positioning assistance, obstacle-avoidance assistance, safety assistance, and the like, so that active environment perception and accurate service execution by the service robot 10 are realized. The environment information collected by the service robot 10 is transmitted to the space sensing device 20. Based on the various kinds of position information it acquires, the service robot 10 provides people with services such as disinfection, epidemic prevention, cleaning, security patrol, guest reception, and logistics distribution.
In one embodiment, the space sensing device 20 comprises at least one camera device 201. The camera device 201 is used to acquire real-time monitoring information of the building environment; each target in the real-time monitoring information and the environment information is identified, positioned, and tracked, and a building holographic map is generated according to the real-time monitoring information and the environment information.
Specifically, real-time monitoring information of the service environment within the building is acquired by at least one camera device 201 (such as a video camera) installed in the building. The service robot 10 acquires environment information through its own onboard sensors 101 (including, but not limited to, lidar, ultrasonic, infrared ranging, and vision sensors), creates an environment map, and then calculates its own position in that map. The space sensing device 20 and the service robot 10 share data over a network; the shared data includes, but is not limited to, the real-time monitoring information, the environment map data, and the position data of the service robot 10.
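The data sharing described above can be sketched as a simple message fold on the server side. This is an illustrative sketch only; the message fields (`robot_id`, `pose`, `map_version`, `sensor_summary`) are assumptions for illustration, not names taken from the patent.

```python
from dataclasses import dataclass, asdict

@dataclass
class RobotUpdate:
    # One message a service robot shares with the space sensing device.
    # Field names are illustrative, not from the patent text.
    robot_id: str
    pose: tuple           # (x, y, heading) in the robot's environment-map frame
    map_version: int      # version of the environment map the robot built
    sensor_summary: dict  # e.g. {"lidar_ok": True, "ultrasonic_ok": True}

def merge_update(shared_state: dict, update: RobotUpdate) -> dict:
    """Fold one robot update into the space sensing device's shared state,
    keyed by robot id so multiple robots can report concurrently."""
    shared_state[update.robot_id] = asdict(update)
    return shared_state

# Example: one robot reports its pose and map version to the server.
state = {}
merge_update(state, RobotUpdate("robot-1", (2.0, 3.5, 90.0), 7, {"lidar_ok": True}))
```

In the other direction, the server would push holographic-map updates to every robot id present in `state`.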
In one embodiment, the server identifies the current service robot 10 and its spatial position within the building environment using the obtained real-time environment monitoring information and a vision-based target recognition technique. Specifically, the spatial position of the camera device 201 in the building environment is derived from the environment information shared by the service robot 10, such as the position of the service robot 10 in the environment map created from that environment information.
In one embodiment, a building holographic map includes a building environment, global spatial distribution information for each object in the building environment, and dynamic change information for each object in the building environment.
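The three kinds of content the building holographic map is said to contain (the static building environment, the global spatial distribution of each target, and each target's dynamic changes) could be held in a structure along the following lines; the class and field names are illustrative assumptions, not the patent's data model.

```python
import time

class HolographicMap:
    """Minimal sketch of a building holographic map: static environment
    geometry, current global position of each target, and a change log."""

    def __init__(self, environment):
        self.environment = environment  # static geometry: walls, doors, elevators...
        self.targets = {}               # target_id -> (x, y) global position
        self.changes = []               # (timestamp, target_id, new_position)

    def update_target(self, target_id, position, timestamp=None):
        """Record a target's new global position and log the change."""
        ts = timestamp if timestamp is not None else time.time()
        self.targets[target_id] = position
        self.changes.append((ts, target_id, position))

# Example: register a chair that a camera observed being moved.
holo = HolographicMap(environment={"walls": [((0, 0), (10, 0))]})
holo.update_target("chair-1", (1.0, 2.0), timestamp=0)
```

The change log is what would let a robot reason about dynamic targets rather than only their latest positions.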
In one embodiment, the server identifies each target from real-time monitoring information of the building environment, determines the relative position and attitude of each target, and tracks moving targets in real-time.
Further, the server tracks and positions the service robot 10 and other relevant targets in the real-time monitoring information according to a particle filter tracking algorithm, acquires the spatial position of any changed target according to the real-time monitoring information collected by the camera device 201, and updates the corresponding information in the building holographic map according to the tracking and positioning results.
Specifically, tracking and positioning of the service robot and related targets against a complex dynamic background are realized through a particle filter target tracking algorithm run at each intelligent space camera device 201 node; the spatial position of a changed target is then calculated by combining the plurality of camera devices 201 and their observation information, and the corresponding target information in the building holographic map is updated according to the tracking and positioning results. The identified targets include, without limitation: infrastructure such as walls, doors, and elevators; office facilities such as office furniture, computers, printers, and greenery; and moving targets such as moving persons and chairs.
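Particle filter tracking is a standard technique, and a minimal textbook-style predict/update/resample cycle for tracking one target's (x, y) position from camera measurements might look like the sketch below. It is a generic illustration of the technique the patent names, not the patent's specific algorithm; the noise parameters are assumed values.

```python
import math
import random

def particle_filter_step(particles, control, measurement,
                         motion_noise=0.1, meas_noise=0.5):
    """One predict-update-resample cycle tracking a target's (x, y) position.

    particles:   list of (x, y) hypotheses about the target's position
    control:     assumed target motion since the last frame, (dx, dy)
    measurement: the camera's observed position of the target, (x, y)
    """
    # Predict: move every particle by the assumed motion, plus process noise.
    moved = [(x + control[0] + random.gauss(0, motion_noise),
              y + control[1] + random.gauss(0, motion_noise))
             for (x, y) in particles]
    # Update: weight each particle by the Gaussian likelihood of the measurement.
    weights = []
    for (x, y) in moved:
        d2 = (x - measurement[0]) ** 2 + (y - measurement[1]) ** 2
        weights.append(math.exp(-d2 / (2 * meas_noise ** 2)))
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw particles in proportion to their weights.
    return random.choices(moved, weights=weights, k=len(moved))

def estimate(particles):
    """Point estimate of the target position: the particle mean."""
    n = len(particles)
    return (sum(p[0] for p in particles) / n,
            sum(p[1] for p in particles) / n)
```

In the multi-camera setting described above, each camera node would run such a filter and the server would fuse the per-node estimates into one spatial position.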
In one embodiment, the space sensing device provides the service robot 10 with a positioning assistance capability: the server performs target recognition on the real-time monitoring information of the building environment acquired by the camera device 201, identifies the service robot 10, and acquires the position information of the service robot 10; the server then updates the position information of the service robot 10 into the building holographic map to assist the service robot 10 in positioning.
Specifically, the current service robot 10 and its position are identified using the obtained real-time environment monitoring video data and a vision-based target recognition technique, and this information is updated into the building holographic map to assist the service robot 10 in positioning. As a result, the robot can work safely in areas traditionally considered challenging, such as long corridors and glass-walled rooms.
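One simple way the camera-derived position could assist the robot's own estimate is a weighted blend, with more weight given to the camera in feature-poor areas such as long corridors, where the robot's onboard localization drifts. The function below is an assumed illustration of this idea, not the patent's actual fusion method.

```python
def fuse_position(robot_estimate, camera_estimate, camera_weight=0.5):
    """Convex combination of the robot's own pose estimate and the
    camera-derived position from the space sensing device.

    camera_weight: trust in the camera observation, 0.0..1.0; it would be
    raised where the robot's own localization is known to be unreliable.
    """
    w = camera_weight
    return tuple((1 - w) * r + w * c
                 for r, c in zip(robot_estimate, camera_estimate))

# Example: the robot believes it is at (0, 0); the camera sees it at (2, 2).
fused = fuse_position((0.0, 0.0), (2.0, 2.0), camera_weight=0.5)
```

A production system would more likely use a Kalman-style update with per-source covariances, but the blending intuition is the same.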
In one embodiment, the space sensing device provides the service robot 10 with an obstacle-avoidance assistance capability: the server analyzes the real-time monitoring information of the building environment and the environment information to identify the fixed targets and moving targets in the current environment.
Specifically, the various fixed and moving targets in the current environment, particularly moving targets such as a moving person, a chair, or a door being closed or opened, are recognized using the obtained real-time monitoring information of the building environment, the environment information collected by the sensors carried by the service robot 10 itself, vision-based target recognition, and sensor feature recognition. By acquiring this dynamic information, the service robot 10 can avoid obstacles in advance and prevent collisions.
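A deliberately simplified stand-in for this fixed/moving classification compares each target's position across two consecutive observations; the displacement threshold and the treatment of newly appeared targets are assumptions for illustration, not the recognition method the patent describes.

```python
def classify_targets(prev_positions, curr_positions, move_threshold=0.2):
    """Label each target 'moving' or 'fixed' from two consecutive
    observations of its (x, y) position.

    prev_positions, curr_positions: dicts of target_id -> (x, y)
    move_threshold: displacement (in metres, say) above which a target
    is treated as dynamic.
    """
    labels = {}
    for tid, (cx, cy) in curr_positions.items():
        if tid not in prev_positions:
            # A target that just appeared is conservatively treated as dynamic.
            labels[tid] = "moving"
            continue
        px, py = prev_positions[tid]
        displacement = ((cx - px) ** 2 + (cy - py) ** 2) ** 0.5
        labels[tid] = "moving" if displacement > move_threshold else "fixed"
    return labels
```

The robot's planner would then keep a larger safety margin around anything labelled "moving".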
In one embodiment, the space sensing device provides the service robot 10 with a safety assistance capability: the server and/or the service robot 10 makes a pre-judgment according to the states of the fixed and moving targets in the current environment, so that the service robot 10 decelerates and takes avoiding action in advance; when the server detects that the service robot 10 is in a fault state, it issues early-warning information through an early-warning module.
Specifically, using the obtained real-time monitoring information of the building environment, the environment information obtained by the sensors carried by the service robot 10, vision-based target recognition, and sensor feature recognition, the states of the various fixed and moving targets in the current environment, including the service robot 10 itself, are recognized and pre-judged; for example, detecting that the service robot 10 is speeding or about to collide with another target allows it to decelerate or take avoiding action in advance, improving its safety. The same information can also be used to identify fault states of moving targets in the current environment, such as a service robot 10 that has shut down, overturned, or become stuck, so that an early warning can be issued.
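The overspeed and collision pre-judgment can be sketched as a constant-velocity look-ahead check: extrapolate the robot's position a short horizon into the future and compare it against known target positions. The speed limit, horizon, and safe distance below are illustrative values, not figures from the patent.

```python
def safety_check(robot_pos, robot_vel, targets,
                 speed_limit=1.5, horizon=2.0, safe_distance=0.5):
    """Pre-judge overspeed and imminent collision for one service robot.

    robot_pos: (x, y) position; robot_vel: (vx, vy) in units/second
    targets:   dict of target_id -> (x, y) positions from the holographic map
    Returns a list of warning strings, empty when no action is needed.
    """
    warnings = []
    speed = (robot_vel[0] ** 2 + robot_vel[1] ** 2) ** 0.5
    if speed > speed_limit:
        warnings.append("overspeed")
    # Extrapolate the robot `horizon` seconds ahead at constant velocity.
    fx = robot_pos[0] + robot_vel[0] * horizon
    fy = robot_pos[1] + robot_vel[1] * horizon
    for tid, (tx, ty) in targets.items():
        if ((fx - tx) ** 2 + (fy - ty) ** 2) ** 0.5 < safe_distance:
            warnings.append(f"collision_risk:{tid}")
    return warnings
```

On any non-empty result the robot would decelerate or replan; a real system would also extrapolate the targets' own motion rather than treating them as stationary.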
In one embodiment, the smart space service system further comprises other auxiliary sensors 30, such as an elevator signal sensor and a positioning sensor; the elevator signal sensor is used to send the running state information of the elevator to the space sensing device.
Specifically, in addition to the camera device 201 and the server installed in the building, the intelligent space service system can also comprise other auxiliary sensors, such as an elevator signal sensor installed in the building, so that the running state of the elevator can be known, and positioning sensors such as UWB, Bluetooth, and Wi-Fi sensors. These sensors send the detected information to the server, and the server updates the relevant information in the building holographic map according to that information.
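Folding auxiliary-sensor reports (elevator states, UWB/Bluetooth/Wi-Fi tag fixes) into the server-side map state could look like the following; the event schema and keys are assumptions for illustration, not a format defined in the patent.

```python
def apply_sensor_event(map_state: dict, event: dict) -> dict:
    """Fold one auxiliary-sensor event into server-side map state.

    event examples (illustrative schema):
      {"sensor": "elevator",    "id": "E1",    "state": "moving_up"}
      {"sensor": "positioning", "id": "tag-7", "position": (3.0, 4.0)}
    """
    kind = event["sensor"]
    if kind == "elevator":
        # Elevator signal sensor: record the car's running state.
        map_state.setdefault("elevators", {})[event["id"]] = event["state"]
    elif kind == "positioning":
        # UWB / Bluetooth / Wi-Fi fix: record the tagged target's position.
        map_state.setdefault("tags", {})[event["id"]] = event["position"]
    return map_state

# Example: the server learns elevator E1 is moving up.
state = {}
apply_sensor_event(state, {"sensor": "elevator", "id": "E1", "state": "moving_up"})
```

A robot waiting for an elevator could then consult `map_state["elevators"]` instead of sensing the car door directly.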
The intelligent space service system provided by the embodiment adopts the service robot 10 with the mobile platform and the space sensing device 20 to share data, so that the service robot 10 can actively sense the environment, and the robot can safely work in the scenes which are considered to be challenging traditionally, such as long corridors, glass rooms on the ground and other areas; through the positioning, identifying and tracking services provided by the space sensing device 20, the states of various fixed and movable targets in the current environment, such as the service robot, are identified and prejudged, such as overspeed, collision with other targets and the like of the service robot, so that the service robot can be decelerated and avoided in advance, and the safety of the service robot is improved; fault states of active objects in the current environment, such as service robots, are identified, such as states of shutdown, overturning, blocking, and the like, so that early warning can be given.
It should be understood that, although the steps in the flowcharts in the figures are shown in an order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the execution order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages that are not necessarily completed at the same moment but may be executed at different times, and whose execution order is not necessarily sequential; they may be performed alternately with other steps or with the sub-steps or stages of other steps.
Through the above description of the embodiments, it will be clear to those skilled in the art that the embodiments of the present invention may be implemented by hardware, or by software plus a necessary general-purpose hardware platform. Based on this understanding, the technical solutions of the embodiments of the present invention may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (such as a CD-ROM, a USB flash drive, or a removable hard disk) and includes several instructions for enabling a computer device (such as a personal computer, a server, or a network device) to execute the methods described in the various implementation scenarios of the embodiments of the present invention.
The present invention is not limited to the details of the above embodiments, which are exemplary; the modules and processes in the drawings are not necessarily essential to implementing the embodiments of the present invention and should not be construed as limiting the present invention.

Claims (10)

1. An intelligent space service method, applied between a space sensing device and a service robot, characterized by comprising the following steps:
the service robot is used for collecting environmental information and sending the environmental information to the space sensing device;
the space sensing device collects real-time monitoring information of the environment where the service robot is located, identifies, positions and tracks a target object in the real-time monitoring information according to the real-time monitoring information and the environment information, and generates a building holographic map comprising the target object;
and the space perception device sends the building holographic map to the service robot, so that the service robot can actively perceive the environment and execute a work task according to the building holographic map.
2. The smart space service method as claimed in claim 1, wherein the step of the service robot for collecting the environment information and transmitting the environment information to the space sensing device comprises:
the service robot establishes a surrounding environment map and confirms the position of the service robot in the surrounding environment map;
and the service robot sends the environment information to the space perception device, wherein the environment information comprises the environment map and the position of the service robot in the surrounding environment map.
3. The smart space service method as claimed in claim 1, wherein the step of the space sensing device collecting real-time monitoring information of an environment where the service robot is located, and identifying, locating and tracking a target object in the real-time monitoring information according to the real-time monitoring information and the environment information, and generating a building holographic map including the target object comprises:
the space sensing device collects the real-time monitoring information through at least one camera device;
acquiring the service robot information according to the real-time monitoring information, wherein the service robot information comprises a service robot position;
acquiring the position of a camera device according to the environment information and the service robot information;
and acquiring the spatial position of the service robot according to the position of the camera device.
4. The smart space services method as claimed in claim 3, wherein the step of the space sensing device collecting real-time monitoring information of the environment where the service robot is located, and identifying, locating and tracking a target object in the real-time monitoring information according to the real-time monitoring information and the environment information, and generating a building holographic map including the target object further comprises:
and generating the corresponding building holographic map according to the environment information, the service robot information and the spatial position of the service robot, wherein the building holographic map comprises the building environment and the global spatial distribution information and the dynamic change information of each target object.
5. The smart space services method as claimed in claim 1, wherein the step of the space sensing device collecting real-time monitoring information of the environment where the service robot is located, and identifying, locating and tracking a target object in the real-time monitoring information according to the real-time monitoring information and the environment information, and generating a building holographic map including the target object further comprises:
and tracking and positioning the service robot and other target objects according to a particle filter target tracking algorithm.
6. The smart space services method as claimed in claim 1, wherein the step of the space sensing device collecting real-time monitoring information of the environment where the service robot is located, and identifying, locating and tracking the target object in the real-time monitoring information according to the real-time monitoring information and the environment information, and generating the building holographic map including the target object, is followed by the steps of:
and acquiring real-time position information of the service robot according to the real-time monitoring information, and updating the real-time position information into the building holographic map.
7. The smart space services method as claimed in claim 1, wherein the step of the space sensing device sending the building holographic map to the service robot to make the service robot perform active environment sensing and work task according to the building holographic map comprises:
the service robot acquires the position information of the target object;
and the service robot plans a driving route according to the position information of the target object.
8. The smart space services method as claimed in claim 1, wherein the step of the space sensing device sending the building holographic map to the service robot to make the service robot perform active environment sensing and work task according to the building holographic map comprises:
acquiring position information of a moving target object;
and when the service robot is detected to have abnormal speed or to collide with the target object, reminding the service robot to decelerate or avoid.
9. A smart space services system, the smart space services system comprising: a space sensing device and a service robot;
the service robot is used for collecting environmental information and sending the environmental information to the space sensing device;
the space sensing device collects real-time monitoring information of the environment where the service robot is located, identifies, positions and tracks target objects in the real-time monitoring information and the environment information, and generates a building holographic map comprising the target objects;
and the space perception device sends the building holographic map to the service robot, so that the service robot can actively perceive the environment and execute a work task according to the building holographic map.
10. The smart space service system of claim 9, wherein the smart space service system further comprises: an elevator signal sensor and a positioning sensor;
the elevator signal sensor is used for sending the running state information of the elevator to the space sensing device;
and the positioning sensor is used for acquiring position information to assist in confirming the position of the target object.
CN202011211200.3A 2020-11-03 2020-11-03 Intelligent space service method and system Pending CN112256038A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011211200.3A CN112256038A (en) 2020-11-03 2020-11-03 Intelligent space service method and system

Publications (1)

Publication Number Publication Date
CN112256038A true CN112256038A (en) 2021-01-22

Family

ID=74268713


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012124933A2 (en) * 2011-03-11 2012-09-20 고려대학교 산학협력단 Device and method for recognizing the location of a robot
CN102914303A (en) * 2012-10-11 2013-02-06 江苏科技大学 Navigation information acquisition method and intelligent space system with multiple mobile robots
CN103454919A (en) * 2013-08-19 2013-12-18 江苏科技大学 Motion control system and method of mobile robot in intelligent space
CN105911518A (en) * 2016-03-31 2016-08-31 山东大学 Robot positioning method
CN205983220U (en) * 2016-09-09 2017-02-22 智能侠(北京)科技有限公司 Remove location and navigation head equipped based on spatial vision sensor
CN106647766A (en) * 2017-01-13 2017-05-10 广东工业大学 Robot cruise method and system based on complex environment UWB-vision interaction
CN111216127A (en) * 2019-12-31 2020-06-02 深圳优地科技有限公司 Robot control method, device, server and medium


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Wu Peiliang et al.: "Research on a Holographic Map Construction Method for Home Service Robots", Application Research of Computers *
Wu Peiliang et al.: "Holographic Mapping Method for Intelligent-Space Service Robots in Dynamic Home Environments", Science & Technology Review *
Wu Peiliang: "Research on Holographic Mapping of Service Robots in Home Intelligent Space and Related Problems", China Doctoral Dissertations Full-text Database, Information Science and Technology *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210122