WO2018076191A1 - Intelligent patrol device, cloud control device, patrol method, control method, robot, controller and non-transitory computer readable storage medium - Google Patents


Info

Publication number
WO2018076191A1
WO2018076191A1 (PCT application PCT/CN2016/103327; CN application CN2016103327W)
Authority
WO
WIPO (PCT)
Prior art keywords
suspicious target
patrol
processing
latitude
suspicious
Prior art date
Application number
PCT/CN2016/103327
Other languages
English (en)
French (fr)
Inventor
骆磊
Original Assignee
深圳前海达闼云端智能科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳前海达闼云端智能科技有限公司 filed Critical 深圳前海达闼云端智能科技有限公司
Priority to CN201680002940.1A priority Critical patent/CN107148777B/zh
Priority to PCT/CN2016/103327 priority patent/WO2018076191A1/zh
Publication of WO2018076191A1 publication Critical patent/WO2018076191A1/zh

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/183: Closed-circuit television [CCTV] systems for receiving images from a single remote source
    • H04N 7/185: Closed-circuit television [CCTV] systems for receiving images from a single remote source, from a mobile camera, e.g. for remote control
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00: Administration; Management
    • G06Q 10/06: Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q 10/063: Operations research, analysis or management
    • G06Q 10/0631: Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q 10/06311: Scheduling, planning or task assignment for a person or group
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10: Services
    • G06Q 50/26: Government or public services
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Definitions

  • the invention relates to the field of intelligent patrol, in particular to an intelligent patrol device, a cloud control device, a patrol method, a control method, a robot, a controller and a non-transitory computer readable storage medium.
  • patrol search plays an increasingly important role in today's society.
  • patrol search has been using a combination of traditional manual patrol mechanisms and fixed-site camera video surveillance.
  • the method of manual patrol is limited by the number and physical strength of the personnel, so the route, time, frequency, and density of patrols cannot fully meet actual needs.
  • moreover, because manual patrol depends on the diligence of the staff, patrols at night and in poor weather conditions are often not completed as required.
  • in addition, manual patrol cannot return images and other on-scene information to the monitoring center in real time, which hinders command decisions in emergencies.
  • the video surveillance mode of fixed-location cameras requires installing a large number of cameras and modifying circuits, which not only increases cost, but also leaves the many cameras operating independently of one another, so the patrol range and patrol effect are significantly limited.
  • the existing patrol search work basically relies on the combination of manual patrol mechanism and fixed-point camera video surveillance.
  • the patrol equipment cannot be uniformly scheduled and coordinated, and the patrol range and patrol effect are significantly limited.
  • the embodiments of the invention provide a smart patrol device, a cloud control device, a patrol method, a control method, a robot, a controller and a non-transitory computer readable storage medium, to solve the problem that devices participating in the existing patrol search process cannot be uniformly scheduled and coordinated, and that the patrol range and patrol effect are significantly limited.
  • an embodiment of the present invention provides a smart patrol device, including:
  • a patrol screen acquisition module configured to acquire a patrol screen through a camera;
  • a target identification module configured to perform target recognition on the patrol screen according to a feature of the suspicious target, to determine a suspicious target that meets the feature, and determine a location of the suspicious target;
  • a transceiver module configured to report the suspicious target image and location to the cloud control device, and receive a tracking instruction sent by the cloud control device;
  • the tracking module is configured to perform tracking of the corresponding suspicious target according to the tracking instruction of the cloud control device.
  • an embodiment of the present invention provides a cloud control apparatus, including:
  • the patrol screen analysis module is configured to analyze the suspicious target image and location reported by the intelligent patrol device, and determine the type and quantity of the intelligent patrol device and/or the processing personnel and/or the processing equipment required for the task according to the analysis result;
  • a task processing module configured to determine the smart patrol devices and/or processing personnel and/or processing equipment that will participate in this task, according to the real-time location of each smart patrol device and/or processing personnel and/or processing equipment under the control of the cloud control device, the type and quantity of smart patrol devices and/or processing personnel and/or processing equipment required for the current task, the location of the suspicious target, and the processing priority of the task;
  • the notification module is configured to send a suspicious target tracking instruction carrying the suspicious target picture and location to the smart patrol device and/or the processing personnel and/or the processing equipment participating in the current task processing.
  • an embodiment of the present invention provides a smart patrol method, including:
  • a patrol screen is acquired through a camera; target recognition is performed on the patrol screen according to the features of the suspicious target, to determine a suspicious target that meets the features and to determine its location; the suspicious target picture and location are reported to the cloud control device, and the corresponding suspicious target is tracked according to the tracking instruction sent by the cloud control device.
  • an embodiment of the present invention provides a cloud control method, including:
  • the suspicious target picture and location reported by the intelligent patrol device are analyzed; the smart patrol devices and/or processing personnel and/or processing equipment participating in the current task are determined; and a suspicious target tracking instruction carrying the suspicious target picture and location is transmitted to the smart patrol devices and/or processing personnel and/or processing equipment participating in the current task processing.
  • an embodiment of the present invention provides an intelligent robot, including: a camera, a processor module, a transceiver component, and a mobile driving device;
  • a processor module configured to acquire a patrol screen by using the camera, perform target recognition on the patrol screen according to the features of the suspicious target, determine a suspicious target that meets the features, determine the location of the suspicious target, report the suspicious target picture and location to the cloud controller through the transceiver component, and track the corresponding suspicious target through the camera and the mobile driving device according to the tracking instruction issued by the cloud controller.
  • an embodiment of the present invention provides a cloud controller, including: a processor module and a communication component;
  • a processor module configured to: control the communication component to receive the suspicious target picture and location reported by the intelligent robot; analyze the received picture and location and determine, according to the analysis result, the type and quantity of the smart patrol devices and/or processing personnel and/or processing equipment required for the task; determine the smart patrol devices and/or processing personnel and/or processing equipment participating in the task according to the real-time location of each smart patrol device and/or processing personnel and/or processing equipment under the control of the cloud controller, the required type and quantity, the location of the suspicious target, and the processing priority of the task; and send, through the communication component, a suspicious target tracking instruction carrying the suspicious target picture and location to the smart patrol devices and/or processing personnel and/or processing equipment participating in the current task processing.
  • embodiments of the present invention provide a non-transitory computer readable storage medium, the non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the steps of the above cloud control method.
  • the intelligent patrol device of the present invention automatically identifies the suspicious target in the monitoring picture and reports the suspicious target's location information and picture to the cloud control device, so that the cloud control device can uniformly schedule all smart patrol devices according to the suspicious target's location and picture and establish a suitable search or hunt group.
  • the invention can thus realize intelligent monitoring and tracking of suspicious targets automatically, without manual intervention, and overcomes the significant limitations of existing patrol equipment in patrol range.
  • FIG. 1 is a schematic structural diagram of a smart patrol device according to Embodiment 1 of the present invention.
  • FIG. 2 is a schematic diagram of determining a suspicious target location by a target recognition module in Embodiment 1 of the present invention.
  • FIG. 3 is a schematic structural diagram of a cloud control device according to Embodiment 2 of the present invention.
  • FIG. 4 is a flow chart showing a smart patrol method in Embodiment 3 of the present invention.
  • FIG. 5 is a flowchart of a cloud control method in Embodiment 4 of the present invention.
  • FIG. 6 is a schematic structural diagram of an intelligent robot according to Embodiment 5 of the present invention.
  • FIG. 7 is a schematic structural diagram of a cloud controller in Embodiment 6 of the present invention.
  • the embodiments of the present invention provide a smart patrol device, a cloud control device, a patrol method, a control method, a robot, a controller, and a non-transitory computer readable storage medium.
  • FIG. 1 is a schematic structural diagram of a smart patrol device according to Embodiment 1 of the present invention. As shown in the figure, the smart patrol device in the first embodiment of the present invention includes:
  • the patrol screen acquiring module 101 is configured to acquire a patrol screen by using a camera;
  • the target identification module 102 is configured to perform target recognition on the patrol screen according to the feature of the suspicious target to determine a suspicious target that meets the suspicious target feature, and determine the location of the suspicious target;
  • the transceiver module 103 is configured to report the location information of the suspicious target and the suspicious target image to the cloud control device, and receive the tracking instruction sent by the cloud control device.
  • the tracking module 104 is configured to perform tracking of the corresponding suspicious target according to the tracking instruction of the cloud control device.
  • the feature of the suspicious target may be sent by the cloud control device to the smart patrol device, or may be input by the user to the smart patrol device.
  • the target recognition module 102 may extract the image features of the suspicious target from a clear image of the tracked target that is sent by the cloud control device or input by the user; may directly use image feature data sent by the cloud control device or input by the user as the image features of the suspicious target; or may extract or learn the image features of the suspicious target from certain frames of a source video delivered by the cloud control device or specified by the user.
  • for example, the facial image features of a wanted person may be extracted from a photo sent by the cloud control device or provided by the user; grayscale feature data sent by the cloud control device or input by the user may be used directly as the features of the suspicious target; or an image of the suspicious target framed in a certain frame of the source video, selected by the cloud control device or input by the user, may be used.
  • the target recognition module 102 uses at least one of facial recognition, iris recognition, motion recognition, wireless signal recognition, and number identification to perform target recognition, according to the features of the suspicious target, on the monitoring picture collected in real time during patrol, so as to determine a suspicious target that meets the suspicious target features.
  • face recognition, iris recognition, motion recognition and number recognition are all image-based operations, and specific recognition methods can be performed with reference to existing recognition technologies.
  • Face recognition is mainly aimed at people, capturing pictures of people's faces, and can also identify the type of a robot; for example, in practical applications, facial photos of specific people such as wanted persons, or the contours of robots, can be pre-stored. When the corresponding face or robot is detected, the corresponding person or robot is determined to be the tracked target.
  • Iris recognition captures the iris of a person's eye to determine precisely who the person is; it is much more accurate than facial recognition.
  • Motion recognition determines the actions of the person or robot in front of the camera; for example, dangerous actions such as fighting or violent impact can be pre-stored, and when such an action is detected, its performer is determined to be a suspicious target.
  • Number identification captures the identification number of a robot to determine the robot's type, parameters, and so on, provided the robot bears a visible identification code; the corresponding code is then recognized by image recognition technology to determine the object to be tracked.
  • FIG. 2 is a diagram showing the determination of the suspicious target position by the target recognition module 102 in the first embodiment of the present invention.
  • the specific process of determining the location of the suspicious target based on the latitude and longitude and altitude information of the smart patrol device and the relative positional relationship between the smart patrol device and the suspicious target is as follows:
  • the plane of the XY axes is sea level (not the ground). The intelligent patrol device is currently at point A in space and, at some moment, finds a suspicious target at ground point B. The projection of the intelligent patrol device onto sea level is always taken as the origin O of the XY axes (the axes move with the device as it moves); the X axis always points due east on sea level, and the Y axis always points due north on sea level.
  • the target recognition module 102 obtains the straight-line distance d from the intelligent patrol device to the tracked target by infrared laser ranging or dual-camera ranging, and obtains the current accurate altitude h of the intelligent patrol device through the height sensor on the device, where d and h must be expressed in the same unit.
  • the angle θ between the optical axis of the camera and the vertical direction is obtained from the rotating mechanical structure of the camera on the smart patrol device.
  • the geomagnetic sensor gives the absolute orientation of the intelligent patrol device in real time, so that, combined with the camera's rotating mechanical structure, the absolute direction on sea level of OC, the projection of the camera's optical axis, can be known (C is the projection of the target point B onto sea level).
  • the positive angle between OC and the X axis is α.
  • the real-time latitude and longitude of the intelligent patrol device obtained from the satellite positioning system are (m, n); a perpendicular BD is drawn from the target point B to OA.
  • BC is the altitude of the suspicious target point B, i.e. BC = OD = h - d·cosθ.
  • the coordinates of point C are therefore (d·sinθ·cosα, d·sinθ·sinα);
  • the latitude and longitude of point O are the same as those of point A. To determine the change in latitude and longitude, assume that at point O the coefficient relating east-west distance change to longitude change is j, and the coefficient relating north-south distance change to latitude change is k (j and k differ for different latitudes and longitudes of O, but once the latitude and longitude of O are determined, j and k are fixed values).
  • The location information of point B, including its absolute latitude, longitude, and altitude, is then obtained as follows:
  • if the longitude of the intelligent patrol device is east longitude, the longitude of the suspicious target B increases toward the east, i.e. the longitude is m + j·d·sinθ·cosα, and decreases toward the west, i.e. the longitude is m - j·d·sinθ·cosα;
  • if the intelligent patrol device is at west longitude, the longitude of the suspicious target decreases toward the east, i.e. the longitude is m - j·d·sinθ·cosα, and increases toward the west, i.e. the longitude is m + j·d·sinθ·cosα;
  • if the latitude of the intelligent patrol device is north latitude (northern hemisphere), the latitude of the suspicious target B increases toward the north, i.e. the latitude is n + k·d·sinθ·sinα, and decreases toward the south, i.e. the latitude is n - k·d·sinθ·sinα; if the latitude of the intelligent patrol device is south latitude (southern hemisphere), the latitude of the suspicious target B decreases toward the north, i.e. the latitude is n - k·d·sinθ·sinα, and increases toward the south, i.e. the latitude is n + k·d·sinθ·sinα.
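  • The case analysis above can be collapsed by using signed coordinates (east longitude and north latitude positive). The following sketch computes the target's longitude, latitude, and altitude from the quantities d, h, θ, α and (m, n) defined above; the function names, and the approximate 111,320 metres-per-degree constants used for j and k, are illustrative assumptions rather than values from the application:

```python
import math

def deg_per_meter(lat_deg):
    """Approximate j (longitude) and k (latitude) coefficients at a given
    latitude: degrees per metre of east-west / north-south displacement.
    One degree of latitude is roughly 111,320 m everywhere; one degree of
    longitude shrinks with cos(latitude). Fixed once O's latitude is known,
    matching the j/k description in the text."""
    k = 1.0 / 111_320.0
    j = 1.0 / (111_320.0 * math.cos(math.radians(lat_deg)))
    return j, k

def locate_target(m, n, h, d, theta, alpha):
    """Estimate the suspicious target's (longitude, latitude, altitude).

    m, n  : device longitude/latitude in signed degrees (east/north positive)
    h     : device altitude above sea level, metres
    d     : laser- or dual-camera-ranged distance to the target, metres
    theta : angle between camera optical axis and the vertical, radians
    alpha : angle between OC and the X (east) axis, radians
    Signed coordinates absorb the east/west and north/south cases.
    """
    j, k = deg_per_meter(n)
    east = d * math.sin(theta) * math.cos(alpha)    # x-coordinate of C
    north = d * math.sin(theta) * math.sin(alpha)   # y-coordinate of C
    lon = m + j * east
    lat = n + k * north
    alt = h - d * math.cos(theta)                   # BC = OD = h - d*cos(theta)
    return lon, lat, alt
```

  For example, a device at the equator at 100 m altitude, ranging a target 100 m away with the camera tilted 60° from vertical and pointing due east, places the target 50 m above sea level and slightly east of the device.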
  • the transceiver module 103 sends the three-dimensional coordinates of the intelligent patrol device (its latitude, longitude and altitude), the three-dimensional coordinates of the suspicious target (its latitude, longitude and altitude), and the picture of the suspicious target, which may be captured image frames or the entire monitored video, to the cloud control device over a wireless or wired connection.
  • the transceiver module 103 is further configured to receive a tracking instruction of the cloud control device.
  • the tracking module 104 tracks the suspicious target according to the tracking instruction of the cloud control device received by the transceiver module 103, which can be implemented as follows: the patrol screen of the smart patrol device is analyzed and identified in real time; parameters such as the direction and focal length of the camera are adjusted according to the position of the suspicious target in the screen, and the flight drive module is controlled so that the tracked suspicious target stays near the center of the screen and does not leave the monitoring range, thereby keeping the target tracked.
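  • The adjust-to-keep-centered step can be sketched as a simple proportional controller on the target's pixel offset. The function name and gain are illustrative assumptions; a real device would map these deltas onto its pan/tilt mechanism and flight drive module:

```python
def centering_correction(frame_w, frame_h, target_x, target_y, gain=0.1):
    """Proportional correction nudging the camera toward the target.

    Returns (pan_delta, tilt_delta): positive pan turns the camera right,
    positive tilt turns it down, shrinking the pixel offset between the
    tracked suspicious target and the center of the patrol picture.
    """
    dx = target_x - frame_w / 2.0   # horizontal pixel offset from center
    dy = target_y - frame_h / 2.0   # vertical pixel offset from center
    return gain * dx, gain * dy
```

  A target already at frame center yields zero correction; a target to the right of center yields a positive pan correction proportional to the offset.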
  • the intelligent patrol device in the embodiment can automatically identify the suspicious target in the patrol screen, and report the real-time location information and the monitoring screen of the suspicious target to the cloud control device for the cloud control device to perform unified patrol scheduling.
  • FIG. 3 is a schematic structural diagram of a cloud control device according to Embodiment 2 of the present invention.
  • the cloud control device is used to control a smart patrol device that establishes a connection with the cloud control device.
  • the patrol screen analysis module 301 is configured to analyze the suspicious target image and location reported by the intelligent patrol device, and determine the type and quantity of the smart patrol device and/or the processing personnel and/or the processing equipment required for the task according to the analysis result;
  • the task processing module 302 is configured to determine the smart patrol devices and/or processing personnel and/or processing equipment that will participate in this task, according to the real-time location of each smart patrol device and/or processing personnel and/or processing equipment under the control of the cloud control device, the type and quantity of smart patrol devices and/or processing personnel and/or processing equipment required for the current task, the location of the suspicious target, and the processing priority of the task;
  • the notification module 303 is configured to send a suspicious target tracking instruction carrying the suspicious target picture and location to the smart patrol device and/or processing personnel and/or processing equipment participating in the current task processing.
  • the cloud control device further includes:
  • the priority determining module 304 is configured to determine a processing priority of the task according to the suspicious target picture and location;
  • when the patrol screen analysis module 301 analyzes the suspicious target pictures and locations reported by intelligent patrol devices, tasks with higher priority are processed first, and the analysis result corresponding to each task is determined.
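  • The higher-priority-first ordering described here can be sketched with a small max-priority queue; this is a hypothetical illustration, since the application does not prescribe any particular data structure:

```python
import heapq

class ReportQueue:
    """Orders suspicious-target reports so higher priority is analyzed first.

    heapq is a min-heap, so priorities are negated; a sequence number keeps
    equal-priority reports in arrival order."""
    def __init__(self):
        self._heap = []
        self._seq = 0
    def push(self, priority, report):
        heapq.heappush(self._heap, (-priority, self._seq, report))
        self._seq += 1
    def pop(self):
        # Highest-priority, earliest-arrived report
        return heapq.heappop(self._heap)[2]
    def __len__(self):
        return len(self._heap)
```

  Reports with equal priority are analyzed in the order they arrived, which keeps the scheduling deterministic.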
  • the patrol screen analysis module 301 analyzes and estimates, from the suspicious target picture and location reported by the smart patrol device, the number of smart patrol devices and/or processing personnel and/or equipment to be carried that the current task requires. If the other party is identified as a criminal gang, it is determined that multiple mechanical police officers and personnel are required to participate in the arrest; if the other party is identified as possibly carrying firearms or other weapons, weapons and equipment capable of restraining the other party must be carried; if the other party is identified as one or more suspicious robots, the weaknesses of these robots and the ways to subdue them are determined based on large databases or intelligent analysis, and the mechanical police, personnel, and/or equipment capable of constraining the robots' capabilities are determined.
  • the task processing module 302 combines the three-dimensional coordinates of the suspicious target, its motion state, and the map near its location to notify other smart patrol devices that match the suspicious target's motion direction and whose current task priority is lower than this task's priority to join the monitoring and location tracking of the suspicious target, thereby effectively preventing an intelligent patrol device from being discovered while tracking the suspicious target.
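  • A minimal sketch of this selection step, under the simplifying assumptions of planar coordinates and a known head-count `needed` (all names are illustrative, not from the application):

```python
from dataclasses import dataclass
from math import dist

@dataclass
class PatrolUnit:
    unit_id: str
    position: tuple            # simplified planar (x, y) coordinates
    current_priority: int      # priority of the task it is handling now

def select_units(units, target_pos, task_priority, needed):
    """Pick up to `needed` units that (a) are working on something less
    important than the new task and (b) are closest to the suspicious
    target, mirroring the selection criteria described above."""
    eligible = [u for u in units if u.current_priority < task_priority]
    eligible.sort(key=lambda u: dist(u.position, target_pos))
    return eligible[:needed]
```

  Units already committed to an equal- or higher-priority task are never pulled away; among the rest, proximity decides.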
  • when the suspicious target exceeds the monitoring distance threshold of a smart patrol device, or the current capability of the smart patrol device is insufficient to continue tracking, the task processing module 302 notifies that smart patrol device to suspend monitoring and location tracking of the suspicious target.
  • the mission changes dynamically from its beginning (establishing the participating mechanical police/personnel and the participating intelligent patrol equipment) to its end (the target is caught, or the cloud or a human operator terminates the action).
  • the cloud control device always keeps the information of all participating intelligent patrol devices and personnel in the current team synchronized, including real-time synchronization of the suspicious target's three-dimensional coordinates and of the target monitoring information, so that the information of the entire tracking team remains consistent.
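  • The team-wide synchronization can be sketched as a toy in-memory model in which the cloud pushes each target-state update to every team member; the class and method names are assumptions for illustration:

```python
class TrackingTeam:
    """Keeps every participating unit's view of the suspicious target in
    sync, as the cloud control device is described to do."""
    def __init__(self):
        self.members = {}        # unit_id -> last target state delivered
        self.target_state = None
    def join(self, unit_id):
        # A newly joining unit immediately receives the current state
        self.members[unit_id] = self.target_state
    def update_target(self, state):
        # Push the new 3-D coordinates / monitoring info to every member
        self.target_state = state
        for unit_id in self.members:
            self.members[unit_id] = state
    def consistent(self):
        return all(v == self.target_state for v in self.members.values())
```

  After every update, all members hold the same target state, which is the consistency property the paragraph above requires.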
  • the cloud control device can uniformly schedule all intelligent patrol devices under its control according to the real-time location and monitoring picture of the suspicious target, and establish a matching search or hunt group, requiring almost no manual intervention; that is, it can automatically realize intelligent monitoring and tracking of targets while overcoming the limitations of existing patrol equipment in patrol range.
  • An intelligent patrol method is also provided in the embodiment of the present invention. For the implementation of the patrol method in this embodiment, reference may be made to the implementation of the smart patrol device in the first embodiment; repeated description is omitted. FIG. 4 is a flowchart of the patrol method of the smart patrol device in Embodiment 3 of the present invention, including the following steps:
  • Step 401: Obtain a patrol screen through a camera;
  • Step 402: Perform target recognition on the monitoring screen according to the features of the suspicious target to determine a suspicious target that meets the suspicious target features, and determine the location of the suspicious target;
  • Step 403: Report the real-time location information and picture of the suspicious target to the cloud control device, and track the corresponding suspicious target according to the tracking instruction sent by the cloud control device.
  • target recognition is performed using at least one of facial recognition, iris recognition, motion recognition, wireless signal recognition, and number identification.
  • the specific recognition process is the same as the patrol screen recognition process described in the first embodiment and is not repeated here.
  • the location information of the suspicious target includes the altitude of the suspicious target and the absolute latitude and longitude.
  • the location of the suspicious target is determined according to the latitude, longitude, and altitude information of the smart patrol device and its relative positional relationship with the suspicious target; the specific determination process is the same as that described in the first embodiment and is not repeated here.
  • FIG. 5 is a flowchart of a cloud control method in Embodiment 4 of the present invention, including the following steps:
  • Step 501: Analyze the suspicious target picture and location reported by the intelligent patrol device, and determine, according to the analysis result, the type and quantity of the intelligent patrol devices and/or processing personnel and/or processing equipment required for the task;
  • Step 502: Determine the intelligent patrol devices and/or processing personnel and/or processing equipment participating in this task according to the real-time location of each intelligent patrol device and/or processing personnel and/or processing equipment under the control of the cloud control device, the type and quantity required for the task, the location of the suspicious target, and the processing priority of the task;
  • Step 503: Send a suspicious target tracking instruction carrying the suspicious target picture and location to the smart patrol devices and/or processing personnel and/or processing equipment participating in the current task processing.
  • the processing priority of the task is determined according to the suspicious target picture and location.
  • tasks with higher priority are analyzed first, and the type and quantity of the smart patrol devices and/or processing personnel and/or processing equipment required for each task are determined according to the analysis result corresponding to that task.
  • the number of intelligent patrol devices and/or processing personnel and/or equipment to be carried that the current task requires is analyzed and estimated from the suspicious target picture and location reported by the intelligent patrol device. If the other party is identified as a criminal gang, it is determined that multiple mechanical police officers and personnel are required to participate in the arrest; if the other party is identified as possibly carrying firearms or other weapons, weapons and equipment capable of restraining the other party must be carried; if the other party is identified as one or more suspicious robots, the weaknesses of these robots and the ways to subdue them are determined based on large databases or intelligent analysis, and the mechanical police, personnel, and/or equipment capable of constraining the robots' capabilities are determined.
  • the three-dimensional coordinates of the suspicious target, its motion state, and the map near its location may be combined to notify other smart patrol devices that match the suspicious target's motion direction and whose current task priority is lower than this task's priority to join the monitoring and location tracking of the suspicious target, thereby effectively preventing an intelligent patrol device from being discovered while tracking the suspicious target.
  • when the suspicious target exceeds a smart patrol device's monitoring distance threshold or the device's capability is insufficient to continue tracking, the smart patrol device may be notified to suspend monitoring and location tracking of the suspicious target.
  • the embodiment of the present invention further provides an intelligent robot. For the implementation of the intelligent robot in this embodiment, reference may be made to the implementation of the intelligent patrol device in the first embodiment; repeated description is omitted.
  • FIG. 6 is a schematic structural diagram of an intelligent robot according to Embodiment 5 of the present invention.
  • the intelligent robot of Embodiment 5 includes a camera 601, a processor module 602, a transceiver component 603, and a mobile driving device 604.
  • the processor module 602 is configured to acquire a patrol screen through the camera 601, perform target recognition on the patrol screen according to the features of the suspicious target, determine a suspicious target that meets the features and its location, report the suspicious target screen and location to the cloud controller through the transceiver component 603, and track the corresponding suspicious target through the camera 601 and the mobile driving device 604 according to the tracking instruction issued by the cloud controller.
  • the intelligent robot here can be a ground security robot, a drone or other equipment capable of moving and/or rotating.
  • FIG. 7 is a schematic structural diagram of a cloud controller according to Embodiment 6 of the present invention.
  • the cloud controller of Embodiment 6 of the present invention includes a processor module 701 and a communication component 702;
  • The processor module 701 is configured to: control the communication component 702 to receive the suspicious-target picture and position reported by the intelligent robot; analyze the picture and position received by the communication component; determine from the analysis result the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task; determine the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the task according to the real-time positions of those under the cloud controller's control, the types and numbers required for the task, the position of the suspicious target, and the processing priority of the task; and send, through the communication component, a suspicious-target tracking instruction carrying the suspicious-target picture and position to the participants in the current task.
  • An embodiment of the present invention further provides a non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the steps of the foregoing cloud control method.
  • Embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the invention can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
  • The computer program instructions can also be stored in a computer readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture comprising an instruction apparatus that implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Tourism & Hospitality (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Multimedia (AREA)
  • Development Economics (AREA)
  • Marketing (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Signal Processing (AREA)
  • Game Theory and Decision Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Alarm Systems (AREA)
  • Image Analysis (AREA)

Abstract

An intelligent patrol device, a cloud control apparatus, a patrol method, a control method, a robot, a controller, and a non-transitory computer-readable storage medium. The intelligent patrol device automatically identifies a suspicious target in the monitoring picture, precisely locates and locks the position of the suspicious target, and reports the position information and picture of the suspicious target to the cloud control apparatus, so that the cloud control apparatus can uniformly schedule all intelligent patrol devices under its control according to the position and picture of the suspicious target and at the same time form a search or pursuit team that meets the requirements. Intelligent monitoring and tracking of the monitored target is thus achieved automatically with almost no manual intervention, overcoming the considerable limitation of existing patrol equipment in patrol coverage.

Description

Intelligent patrol device, cloud control apparatus, patrol method, control method, robot, controller, and non-transitory computer-readable storage medium
Technical Field
The present invention relates to the field of intelligent patrol, and in particular to an intelligent patrol device, a cloud control apparatus, a patrol method, a control method, a robot, a controller, and a non-transitory computer-readable storage medium.
Background Art
As cities keep growing, patrol and search work plays an increasingly important role in today's society. For a long time, patrol and search work has relied on a combination of the traditional manual patrol mechanism and fixed-point camera video surveillance. Because the number and stamina of personnel are limited, the routes, times, frequency, and density of manual patrols cannot fully meet the need; moreover, since staff motivation varies, slacking often occurs at night or in bad weather conditions. In addition, manual patrols cannot send on-site images and other information back to the monitoring center in real time, which hampers command decisions for emergencies. Fixed-point camera surveillance, on the other hand, requires installing a large number of cameras and rewiring, which not only increases cost but also leaves the many installed cameras working independently of one another, so the patrol coverage and effect are quite limited.
The deficiency of the prior art is as follows:
Existing patrol and search work basically relies on a combination of a manual patrol mechanism and fixed-point camera video surveillance; the participating patrol devices cannot be uniformly scheduled or work cooperatively, so the patrol coverage and effect are quite limited.
Summary of the Invention
Embodiments of the present invention provide an intelligent patrol device, a cloud control apparatus, a patrol method, a control method, a robot, a controller, and a non-transitory computer-readable storage medium, to solve the problem that in the existing patrol and search process the participating patrol devices cannot be uniformly scheduled or work cooperatively, so that the patrol coverage and effect are quite limited.
In one aspect, an embodiment of the present invention provides an intelligent patrol device, comprising:
a patrol-picture acquisition module, configured to acquire a patrol picture through a camera;
a target recognition module, configured to perform target recognition on the patrol picture according to features of a suspicious target, so as to determine a suspicious target matching the features, and determine the position of the suspicious target;
a transceiver module, configured to report the suspicious-target picture and position to a cloud control apparatus, and to receive a tracking instruction issued by the cloud control apparatus;
a tracking module, configured to track the corresponding suspicious target according to the tracking instruction of the cloud control apparatus.
In another aspect, an embodiment of the present invention provides a cloud control apparatus, comprising:
a patrol-picture analysis module, configured to analyze the suspicious-target picture and position reported by an intelligent patrol device, and determine, according to the analysis result, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task;
a task processing module, configured to determine the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task according to the real-time positions of the intelligent patrol devices and/or processing personnel and/or processing equipment under the control of the cloud control apparatus, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task, the position of the suspicious target, and the processing priority of the task;
a notification module, configured to send, to the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task, a suspicious-target tracking instruction carrying the suspicious-target picture and position.
In another aspect, an embodiment of the present invention provides an intelligent patrol method, comprising:
acquiring a patrol picture through a camera;
performing target recognition on the patrol picture according to features of a suspicious target, so as to determine a suspicious target matching the features, and determining the position of the suspicious target;
reporting the suspicious-target picture and position to a cloud control apparatus, and tracking the corresponding suspicious target according to a tracking instruction issued by the cloud control apparatus.
In another aspect, an embodiment of the present invention provides a cloud control method, comprising:
analyzing the suspicious-target picture and position reported by an intelligent patrol device, and determining, according to the analysis result, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task;
determining the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task according to the real-time positions of the intelligent patrol devices and/or processing personnel and/or processing equipment under the control of the cloud control apparatus, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task, the position of the suspicious target, and the processing priority of the task;
sending, to the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task, a suspicious-target tracking instruction carrying the suspicious-target picture and position.
In another aspect, an embodiment of the present invention provides an intelligent robot, comprising a camera, a processor module, a transceiver component, and a mobile driving device;
the processor module is configured to acquire a patrol picture through the camera, perform target recognition on the patrol picture according to features of a suspicious target so as to determine a suspicious target matching the features, determine the position of the suspicious target, report the suspicious-target picture and position to a cloud controller through the transceiver component, and track the corresponding suspicious target through the camera and the mobile driving device according to a tracking instruction issued by the cloud controller.
In another aspect, an embodiment of the present invention provides a cloud controller, comprising a processor module and a communication component;
the processor module is configured to control the communication component to receive the suspicious-target picture and position reported by an intelligent robot, analyze the suspicious-target picture and position received by the communication component, determine from the analysis result the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task, determine the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task according to the real-time positions of the intelligent patrol devices and/or processing personnel and/or processing equipment under the control of the cloud controller, the types and numbers required for the current task, the position of the suspicious target, and the processing priority of the task, and send, through the communication component, a suspicious-target tracking instruction carrying the suspicious-target picture and position to the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task.
In another aspect, an embodiment of the present invention provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the steps of the above cloud control method.
The beneficial effects of the present invention are as follows:
The intelligent patrol device of the present invention automatically identifies a suspicious target in the monitoring picture and reports the position information and picture of the suspicious target to the cloud control apparatus, so that the cloud control apparatus can uniformly schedule all intelligent patrol devices under its control according to the position and picture of the suspicious target and at the same time form a search or pursuit team that meets the requirements. With almost no manual intervention, the present invention automatically achieves intelligent monitoring and tracking of the suspicious target, overcoming the considerable limitation of existing patrol equipment in patrol coverage.
Brief Description of the Drawings
Specific embodiments of the present invention will be described below with reference to the accompanying drawings, in which:
FIG. 1 is a schematic structural diagram of the intelligent patrol device in Embodiment 1 of the present invention;
FIG. 2 is a schematic diagram of how the target recognition module determines the position of a suspicious target in Embodiment 1 of the present invention;
FIG. 3 is a schematic structural diagram of the cloud control apparatus in Embodiment 2 of the present invention;
FIG. 4 is a flowchart of the intelligent patrol method in Embodiment 3 of the present invention;
FIG. 5 is a flowchart of the cloud control method in Embodiment 4 of the present invention;
FIG. 6 is a schematic structural diagram of the intelligent robot in Embodiment 5 of the present invention;
FIG. 7 is a schematic structural diagram of the cloud controller in Embodiment 6 of the present invention.
Detailed Description
To make the technical solutions and advantages of the present invention clearer, exemplary embodiments of the present invention are described in further detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than an exhaustive list of all embodiments of the present invention, and the embodiments in this description and the features in the embodiments may be combined with each other where no conflict arises.
During the course of the invention, the inventor noticed that in the prior art, patrol and search work has relied on a combination of a manual patrol mechanism and fixed-point camera video surveillance. Because the number and stamina of personnel are limited, the routes, times, frequency, and density of manual patrols cannot fully meet the need. Meanwhile, manual patrols cannot send on-site images and other information back to the monitoring center in real time, which hampers command decisions for emergencies. Fixed-point camera surveillance requires installing a large number of cameras and rewiring, which not only increases cost but also leaves the many installed cameras working independently of one another, so the patrol coverage and effect are quite limited. In view of the above deficiencies, embodiments of the present invention provide an intelligent patrol device, a cloud control apparatus, a patrol method, a control method, a robot, a controller, and a non-transitory computer-readable storage medium.
To facilitate implementation of the present invention, examples are given below.
Embodiment 1
FIG. 1 is a schematic structural diagram of the intelligent patrol device in Embodiment 1 of the present invention. As shown in the figure, the intelligent patrol device in Embodiment 1 comprises:
a patrol-picture acquisition module 101, configured to acquire a patrol picture through a camera;
a target recognition module 102, configured to perform target recognition on the patrol picture according to features of a suspicious target, so as to determine a suspicious target matching the features and determine its position;
a transceiver module 103, configured to report the position information and picture of the suspicious target to the cloud control apparatus, and to receive tracking instructions issued by the cloud control apparatus;
a tracking module 104, configured to track the corresponding suspicious target according to the tracking instruction of the cloud control apparatus.
In a specific implementation, the features of the suspicious target may be delivered to the intelligent patrol device by the cloud control apparatus, or input into the intelligent patrol device by a user. The target recognition module 102 may extract the image features of the suspicious target from a clear image of the tracked target delivered by the cloud control apparatus or input by the user; it may directly use image feature data delivered by the cloud control apparatus or input by the user as the image features of the suspicious target; or it may extract or learn the image features of a certain suspicious target from certain frames of a source video delivered by the cloud control apparatus or designated by the user. For example, facial image features of a wanted criminal may be extracted from a photo of the criminal delivered by the cloud control apparatus or provided by the user; grayscale data of a certain pattern delivered by the cloud control apparatus or input by the user may serve directly as the image features of the suspicious target; or an image of the suspicious target to be monitored may be cropped from a certain frame of the source video delivered by the cloud control apparatus or input by the user.
In a specific implementation, the target recognition module 102 uses at least one of facial recognition, iris recognition, action recognition, wireless-signal recognition, and serial-number recognition to perform target recognition, according to the features of the suspicious target, on the monitoring picture collected in real time, so as to determine monitoring pictures of suspicious targets matching those features. Facial recognition, iris recognition, action recognition, and serial-number recognition are all image-based operations, and their specific recognition methods may follow existing recognition techniques.
Facial recognition is mainly aimed at people: a person's face is photographed and identified, and it can also be used to determine the model of a robot. For example, in practical applications, facial photos of specific persons such as wanted criminals, or the silhouettes of robots, may be stored in advance; when the corresponding face or robot is detected, the corresponding person or robot is determined to be the tracked target.
Iris recognition photographs the iris of a person's eyeball to determine precisely who the person is, with much higher accuracy than facial recognition.
Action recognition means the system identifies the actions of a person or robot in front of the camera. For example, some dangerous actions such as fighting or violent collisions may be stored in advance; when such an action appears in the patrol picture, the actor is determined to be a suspicious target.
Serial-number recognition means photographing a robot's serial number to determine the robot's type parameters and so on, provided the robot bears a visible identification code; the code can then be recognized through image recognition technology to determine the tracked object.
Wireless-signal recognition determines a robot's type parameters and so on, or its position, from the wireless signals it is detected to emit.
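The recognition modes above can be combined in a simple dispatch loop. The sketch below is illustrative only: the recognizer callables, record fields, and mode names are our assumptions, not terms from the patent.

```python
def identify(frame, signals, recognizers):
    """Run each configured image-based recognition mode over a patrol
    frame and merge in wireless-signal hits. `recognizers` maps a mode
    name ('face', 'iris', 'action', 'serial') to a callable that takes
    a frame and returns a list of detections (hypothetical interface)."""
    hits = [{"mode": mode, "target": det}
            for mode, fn in recognizers.items()
            for det in fn(frame)]
    # Wireless-signal identification needs no image: flag any emitter
    # whose signal profile has been marked suspicious.
    hits += [{"mode": "radio", "target": sig["id"]}
             for sig in signals if sig.get("suspicious")]
    return hits
```

In practice each callable would wrap an existing recognition library; the loop only shows how several independent modes can feed one candidate list.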
FIG. 2 is a schematic diagram of how the target recognition module 102 determines the position of a suspicious target in Embodiment 1 of the present invention. As shown in the figure, in a specific implementation, the target recognition module 102 determines the position of the suspicious target from the longitude, latitude, and altitude of the intelligent patrol device and the relative positional relationship between the device and the suspicious target, as follows.
In FIG. 2, the plane of the X and Y axes is sea level (not the ground). The intelligent patrol device is currently at point A in space, and at some moment discovers a suspicious target at point B on the ground. At any moment, the projection of the intelligent patrol device onto sea level is taken as the origin O of the X and Y axes (when the device moves, the axes move with it), with the X axis always pointing due east along sea level and the Y axis always pointing due north along sea level.
The target recognition module 102 obtains the straight-line distance d from the intelligent patrol device to the tracked target by infrared laser ranging or dual-camera ranging, and obtains the device's current exact altitude h from the device's altitude sensor, where d and h must be in the same unit. The angle α between the camera's optical axis and the vertical is obtained from the mechanical part that rotates the camera on the device. A geomagnetic sensor gives the absolute direction the device is facing in real time, so the absolute direction of OC, the projection of the camera's optical-axis direction onto sea level (C being the projection of target point B onto sea level), can be known through the camera's rotation mechanism; let β be the angle between OC and the positive X axis. The real-time longitude and latitude of the device obtained from the satellite positioning system are (m, n). Drop a perpendicular BD from target position B to OA.
In triangle ABD:
BD=d*sinα;
AD=d*cosα;
from which:
OC=BD=d*sinα;
OD=h-AD=h-d*cosα;
BC=OD;
and BC is the altitude of the suspicious target at point B.
From the length OC and the angle β, the coordinates of point C in the XY plane are (d*sinα*cosβ, d*sinα*sinβ).
Once the longitude and latitude of point O are known (they are the same as those of point A), the change in longitude and latitude for a given eastward/westward or northward/southward displacement is determined. Suppose that at point O the ratio of east-west longitude change to east-west distance change is j, and the ratio of north-south latitude change to north-south distance change is k (j and k differ for different longitudes and latitudes of O, but once O is fixed they are fixed values). The longitude/latitude offset of target point C relative to point O is then (j*d*sinα*cosβ, k*d*sinα*sinβ), so the position information of the suspicious target at point B, including its absolute longitude, latitude, and altitude, is:
(m±j*d*sinα*cosβ, n±k*d*sinα*sinβ, h-d*cosα)
If the longitude of the intelligent patrol device is east longitude, the longitude of suspicious target B increases as it moves east, i.e. the longitude is m+j*d*sinα*cosβ, and decreases as it moves west, i.e. m-j*d*sinα*cosβ; if the device is at west longitude, the target's longitude decreases as it moves east, i.e. m-j*d*sinα*cosβ, and increases as it moves west, i.e. m+j*d*sinα*cosβ.
If the latitude of the intelligent patrol device is north latitude (northern hemisphere), the latitude of suspicious target B increases as it moves north, i.e. n+k*d*sinα*sinβ, and decreases as it moves south, i.e. n-k*d*sinα*sinβ; if the device's latitude is south latitude (southern hemisphere), the target's latitude decreases as it moves north, i.e. n-k*d*sinα*sinβ, and increases as it moves south, i.e. n+k*d*sinα*sinβ.
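The derivation above can be collected into one routine. A minimal sketch in Python, assuming the angles are already in radians, that j and k have been determined for the device's current latitude, and covering only the east-longitude / north-latitude case from the text:

```python
import math

def target_position(d, h, alpha, beta, m, n, j, k):
    """Absolute position of the suspicious target (east longitude,
    north latitude case).
    d     straight-line distance to the target (same unit as h)
    h     device altitude above sea level
    alpha angle between the camera optical axis and the vertical
    beta  angle between the axis projection OC and due east
    m, n  device longitude / latitude in degrees
    j, k  degrees-per-distance scale factors at the device location
    """
    horiz = d * math.sin(alpha)            # OC = BD, ground-plane offset
    alt = h - d * math.cos(alpha)          # BC = OD, target altitude
    lon = m + j * horiz * math.cos(beta)   # east-west offset in degrees
    lat = n + k * horiz * math.sin(beta)   # north-south offset in degrees
    return lon, lat, alt
```

Because β is measured from due east, its cosine and sine already carry the sign of the offset, so the ± cases in the text collapse into simple additions under this convention.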
In a specific implementation, the transceiver module 103 sends the three-dimensional coordinates of the intelligent patrol device (its longitude, latitude, and altitude), the three-dimensional coordinates of the suspicious target (its longitude, latitude, and altitude), and the picture of the suspicious target, which may be either captured frame images or the full monitoring video, to the cloud control apparatus by wireless or wired means. The transceiver module 103 is also configured to receive tracking instructions from the cloud control apparatus.
The tracking module 104 tracks the suspicious target according to the tracking instruction of the cloud control apparatus received by the transceiver module 103, which may be implemented as follows: the intelligent patrol device analyzes and recognizes the patrol picture in real time, adjusts parameters such as the camera's direction and focal length according to the suspicious target's position in the picture, and controls the flight driving module, so that the tracked suspicious target stays within the monitoring range at the center of the picture, thereby maintaining tracking of the tracked target.
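Keeping the target centred in the picture, as described above, reduces to converting the target's pixel offset into a pan/tilt correction. A rough proportional sketch, assuming a linear pixel-to-angle model and a small dead band to avoid jitter (all parameter names are ours, not the patent's):

```python
def center_target(frame_size, bbox_center, fov_deg, deadband_px=10):
    """Pan/tilt correction (degrees) that moves the camera so the
    target's bounding-box centre returns to the image centre."""
    w, h = frame_size
    cx, cy = bbox_center
    dx, dy = cx - w / 2, cy - h / 2
    # Inside the dead band the target is close enough to centre: hold still.
    pan = 0.0 if abs(dx) <= deadband_px else dx * fov_deg[0] / w
    tilt = 0.0 if abs(dy) <= deadband_px else dy * fov_deg[1] / h
    return pan, tilt
```

A real controller would add damping and rate limits; the point is only that the picture offset alone is enough to steer the camera or the flight driving module.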
The intelligent patrol device of this embodiment can automatically identify a suspicious target in the patrol picture and report the target's real-time position information and monitoring picture to the cloud control apparatus for unified patrol scheduling by the cloud control apparatus.
Embodiment 2
FIG. 3 is a schematic structural diagram of the cloud control apparatus in Embodiment 2 of the present invention. The cloud control apparatus of Embodiment 2 is used to control the intelligent patrol devices that have established a connection with it, and specifically comprises:
a patrol-picture analysis module 301, configured to analyze the suspicious-target picture and position reported by an intelligent patrol device, and determine, according to the analysis result, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task;
a task processing module 302, configured to determine the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task according to the real-time positions of the intelligent patrol devices and/or processing personnel and/or processing equipment under the control of the cloud control apparatus, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task, the position of the suspicious target, and the processing priority of the task;
a notification module 303, configured to send, to the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task, a suspicious-target tracking instruction carrying the suspicious-target picture and position.
In a specific implementation, the cloud control apparatus further comprises:
a priority determination module 304, configured to determine the processing priority of the task according to the suspicious-target picture and position;
when analyzing the suspicious-target pictures and positions reported by the intelligent patrol devices, the patrol-picture analysis module 301 processes tasks of higher priority first, and determines, from the analysis result corresponding to a task, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for that task.
In a specific implementation, the patrol-picture analysis module 301 analyzes the suspicious-target picture and position reported by the intelligent patrol device and estimates the numbers of intelligent patrol devices and/or processing personnel and/or equipment to be carried that the current task requires. If the other party is identified as a criminal gang, it is determined that multiple mechanical police officers and personnel are needed for the arrest; if the other party is identified as possibly carrying firearms or other weapons, weapons and equipment capable of constraining them must be carried; if the other party is identified as one or more suspicious robots, the weaknesses of the robot(s) and the means of subduing them are determined from a large database or by intelligent analysis, for example how most easily to destroy their mobility, how most easily to cut off their power supply, or how to jam the triggering of a self-destruct device, and the mechanical police, personnel, and/or equipment capable of constraining the robots' capabilities are determined accordingly.
In a specific implementation, the task processing module 302 combines the three-dimensional coordinates of the suspicious target, its motion state, and the map near its position to notify other intelligent patrol devices whose direction of motion matches the suspicious target, and whose current task priority is lower than that of this task, to join the monitoring and position tracking of the suspicious target. This effectively prevents an intelligent patrol device from being discovered while pursuing a suspicious target.
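The matching rule above — direction of motion compatible with the target's, current task priority lower than the pursuit's — can be sketched as a simple filter. The device-record fields and the higher-number-means-higher-priority convention are assumptions for illustration:

```python
def select_helpers(devices, target_heading, task_priority, heading_tol=45.0):
    """Return the ids of patrol devices eligible to join the pursuit:
    those already moving roughly along the target's heading whose
    current task priority is below that of this task."""
    def heading_diff(a, b):
        # Smallest angle between two compass headings, in degrees.
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    return [dev["id"] for dev in devices
            if dev["priority"] < task_priority
            and heading_diff(dev["heading"], target_heading) <= heading_tol]
```

A fuller scheduler would also weigh each candidate's distance to the target's three-dimensional coordinates and the road map near the position point; the filter shows only the two conditions the text states.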
In a specific implementation, when the suspicious target moves beyond an intelligent patrol device's monitoring distance threshold, or the device's current capability is insufficient to continue tracking, the task processing module 302 notifies the device to suspend monitoring and position tracking of the suspicious target.
From the start of the task (forming the team of participating mechanical police/personnel and participating intelligent patrol devices) to its end (the target is captured, or the cloud or a human terminates the operation), the membership of the pursuit team changes dynamically, but the cloud control apparatus keeps the information of all participating intelligent patrol devices and personnel in the current team synchronized at all times, including real-time synchronization of the suspicious target's three-dimensional coordinates and of the target's monitoring information, so that the information held by the entire current pursuit team for the task remains consistent.
The cloud control apparatus of this embodiment of the present invention can uniformly schedule all intelligent patrol devices under its control according to the real-time position and monitoring picture of the suspicious target, and at the same time form a search or pursuit team that meets the requirements; with almost no manual intervention, intelligent monitoring and tracking of the monitored target is achieved automatically, overcoming to the greatest extent the considerable limitation of existing patrol equipment in patrol coverage.
Embodiment 3
Based on the same inventive concept as Embodiment 1 above, an embodiment of the present invention further provides an intelligent patrol method. For the implementation of the patrol method in this embodiment, refer to the implementation of the intelligent patrol device in Embodiment 1; repeated description is omitted. FIG. 4 is a flowchart of the patrol method of the intelligent patrol device in Embodiment 3 of the present invention, comprising the following steps:
Step 401: acquiring a patrol picture through a camera;
Step 402: performing target recognition on the patrol picture according to features of a suspicious target, so as to determine a suspicious target matching the features, and determining the position of the suspicious target;
Step 403: reporting the real-time position information of the suspicious target to the cloud control apparatus, and tracking the corresponding suspicious target according to a tracking instruction issued by the cloud control apparatus.
In a specific implementation, at least one of facial recognition, iris recognition, action recognition, wireless-signal recognition, and serial-number recognition is used in step 402 to perform target recognition on the patrol picture; the specific recognition process is the same as the patrol-picture recognition process described in Embodiment 1 and is not repeated here.
In a specific implementation, the position information of the suspicious target includes its altitude and absolute longitude and latitude.
In a specific implementation, in step 402 the position of the suspicious target is determined from the longitude, latitude, and altitude of the intelligent patrol device and its relative positional relationship with the suspicious target; the specific position determination process is the same as that described in Embodiment 1 and is not repeated here.
Embodiment 4
Based on the same inventive concept as Embodiment 2 above, an embodiment of the present invention further provides a cloud control method. For the implementation of the cloud control method in this embodiment, refer to the implementation of the cloud control apparatus in Embodiment 2; repeated description is omitted. FIG. 5 is a flowchart of the cloud control method in Embodiment 4 of the present invention, comprising the following steps:
Step 501: analyzing the suspicious-target picture and position reported by an intelligent patrol device, and determining, according to the analysis result, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task;
Step 502: determining the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task according to the real-time positions of the intelligent patrol devices and/or processing personnel and/or processing equipment under the control of the cloud control apparatus, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task, the position of the suspicious target, and the processing priority of the task;
Step 503: sending, to the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task, a suspicious-target tracking instruction carrying the suspicious-target picture and position.
In a specific implementation, the processing priority of the task is determined from the suspicious-target picture and position. When analyzing the suspicious-target pictures and positions reported by intelligent patrol devices, tasks of higher priority are processed first, and the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for a task are determined from the analysis result corresponding to that task.
In a specific implementation, the suspicious-target picture and position reported by the intelligent patrol device are analyzed to estimate the numbers of intelligent patrol devices and/or processing personnel and/or equipment to be carried that the current task requires. If the other party is identified as a criminal gang, it is determined that multiple mechanical police officers and personnel are needed for the arrest; if the other party is identified as possibly carrying firearms or other weapons, weapons and equipment capable of constraining them must be carried; if the other party is identified as one or more suspicious robots, the weaknesses of the robot(s) and the means of subduing them are determined from a large database or by intelligent analysis, for example how most easily to destroy their mobility, how most easily to cut off their power supply, or how to jam the triggering of a self-destruct device, and the mechanical police, personnel, and/or equipment capable of constraining the robots' capabilities are determined accordingly.
In a specific implementation, during tracking of the suspicious target, other intelligent patrol devices whose direction of motion matches the suspicious target, and whose current task priority is lower than that of this task, may be notified to join the monitoring and position tracking of the suspicious target, in combination with the target's three-dimensional coordinates, motion state, and the map near its position. This effectively prevents an intelligent patrol device from being discovered while pursuing a suspicious target.
In a specific implementation, when the suspicious target moves beyond the intelligent patrol device's monitoring distance threshold, or the device's current capability is insufficient to continue tracking, the device may be notified to suspend monitoring and position tracking of the suspicious target.
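The suspension condition in the last step — target beyond the device's monitoring distance threshold — can be checked directly from the reported coordinates. A sketch using the haversine great-circle distance; the patent does not prescribe a distance formula, and the 6,371 km Earth radius is the usual approximation:

```python
import math

def should_suspend(device_latlon, target_latlon, max_range_m):
    """True when the target has moved beyond the device's monitoring
    range. Positions are (latitude, longitude) pairs in degrees."""
    lat1, lon1 = map(math.radians, device_latlon)
    lat2, lon2 = map(math.radians, target_latlon)
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    dist_m = 2 * 6371000 * math.asin(math.sqrt(a))  # haversine distance
    return dist_m > max_range_m
```

The altitude difference could be folded in as well for aerial devices; for the short ranges involved, adding it in quadrature to the ground distance would suffice.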
Embodiment 5
Based on the same inventive concept as Embodiment 1 above, an embodiment of the present invention further provides an intelligent robot. For the implementation of the intelligent robot in this embodiment, refer to the implementation of the intelligent patrol device in Embodiment 1; repeated description is omitted. FIG. 6 is a schematic structural diagram of the intelligent robot in Embodiment 5 of the present invention. As shown in the figure, the intelligent robot of Embodiment 5 comprises a camera 601, a processor module 602, a transceiver component 603, and a mobile driving device 604, wherein
the processor module 602 is configured to acquire a patrol picture through the camera 601, perform target recognition on the patrol picture according to features of a suspicious target so as to determine a suspicious target matching the features, determine the position of the suspicious target, report the suspicious-target picture and position to the cloud controller through the transceiver component 603, and track the corresponding suspicious target through the camera 601 and the mobile driving device 604 according to a tracking instruction issued by the cloud controller.
In practical applications, the intelligent robot here may be a ground security robot, an unmanned aerial vehicle, or other equipment capable of moving and/or rotating.
Embodiment 6
Based on the same inventive concept as Embodiment 2 above, an embodiment of the present invention further provides a cloud controller. For the implementation of the cloud controller in this embodiment, refer to the implementation of the cloud control apparatus in Embodiment 2; repeated description is omitted. FIG. 7 is a schematic structural diagram of the cloud controller in Embodiment 6 of the present invention. As shown in the figure, the cloud controller of Embodiment 6 comprises a processor module 701 and a communication component 702, wherein
the processor module 701 is configured to control the communication component 702 to receive the suspicious-target picture and position reported by an intelligent robot, analyze the suspicious-target picture and position received by the communication component, determine from the analysis result the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task, determine the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task according to the real-time positions of the intelligent patrol devices and/or processing personnel and/or processing equipment under the control of the cloud controller, the types and numbers required for the current task, the position of the suspicious target, and the processing priority of the task, and send, through the communication component, a suspicious-target tracking instruction carrying the suspicious-target picture and position to the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task.
Embodiment 7
Based on the same inventive concept, an embodiment of the present invention further provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the steps of the method of Embodiment 4 above. Those skilled in the art will appreciate that embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, and the like) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
Although preferred embodiments of the present invention have been described, those skilled in the art, once aware of the basic inventive concept, may make further changes and modifications to these embodiments. The appended claims are therefore intended to be construed as covering the preferred embodiments and all changes and modifications falling within the scope of the present invention.

Claims (27)

  1. An intelligent patrol device, characterized by comprising:
    a patrol-picture acquisition module, configured to acquire a patrol picture through a camera;
    a target recognition module, configured to perform target recognition on the patrol picture according to features of a suspicious target, so as to determine a suspicious target matching the features, and determine the position of the suspicious target;
    a transceiver module, configured to report the suspicious-target picture and position to a cloud control apparatus, and to receive a tracking instruction issued by the cloud control apparatus;
    a tracking module, configured to track the corresponding suspicious target according to the tracking instruction of the cloud control apparatus.
  2. The device according to claim 1, characterized in that the target recognition module uses at least one of facial recognition, iris recognition, action recognition, wireless-signal recognition, and serial-number recognition to recognize suspicious targets in the patrol picture.
  3. The device according to claim 1, characterized in that the target recognition module determines the position of the suspicious target according to the longitude, latitude, and altitude of the intelligent patrol device, and the relative positional relationship between the intelligent patrol device and the suspicious target.
  4. The device according to claim 3, characterized in that the position information of the suspicious target comprises the altitude and absolute longitude and latitude of the suspicious target:
    the altitude of the suspicious target is h-d*cosα;
    the absolute longitude of the suspicious target is m±j*d*sinα*cosβ;
    the absolute latitude of the suspicious target is n±k*d*sinα*sinβ;
    where h is the altitude of the intelligent patrol device; d is the straight-line distance between the intelligent patrol device and the suspicious target, h and d being in the same unit; α is the angle between the optical axis of the camera of the intelligent patrol device and the vertical; m is the longitude of the intelligent patrol device; j is the scale factor between the east-west longitude change at the point where the intelligent patrol device projects onto sea level and the east-west distance change at that point; β is the angle between the projection of the camera's optical-axis direction onto sea level and due east on sea level; n is the latitude of the intelligent patrol device; and k is the scale factor between the north-south latitude change at the point where the intelligent patrol device projects onto sea level and the north-south distance change at that point.
  5. The device according to claim 4, characterized in that:
    if the longitude of the intelligent patrol device is east longitude, the absolute longitude of the suspicious target when moving east is m+j*d*sinα*cosβ, and when moving west is m-j*d*sinα*cosβ;
    if the longitude of the intelligent patrol device is west longitude, the absolute longitude of the suspicious target when moving east is m-j*d*sinα*cosβ, and when moving west is m+j*d*sinα*cosβ;
    if the latitude of the intelligent patrol device is north latitude, the absolute latitude of the suspicious target when moving north is n+k*d*sinα*sinβ, and when moving south is n-k*d*sinα*sinβ;
    if the latitude of the intelligent patrol device is south latitude, the latitude of the suspicious target when moving north is n-k*d*sinα*sinβ, and when moving south is n+k*d*sinα*sinβ.
  6. A cloud control apparatus, characterized by comprising:
    a patrol-picture analysis module, configured to analyze the suspicious-target picture and position reported by an intelligent patrol device, and determine, according to the analysis result, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task;
    a task processing module, configured to determine the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task according to the real-time positions of the intelligent patrol devices and/or processing personnel and/or processing equipment under the control of the cloud control apparatus, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task, the position of the suspicious target, and the processing priority of the task;
    a notification module, configured to send, to the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task, a suspicious-target tracking instruction carrying the suspicious-target picture and position.
  7. The apparatus according to claim 6, characterized by further comprising:
    a priority determination module, configured to determine the processing priority of the task according to the suspicious-target picture and position;
    wherein the patrol-picture analysis module determining, according to the analysis result, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task comprises:
    the patrol-picture analysis module processing tasks of higher priority first, and determining, from the analysis result corresponding to a task, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for that task.
  8. The apparatus according to claim 7, characterized in that the task processing module is further configured to add, according to the real-time position of the suspicious target, other intelligent patrol devices and/or processing personnel and/or processing equipment whose direction of motion matches the suspicious target and whose current task priority is lower than that of this task to the tracking of the suspicious target.
  9. The apparatus according to claim 6 or 8, characterized in that the task processing module is further configured to suspend the tracking of the suspicious target by the intelligent patrol device and/or processing personnel and/or processing equipment when their distance to the suspicious target exceeds the tracking distance threshold or when they are reassigned to another tracking task.
  10. An intelligent patrol method, characterized by comprising:
    acquiring a patrol picture through a camera;
    performing target recognition on the patrol picture according to features of a suspicious target, so as to determine a suspicious target matching the features, and determining the position of the suspicious target;
    reporting the suspicious-target picture and position to a cloud control apparatus, and tracking the corresponding suspicious target according to a tracking instruction issued by the cloud control apparatus.
  11. The method according to claim 10, characterized in that at least one of facial recognition, iris recognition, action recognition, wireless-signal recognition, and serial-number recognition is used to recognize suspicious targets in the patrol picture.
  12. The method according to claim 10, characterized in that the position information of the suspicious target comprises the altitude and absolute longitude and latitude of the suspicious target:
    the altitude of the suspicious target is h-d*cosα;
    the absolute longitude of the suspicious target is m±j*d*sinα*cosβ;
    the absolute latitude of the suspicious target is n±k*d*sinα*sinβ;
    where h is the altitude of the intelligent patrol device; d is the straight-line distance between the intelligent patrol device and the suspicious target, h and d being in the same unit; α is the angle between the optical axis of the camera of the intelligent patrol device and the vertical; m is the longitude of the intelligent patrol device; j is the scale factor between the east-west longitude change at the point where the intelligent patrol device projects onto sea level and the east-west distance change at that point; β is the angle between the projection of the camera's optical-axis direction onto sea level and due east on sea level; n is the latitude of the intelligent patrol device; and k is the scale factor between the north-south latitude change at the point where the intelligent patrol device projects onto sea level and the north-south distance change at that point.
  13. The method according to claim 12, characterized in that:
    if the longitude of the intelligent patrol device is east longitude, the absolute longitude of the suspicious target when moving east is m+j*d*sinα*cosβ, and when moving west is m-j*d*sinα*cosβ;
    if the longitude of the intelligent patrol device is west longitude, the absolute longitude of the suspicious target when moving east is m-j*d*sinα*cosβ, and when moving west is m+j*d*sinα*cosβ;
    if the latitude of the intelligent patrol device is north latitude, the absolute latitude of the suspicious target when moving north is n+k*d*sinα*sinβ, and when moving south is n-k*d*sinα*sinβ;
    if the latitude of the intelligent patrol device is south latitude, the latitude of the suspicious target when moving north is n-k*d*sinα*sinβ, and when moving south is n+k*d*sinα*sinβ.
  14. A cloud control method, characterized by comprising:
    analyzing the suspicious-target picture and position reported by an intelligent patrol device, and determining, according to the analysis result, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task;
    determining the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task according to the real-time positions of the intelligent patrol devices and/or processing personnel and/or processing equipment under the control of the cloud control apparatus, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task, the position of the suspicious target, and the processing priority of the task;
    sending, to the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task, a suspicious-target tracking instruction carrying the suspicious-target picture and position.
  15. The method according to claim 14, characterized in that the processing priority of the task is determined according to the suspicious-target picture and position;
    when analyzing the suspicious-target picture and position reported by the intelligent patrol device, tasks of higher priority are processed first, and the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for a task are determined from the analysis result corresponding to that task.
  16. The method according to claim 15, characterized by further comprising the step of notifying, according to the real-time position of the suspicious target, other intelligent patrol devices and/or processing personnel and/or processing equipment whose direction of motion matches the suspicious target and whose current task priority is lower than that of this task to join the tracking of the suspicious target.
  17. The method according to claim 14 or 16, characterized by further comprising the step of notifying the intelligent patrol device and/or processing personnel and/or processing equipment to suspend tracking of the suspicious target when their distance to the suspicious target exceeds the tracking distance threshold or when they are scheduled into another tracking task.
  18. An intelligent robot, characterized by comprising a camera, a processor module, a transceiver component, and a mobile driving device;
    the processor module being configured to acquire a patrol picture through the camera, perform target recognition on the patrol picture according to features of a suspicious target so as to determine a suspicious target matching the features, determine the position of the suspicious target, report the suspicious-target picture and position to a cloud controller through the transceiver component, and track the corresponding suspicious target through the camera and the mobile driving device according to a tracking instruction issued by the cloud controller.
  19. The intelligent robot according to claim 18, characterized in that the processor module uses at least one of facial recognition, iris recognition, action recognition, wireless-signal recognition, and serial-number recognition to recognize suspicious targets in the patrol picture.
  20. The intelligent robot according to claim 18, characterized in that the processor module determines the position of the suspicious target according to the longitude, latitude, and altitude of the intelligent robot, and the relative positional relationship between the intelligent robot and the suspicious target.
  21. The intelligent robot according to claim 20, characterized in that the position information of the suspicious target comprises the altitude and absolute longitude and latitude of the suspicious target:
    the altitude of the suspicious target is h-d*cosα;
    the absolute longitude of the suspicious target is m±j*d*sinα*cosβ;
    the absolute latitude of the suspicious target is n±k*d*sinα*sinβ;
    where h is the altitude of the intelligent robot; d is the straight-line distance between the intelligent robot and the suspicious target, h and d being in the same unit; α is the angle between the optical axis of the camera of the intelligent robot and the vertical; m is the longitude of the intelligent robot; j is the scale factor between the east-west longitude change at the point where the intelligent robot projects onto sea level and the east-west distance change at that point; β is the angle between the projection of the camera's optical-axis direction onto sea level and due east on sea level; n is the latitude of the intelligent robot; and k is the scale factor between the north-south latitude change at the point where the intelligent robot projects onto sea level and the north-south distance change at that point.
  22. The intelligent robot according to claim 21, characterized in that:
    if the longitude of the intelligent robot is east longitude, the absolute longitude of the suspicious target when moving east is m+j*d*sinα*cosβ, and when moving west is m-j*d*sinα*cosβ;
    if the longitude of the intelligent robot is west longitude, the absolute longitude of the suspicious target when moving east is m-j*d*sinα*cosβ, and when moving west is m+j*d*sinα*cosβ;
    if the latitude of the intelligent robot is north latitude, the absolute latitude of the suspicious target when moving north is n+k*d*sinα*sinβ, and when moving south is n-k*d*sinα*sinβ;
    if the latitude of the intelligent robot is south latitude, the latitude of the suspicious target when moving north is n-k*d*sinα*sinβ, and when moving south is n+k*d*sinα*sinβ.
  23. A cloud controller, characterized by comprising a processor module and a communication component;
    the processor module being configured to control the communication component to receive the suspicious-target picture and position reported by an intelligent robot, analyze the suspicious-target picture and position received by the communication component, determine from the analysis result the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for the current task, determine the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task according to the real-time positions of the intelligent patrol devices and/or processing personnel and/or processing equipment under the control of the cloud controller, the types and numbers required for the current task, the position of the suspicious target, and the processing priority of the task, and send, through the communication component, a suspicious-target tracking instruction carrying the suspicious-target picture and position to the intelligent patrol devices and/or processing personnel and/or processing equipment participating in the current task.
  24. The cloud controller according to claim 23, characterized in that the processor module is further configured to determine the processing priority of the task according to the suspicious-target picture and position, process tasks of higher priority first, and determine, from the analysis result corresponding to a task, the types and numbers of intelligent patrol devices and/or processing personnel and/or processing equipment required for that task.
  25. The cloud controller according to claim 24, characterized in that the processor module is further configured to add, according to the real-time position of the suspicious target, other intelligent patrol devices and/or processing personnel and/or processing equipment whose direction of motion matches the suspicious target and whose current task priority is lower than that of this task to the tracking of the suspicious target.
  26. The cloud controller according to claim 23 or 25, characterized in that the processor module is further configured to suspend the tracking of the suspicious target by the intelligent patrol device and/or processing personnel and/or processing equipment when their distance to the suspicious target exceeds the tracking distance threshold or when they are reassigned to another tracking task.
  27. A non-transitory computer-readable storage medium, characterized in that the non-transitory computer-readable storage medium stores computer instructions for causing a computer to perform the steps of the method of any one of claims 14-17.
PCT/CN2016/103327 2016-10-26 2016-10-26 智能巡逻设备、云端控制装置、巡逻方法、控制方法、机器人、控制器及非暂态计算机可读存储介质 WO2018076191A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201680002940.1A CN107148777B (zh) 2016-10-26 2016-10-26 智能巡逻设备、云端控制装置及巡逻方法、控制方法
PCT/CN2016/103327 WO2018076191A1 (zh) 2016-10-26 2016-10-26 智能巡逻设备、云端控制装置、巡逻方法、控制方法、机器人、控制器及非暂态计算机可读存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/103327 WO2018076191A1 (zh) 2016-10-26 2016-10-26 智能巡逻设备、云端控制装置、巡逻方法、控制方法、机器人、控制器及非暂态计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2018076191A1 true WO2018076191A1 (zh) 2018-05-03

Family

ID=59783832

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/103327 WO2018076191A1 (zh) 2016-10-26 2016-10-26 智能巡逻设备、云端控制装置、巡逻方法、控制方法、机器人、控制器及非暂态计算机可读存储介质

Country Status (2)

Country Link
CN (1) CN107148777B (zh)
WO (1) WO2018076191A1 (zh)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109213189A (zh) * 2018-07-19 2019-01-15 安徽共生物流科技有限公司 一种无人机巡线系统及动作判定方法
CN109614875A (zh) * 2018-11-16 2019-04-12 合肥未来计算机技术开发有限公司 一种基于运动规则的智能安防报警系统
CN109849023A (zh) * 2019-04-10 2019-06-07 江苏方天电力技术有限公司 一种轨道悬挂智能巡检机器人系统
CN110275533A (zh) * 2019-06-25 2019-09-24 李子月 一种虚实结合的无人巡逻车系统
CN111435554A (zh) * 2019-01-11 2020-07-21 余招成 巡查追踪系统
CN112270267A (zh) * 2020-10-29 2021-01-26 国网山东省电力公司淄博供电公司 可自动抓拍线路故障的摄像识别系统
CN112333421A (zh) * 2020-09-15 2021-02-05 安徽龙运智能科技有限公司 一种基于5g的博物馆智能巡检系统
CN112347306A (zh) * 2020-09-29 2021-02-09 浙江大华技术股份有限公司 一种ptz摄像机监控跟踪方法、装置、系统和计算机设备
CN112446628A (zh) * 2020-12-01 2021-03-05 云南昆船数码科技有限公司 一种应用于物业管理的巡更管理系统及其管理方法
CN113536934A (zh) * 2021-06-17 2021-10-22 杭州电子科技大学 一种巡逻机器人执行追踪任务时的主动隐藏方法
CN113554775A (zh) * 2021-06-01 2021-10-26 广东电网有限责任公司广州供电局 无人机电力巡检系统
CN116437216A (zh) * 2023-06-12 2023-07-14 湖南博信创远信息科技有限公司 基于人工智能数据处理和视觉分析的工程监管方法及系统

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107911667A (zh) * 2017-11-28 2018-04-13 上海思依暄机器人科技股份有限公司 一种安防监控方法、装置、电子设备、监控服务器及系统
CN108304799A (zh) * 2018-01-30 2018-07-20 广州市君望机器人自动化有限公司 一种人脸追踪方法
CN108040211A (zh) * 2018-01-30 2018-05-15 广州市君望机器人自动化有限公司 一种人脸追踪摄像头以及人脸追踪系统
CN108737788B (zh) * 2018-06-05 2020-08-07 北京智行者科技有限公司 一种图像信息处理方法
CN113146609A (zh) * 2018-08-22 2021-07-23 胡开良 智能巡逻机器人
CN109740461B (zh) * 2018-12-21 2020-12-25 北京智行者科技有限公司 目标跟随后的处理方法
CN109849008A (zh) * 2019-02-21 2019-06-07 广州高新兴机器人有限公司 一种基于金库的机器人盘点方法及系统
JP7398300B2 (ja) * 2020-03-17 2023-12-14 アイホン株式会社 セキュリティシステム
CN116518948A (zh) * 2023-04-12 2023-08-01 山东省地质矿产勘查开发局第一地质大队(山东省第一地质矿产勘查院) 基于三维函数测绘进行地区环境预测勘探设备及测绘方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120316701A1 (en) * 2009-04-10 2012-12-13 United States Government, As Represented By The Secretary Of The Navy Spherical infrared robotic vehicle
CN202985566U (zh) * 2012-07-26 2013-06-12 王云 基于人脸识别的安保机器人
CN103576683A (zh) * 2012-08-03 2014-02-12 中国科学院深圳先进技术研究院 多巡逻机器人的调度方法和系统
CN104965426A (zh) * 2015-06-24 2015-10-07 百度在线网络技术(北京)有限公司 基于人工智能的智能机器人控制系统、方法和装置

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2458407C1 (ru) * 2011-03-02 2012-08-10 Общество с ограниченной ответственностью "ДиСиКон" (ООО "ДСК") Система и способ видеомониторинга леса
KR101417765B1 (ko) * 2014-02-07 2014-07-14 (주)지디일렉스 2차원 서모파일 어레이 적외선 열화상을 이용한 수배전반의 설비 영역별 열화 진단 시스템 및 그 방법
CN104457736A (zh) * 2014-11-03 2015-03-25 深圳市邦彦信息技术有限公司 一种获取目标位置信息的方法及装置
CN104853167A (zh) * 2015-05-15 2015-08-19 华中科技大学 基于飞行器平台的小区智能安防系统和小区智能安防方法
CN105157708A (zh) * 2015-10-10 2015-12-16 南京理工大学 基于图像处理与雷达的无人机自主导航系统及方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120316701A1 (en) * 2009-04-10 2012-12-13 United States Government, As Represented By The Secretary Of The Navy Spherical infrared robotic vehicle
CN202985566U (zh) * 2012-07-26 2013-06-12 王云 基于人脸识别的安保机器人
CN103576683A (zh) * 2012-08-03 2014-02-12 中国科学院深圳先进技术研究院 多巡逻机器人的调度方法和系统
CN104965426A (zh) * 2015-06-24 2015-10-07 百度在线网络技术(北京)有限公司 基于人工智能的智能机器人控制系统、方法和装置

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109213189A (zh) * 2018-07-19 2019-01-15 安徽共生物流科技有限公司 一种无人机巡线系统及动作判定方法
CN109614875A (zh) * 2018-11-16 2019-04-12 合肥未来计算机技术开发有限公司 一种基于运动规则的智能安防报警系统
CN109614875B (zh) * 2018-11-16 2022-11-08 合肥未来计算机技术开发有限公司 一种基于运动规则的智能安防报警系统
CN111435554A (zh) * 2019-01-11 2020-07-21 余招成 巡查追踪系统
CN111435554B (zh) * 2019-01-11 2022-01-28 余招成 巡查追踪系统
CN109849023B (zh) * 2019-04-10 2023-10-13 江苏方天电力技术有限公司 一种轨道悬挂智能巡检机器人系统
CN109849023A (zh) * 2019-04-10 2019-06-07 江苏方天电力技术有限公司 一种轨道悬挂智能巡检机器人系统
CN110275533A (zh) * 2019-06-25 2019-09-24 李子月 一种虚实结合的无人巡逻车系统
CN112333421A (zh) * 2020-09-15 2021-02-05 安徽龙运智能科技有限公司 一种基于5g的博物馆智能巡检系统
CN112347306A (zh) * 2020-09-29 2021-02-09 浙江大华技术股份有限公司 一种ptz摄像机监控跟踪方法、装置、系统和计算机设备
CN112270267B (zh) * 2020-10-29 2023-08-08 国网山东省电力公司淄博供电公司 可自动抓拍线路故障的摄像识别系统
CN112270267A (zh) * 2020-10-29 2021-01-26 国网山东省电力公司淄博供电公司 可自动抓拍线路故障的摄像识别系统
CN112446628A (zh) * 2020-12-01 2021-03-05 云南昆船数码科技有限公司 一种应用于物业管理的巡更管理系统及其管理方法
CN113554775A (zh) * 2021-06-01 2021-10-26 广东电网有限责任公司广州供电局 无人机电力巡检系统
CN113554775B (zh) * 2021-06-01 2023-05-30 广东电网有限责任公司广州供电局 无人机电力巡检系统
CN113536934A (zh) * 2021-06-17 2021-10-22 杭州电子科技大学 一种巡逻机器人执行追踪任务时的主动隐藏方法
CN113536934B (zh) * 2021-06-17 2024-02-02 杭州电子科技大学 一种巡逻机器人执行追踪任务时的主动隐藏方法
CN116437216A (zh) * 2023-06-12 2023-07-14 湖南博信创远信息科技有限公司 基于人工智能数据处理和视觉分析的工程监管方法及系统
CN116437216B (zh) * 2023-06-12 2023-09-08 湖南博信创远信息科技有限公司 基于人工智能数据处理和视觉分析的工程监管方法及系统

Also Published As

Publication number Publication date
CN107148777B (zh) 2019-11-08
CN107148777A (zh) 2017-09-08

Similar Documents

Publication Publication Date Title
WO2018076191A1 (zh) 智能巡逻设备、云端控制装置、巡逻方法、控制方法、机器人、控制器及非暂态计算机可读存储介质
US10370102B2 (en) Systems, apparatuses and methods for unmanned aerial vehicle
Wheeler et al. Face recognition at a distance system for surveillance applications
CN109887040B (zh) 面向视频监控的运动目标主动感知方法及系统
US10403107B2 (en) Passive optical detection method and system for vehicles
CN103168467B (zh) 使用热图像坐标的安防摄像机追踪和监控系统及方法
WO2018217260A3 (en) SYSTEMS AND METHODS FOR TRACKING AND CONTROLLING A MOBILE CAMERA FOR FORMING IMAGES OF OBJECTS OF INTEREST
KR101543542B1 (ko) 지능형 감시 시스템 및 이를 이용한 지능형 감시 방법
CN110619276B (zh) 基于无人机移动监控的异常及暴力检测系统和方法
CN110175587B (zh) 一种基于人脸识别和步态识别算法的视频追踪方法
WO2018076895A1 (zh) 基于主无人机的从无人机飞行控制方法、装置及系统
JP2007219948A (ja) ユーザ異常検出装置、及びユーザ異常検出方法
Monajjemi et al. UAV, do you see me? Establishing mutual attention between an uninstrumented human and an outdoor UAV in flight
CN106056624A (zh) 无人机高清图像小目标检测与跟踪系统及其检测跟踪方法
JP2016177640A (ja) 映像監視システム
WO2020135187A1 (zh) 基于rgb_d和深度卷积网络的无人机识别定位系统和方法
WO2018103005A1 (zh) 电子设备以及对监控目标进行监控的方法和装置
WO2018222932A1 (en) Video recording by tracking wearable devices
Fawzi et al. Embedded real-time video surveillance system based on multi-sensor and visual tracking
CN107547865A (zh) 跨区域人体视频目标跟踪智能监控方法
JP7035272B2 (ja) 撮影システム
WO2018121730A1 (zh) 视频监控和人脸识别方法、装置及系统
JP6482855B2 (ja) 監視システム
CN108898056B (zh) 一种消防单兵装备与人员快速匹配系统
Mohan et al. UAV based security system for prevention of harassment against woman

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16919603

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 20.08.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16919603

Country of ref document: EP

Kind code of ref document: A1