Detailed Description
Specific embodiments of the present disclosure are described in detail below with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are intended only to illustrate and explain the embodiments of the present disclosure, not to limit them.
Before the electronic device and the method and apparatus for monitoring a monitoring target according to various embodiments of the present disclosure are described in detail, a scenario to which the embodiments are applicable is first introduced. The electronic device and the method and apparatus for monitoring a monitoring target according to the embodiments of the present disclosure are suitable for a cloud server. The cloud server stores, for example, the monitoring capabilities (e.g., speed capability, monitorable range, etc.) and statuses (e.g., idle status, monitoring status, etc.) of all monitoring devices in a certain geographic area. The cloud server may communicate with a monitoring device wirelessly (for example, where the monitoring device is capable of wireless communication, such as a drone or a robot) or over a wired connection (for example, where the monitoring device can communicate only over a wire, such as a street surveillance camera). In addition, all monitoring devices communicate with the cloud server in real time, so that the cloud server knows the current position and current state of each monitoring device at all times.
According to one embodiment of the present disclosure, a method of monitoring a monitored target is provided. The method may be applied to, for example, a cloud server. As shown in fig. 1, the method may include the following steps S101 to S103.
In step S101, a comprehensive monitoring priority of a monitoring target is determined;
in step S102, a monitoring device group that monitors the monitoring target is updated based on the comprehensive monitoring priority;
in step S103, an instruction for monitoring the monitoring target is sent to the updated monitoring device group.
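Steps S101 to S103 can be sketched as one scheduling cycle on the cloud server. The following Python sketch is purely illustrative: the class, function names, and the simple group-update rule are hypothetical and are not the claimed implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    """Stand-in for a monitoring device reachable from the cloud server."""
    name: str
    instructions: list = field(default_factory=list)

    def send_instruction(self, command, target):
        # In a real system this would be a wired or wireless message.
        self.instructions.append((command, target))

def monitor_cycle(target, group, idle_devices, priority):
    """One cycle for a single target, given its S101 priority tuple.

    S102 here uses a deliberately naive rule: the most urgent level (1)
    pulls in one idle device.  S103 sends the instruction to every member.
    """
    if priority[1] == 1 and idle_devices:        # S102 (naive update rule)
        group = group + [idle_devices.pop(0)]
    for device in group:                         # S103
        device.send_instruction("monitor", target)
    return group
```

A usage example: an urgent target monitored by one camera gains an idle patrol robot, and both members then receive the monitoring instruction.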
According to the above technical solution, the cloud server can update the monitoring device group that monitors the monitoring target based on the comprehensive monitoring priority of the target, and then send an instruction for monitoring the target to the updated group. Thus, after any monitoring device finds a monitoring target, the cloud server can schedule one or more monitoring devices to jointly monitor the found target, and the scheduling requires no manual dispatching, which saves manpower and material resources. In addition, the cloud server can update the monitoring device group in real time according to the comprehensive monitoring priority, so that other available monitoring devices can be added to the group before the target leaves the current monitoring range. This expands the monitoring range of the group and ensures that, wherever the target moves, it remains monitored until it is apprehended or the monitoring task is completed, avoiding the situation where the target is lost and cannot be found again.
In a possible embodiment, as shown in fig. 2, the step of determining the comprehensive monitoring priority of the monitoring target in step S101 may include the following steps S101a to S101c.
In step S101a, a monitoring priority of the monitoring target associated with the identity of the monitoring target is obtained.
This step may be implemented as follows. During automatic patrol, monitoring devices such as drones, robots, and street surveillance cameras monitor the surrounding environment through their cameras, and detect suspicious persons, suspicious robots, or suspicious pursuit targets in real time through one or more of image recognition, face recognition, iris recognition, action recognition, wireless-signal recognition, license-plate recognition, number recognition, and the like. After a suspicious target is found, the monitoring device locks onto it and captures an image (e.g., a photograph or video). The monitoring device may then identify the suspicious target (i.e., the monitoring target) and send the identified identity information to the cloud server, or it may send the captured image to the cloud server and let the cloud server identify the target. After obtaining the identity information of the monitoring target, the cloud server can determine the monitoring priority of the target according to a preset rule, for example based on a preset correspondence between target identities and monitoring priorities, or based on factors such as monitoring-event analysis and hazard-level analysis. For example, the monitoring priority for theft may be set lower than that for armed robbery, since the latter is more hazardous.
In addition, the cloud server may divide the monitoring priority of each monitoring target into levels such as A, B, and C, where A denotes the highest monitoring priority and C the lowest. Those skilled in the art should understand that A, B, and C are merely examples, and the monitoring priority may be divided more finely according to the actual situation. Moreover, as long as the monitoring target does not change, its monitoring priority does not change.
In step S101b, the remaining time length before the monitoring target leaves the monitoring range of the monitoring device group is obtained.
This step may be implemented as follows: obtain, for each member of the monitoring device group, the remaining time length before the monitoring target leaves that member's monitorable range; and take the maximum of the obtained remaining time lengths as the remaining time length before the monitoring target leaves the monitoring range of the monitoring device group.
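As a minimal illustration of this rule (the helper function is a hypothetical sketch in Python):

```python
def group_remaining_time(member_remaining_times):
    """Remaining time before the target leaves the group's combined range.

    The target stays monitored as long as at least one group member still
    covers it, so the group-level remaining time is the maximum over the
    per-member remaining times.
    """
    if not member_remaining_times:
        raise ValueError("the monitoring device group is empty")
    return max(member_remaining_times)
```

For instance, with members whose remaining times are 12 s, 45 s, and 30 s, the group as a whole can still monitor the target for 45 s.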
In addition, the remaining time length of the monitoring target leaving the monitorable range of each monitoring device member of the monitoring device group may be calculated by each monitoring device member and then sent to the cloud server, or the remaining time length of the monitoring target leaving the monitorable range of each monitoring device member of the monitoring device group may be calculated by the cloud server.
The remaining time length before the monitoring target leaves the monitorable range of each monitoring device member may be calculated as follows. First, the real-time three-dimensional coordinates of the monitoring target are calculated using the relative position relationship. If the monitoring target is a suspicious wireless signal, the intersection of the signal's source direction with a building or object is taken as the target's three-dimensional coordinate point. Next, motion information of the monitoring target, such as speed, acceleration, and traveling direction, is determined from the real-time three-dimensional coordinates. Finally, based on the real-time three-dimensional coordinates, the monitoring capabilities of the monitoring device (e.g., speed capability, monitorable range, etc.), and the motion information of the target, the remaining time length before the target leaves the monitorable range of the device is calculated.
The following takes a patrol drone as an example to describe how the real-time three-dimensional coordinates of the monitoring target are calculated using the relative position relationship. The calculation principle for a ground patrol robot and the like is similar.
As shown in fig. 8, assume that the XY plane in fig. 8 is sea level (not the ground), that the patrol drone is currently at point A in space, and that at some moment it finds a monitoring target 800 at point B on the ground. At any moment, the projection of the patrol drone onto sea level is the origin O of the XY axes (when the drone moves, the XY axes move with it); the X axis always points due east and the Y axis due north.
The patrol drone can obtain the straight-line distance from itself to the monitoring target 800, denoted d, by means of infrared laser ranging, dual-camera ranging, or the like. Its precise altitude can be obtained from a height sensor; let the currently measured altitude be h. The drone always knows the angle α between the camera's optical axis and the vertical direction (obtainable from the camera's rotating mechanism). Via a geomagnetic sensor, the drone always knows its own absolute heading, so the absolute direction of OC, the projection of the optical-axis direction onto sea level (C being the projection of target point B onto sea level), is also known from the camera's rotating mechanism; let the positive angle between OC and the X axis be β. Finally, the patrol drone always knows its own longitude and latitude from a satellite positioning system.
In the right triangle ABD, where D is the foot of the perpendicular from target point B to the vertical line AO, it can be seen that:
BD = d*sinα;
AD = d*cosα;
further, it is possible to obtain:
OC = BD = d*sinα;
OD = h - AD = h - d*cosα;
BC = OD;
where BC is the altitude of the monitoring target 800 at point B.
From the length of OC and the angle β, the coordinates of point C in the XY plane are found to be (d*sinα*cosβ, d*sinα*sinβ);
The longitude and latitude of point O are known (they are the same as those of point A). When point O moves a certain distance east-west or north-south, the resulting change in longitude or latitude is determined. Assume a proportionality coefficient j between the change in longitude of point O and its east-west displacement, and a proportionality coefficient k between the change in latitude of point O and its north-south displacement (the values of j and k differ for different longitudes and latitudes of point O, but once the longitude and latitude of point O are determined, j and k are fixed values). The longitude/latitude offset of target point C relative to point O is then (j*d*sinα*cosβ, k*d*sinα*sinβ). Therefore, with the longitude and latitude of point O denoted (m, n), the absolute longitude, latitude, and altitude of the monitoring target 800 are:
(m±j*d*sinα*cosβ,n±k*d*sinα*sinβ,h-d*cosα)
Here, whether the longitude and latitude offsets relative to (m, n) are added or subtracted depends on whether the patrol drone is currently in the northern or southern hemisphere and in the eastern or western hemisphere (for example, a northward offset increases the latitude in the northern hemisphere and decreases it in the southern hemisphere).
Thus, the absolute three-dimensional coordinates of the monitoring target 800 are obtained. Their accuracy is clearly higher than that of plane coordinates. For example, positions above and below a bridge share the same longitude and latitude, which may send capture personnel to the wrong place; or the pursued target may itself be a drone that can ascend and descend, whose height cannot be judged from plane coordinates, preventing a successful capture.
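The geometric derivation above can be checked with a short sketch. All inputs are illustrative, and the sketch assumes the north-eastern hemisphere case from fig. 8, so both offsets are added; j and k are the degrees-per-meter coefficients introduced above.

```python
import math

def target_coordinates(d, h, alpha_deg, beta_deg, lon_m, lat_n, j, k):
    """Absolute (longitude, latitude, altitude) of the target, per fig. 8.

    d: drone-to-target straight-line distance; h: drone altitude;
    alpha: angle between the camera optical axis and the vertical;
    beta: angle between OC (optical-axis projection) and the east X axis;
    (lon_m, lat_n): drone longitude/latitude; j, k: degrees per meter.
    """
    a = math.radians(alpha_deg)
    b = math.radians(beta_deg)
    east = d * math.sin(a) * math.cos(b)    # OC component along X
    north = d * math.sin(a) * math.sin(b)   # OC component along Y
    alt = h - d * math.cos(a)               # BC = OD = h - AD
    return (lon_m + j * east, lat_n + k * north, alt)
```

For example, with d = 100 m, h = 80 m, α = 60°, and β = 0° the target lies 30 m above sea level, due east of the drone's projection point.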
The following illustrates how the remaining time length before the monitoring target leaves the monitorable range of each monitoring device is calculated.
In example one, the monitoring device is a street surveillance camera whose monitorable range is a circle of radius r. The current straight-line distance between the monitoring target and the camera is d, the speed of the target is v, and its traveling direction is due north. The remaining time length t during which the camera can still monitor the target is: t = (r - d)/v.
In example two, the monitoring device is a patrol robot whose monitorable range is a circle of radius r. The current straight-line distance between the monitoring target and the patrol robot is d, the speed of the target is v1, the speed of the robot is v2, and both travel due north. If v2 ≥ v1, the target can remain within the robot's monitorable range indefinitely. If v2 < v1 (perhaps because the robot's speed capability cannot keep up with the target, or because the robot has detected that the target may be aware of being monitored and therefore slows down and lets other monitoring devices continue the tracking), the remaining time length t during which the robot can still monitor the target is: t = (r - d)/(v1 - v2).
In example three, the monitoring device is a patrol drone whose effective viewing angle covers a circle on the ground. The current straight-line distance between the monitoring target and the drone's ground projection point is d, the straight-line distance from the drone to the outermost point of its effective ground coverage is l (because the farthest point ahead covered by the image may be blocked by buildings, mountains, tunnels, and the like, only the farthest point the drone can actually monitor is considered here), the current height of the drone is h, the speed of the target is v1, the speed of the drone is v2, and both travel due north. If v2 ≥ v1, the target can remain within the drone's effective monitoring range indefinitely. If v2 < v1 (perhaps because the drone's speed capability cannot keep up with the target, or because the drone has detected that the target may be aware of being monitored and therefore slows down and lets other monitoring devices continue the tracking), the remaining time length t during which the drone can still monitor the target is: t = (√(l² - h²) - d)/(v1 - v2), where √(l² - h²) is the ground radius of the effective coverage.
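The three examples can be written out directly as code (a sketch for illustration; the drone-case formula is inferred from the stated geometry, where the edge of the ground coverage lies √(l² − h²) from the drone's ground projection):

```python
import math

def t_fixed_camera(r, d, v):
    """Example 1: fixed camera with coverage radius r, target at distance d
    moving away at speed v."""
    return (r - d) / v

def t_patrol_robot(r, d, v1, v2):
    """Example 2: robot pursuing at v2 < v1; the gap opens at v1 - v2."""
    assert v2 < v1, "with v2 >= v1 the target never leaves the range"
    return (r - d) / (v1 - v2)

def t_patrol_drone(l, h, d, v1, v2):
    """Example 3: drone at height h; the outermost monitorable ground point
    is sqrt(l^2 - h^2) from the drone's ground projection (inferred)."""
    assert v2 < v1, "with v2 >= v1 the target never leaves the range"
    return (math.sqrt(l * l - h * h) - d) / (v1 - v2)
```

For instance, a drone at height 50 m with l = 130 m has a ground coverage radius of 120 m; with d = 60 m, v1 = 6 m/s, and v2 = 3 m/s, the remaining time is 20 s.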
it will be appreciated by those skilled in the art that the above three examples are merely theoretical mathematical analyses and are all simpler cases, for illustrative purposes only. Actually, for reasons of various landforms, turning or blocking, and the like, the remaining time length t of the monitored target departing from the monitoring range of each monitoring device needs to be estimated by combining a three-dimensional map and mathematical calculation. Due to the limited computing power of the monitoring device, the success rate of calculating the remaining time length t of the monitored target leaving the monitorable range by the monitoring device itself is low, but the calculation delay is low. If the cloud server is used for computing, due to the fact that computing capability of the cloud server is high, estimation of high accuracy can be achieved, and the estimation accuracy can be gradually improved by combining with experience accumulation of deep learning. Therefore, under the condition that the monitoring device can transmit the real-time image of the monitored target to the cloud server with low delay, the cloud server can estimate the remaining time length t of the monitored target out of the monitoring range of the monitoring device better.
In step S101c, the comprehensive monitoring priority is determined based on the monitoring priority and the remaining time length.
In this step, the cloud server may determine the comprehensive monitoring priority directly from the monitoring priority and the remaining time length before the monitoring target leaves the monitoring range of the monitoring device group. Alternatively, the cloud server may first determine a monitoring urgency level according to the preset range into which the remaining time length falls: for example, a remaining time of 0-1 minute may correspond to urgency level 1, 1-3 minutes to level 2, and 3 minutes or more to level 3, with level 1 being the most urgent. Those skilled in the art should understand that these preset ranges are merely an example, and the embodiments of the present disclosure do not limit them. The cloud server may then determine the comprehensive monitoring priority of the target based on the monitoring priority and the monitoring urgency. The comprehensive monitoring priorities of different targets may be the same or different, and may be denoted, for example, A1, A2, A3, B1, B2, B3, C1, C2, C3, where A, B, C denote the monitoring priority and 1, 2, 3 denote the monitoring urgency; A1 denotes the highest comprehensive monitoring priority and C3 the lowest.
In addition, when determining the comprehensive monitoring priority, the monitoring priority is considered first and the monitoring urgency second. That is, for a monitoring target M with a high monitoring priority, even if its current monitoring urgency is lower than that of a monitoring target N with a low monitoring priority, the comprehensive monitoring priority of M is still higher than that of N. For example, suppose target M has monitoring priority A and urgency level 3, while target N has monitoring priority B and urgency level 1. Although N's urgency level 1 is higher than M's level 3, N's monitoring priority B is lower than M's priority A, so M's comprehensive monitoring priority A3 is still higher than N's B1, i.e., A3 > B1.
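The priority-first, urgency-second ordering can be expressed with a simple tuple key, sketched below using the example levels above (function names are illustrative only):

```python
def urgency_level(remaining_s):
    """Urgency from remaining time, using the example ranges in the text:
    0-1 minute -> level 1 (most urgent), 1-3 minutes -> 2, 3+ minutes -> 3."""
    if remaining_s < 60:
        return 1
    if remaining_s < 180:
        return 2
    return 3

def comprehensive_priority(priority_letter, remaining_s):
    """Sort key: the letter ('A' < 'B' < 'C') is compared before the urgency,
    so A3 outranks B1, matching the targets M and N example."""
    return (priority_letter, urgency_level(remaining_s))
```

Because Python compares tuples element by element, sorting targets by this key handles any target with priority A before any with priority B, regardless of urgency.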
In addition, when the cloud server determines the comprehensive monitoring priority directly from the monitoring priority and the remaining time length (rather than from preset remaining-time ranges), the monitoring urgency of the target is in effect divided into infinitely many levels.
Moreover, since the remaining time length before the monitoring target leaves the monitoring range of the monitoring device group changes in real time, the monitoring urgency also changes over time.
Through the steps S101a to S101c, the comprehensive monitoring priority of the monitoring target can be effectively determined, so that the monitoring device group can be updated in real time according to the comprehensive monitoring priority.
In a possible implementation, the step of updating the monitoring device group that monitors the monitoring target based on the comprehensive monitoring priority in step S102 may include adding a monitoring device that meets a first preset condition to the monitoring device group, where the first preset condition may include:
(1) after joining the monitoring device group, the monitoring device can lower the comprehensive monitoring priority of the monitoring target or extend the remaining time length before the target leaves the group's monitoring range (in the case where a monitoring urgency level is determined for the target, joining the group can lower that urgency level); and
(2) one of the following: (i) the comprehensive monitoring priority of the monitoring device's current target is lower than that of the target monitored by the group to be joined, and the device's exit from its current group neither raises the comprehensive monitoring priority of its current target to a level higher than or equal to that of the target monitored by the group to be joined, nor interrupts the monitoring of the current target (for example, if the device is the only one monitoring its current target, its exit would interrupt that monitoring); or (ii) the comprehensive monitoring priority of the device's current target is higher than that of the target monitored by the group to be joined, and the device's exit from its current group neither raises the comprehensive monitoring priority of its current target nor interrupts the monitoring of that target.
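The first preset condition can be sketched as a predicate. All parameters are hypothetical abstractions of the checks described above; in this sketch a larger number means a higher comprehensive monitoring priority, and how each input is computed is left open.

```python
def may_join(helps_target, current_prio, new_prio,
             prio_after_exit, exit_interrupts):
    """First preset condition (illustrative sketch only).

    helps_target    -- condition (1): joining lowers the new target's
                       comprehensive priority / extends its remaining time.
    current_prio    -- comprehensive priority of the device's current target
                       (None if the device is idle).
    new_prio        -- comprehensive priority of the target to be joined.
    prio_after_exit -- the current target's priority if this device leaves.
    exit_interrupts -- True if leaving interrupts the current target's
                       monitoring (e.g. the device is its only monitor).
    """
    if not helps_target:
        return False                         # fails condition (1)
    if current_prio is None:
        return True                          # idle device: condition (2) moot
    if exit_interrupts:
        return False
    if current_prio < new_prio:              # case (i): new target outranks
        return prio_after_exit < new_prio    # must stay strictly below it
    return prio_after_exit <= current_prio   # case (ii): must not rise at all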
In a possible implementation, updating the monitoring device group based on the comprehensive monitoring priority in step S102 may further include removing, from the group, any monitoring device that meets a second preset condition, so that the removed device can go on to monitor other targets and resources are not wasted. The second preset condition includes any one, or a combination, of the following:
(1) removing the monitoring device does not raise the comprehensive monitoring priority of the monitoring target;
(2) the monitoring target has left the monitorable range of the monitoring device;
(3) the current capability of the monitoring device is insufficient to continue monitoring the monitored target.
In addition, after a monitoring device is removed from its current monitoring device group, the cloud server may set its state to idle, so that the removed device is free to discover new monitoring targets or be scheduled to other monitoring tasks.
In a possible implementation, as shown in fig. 3, the method according to this embodiment may further include the following steps S301 to S303.
In step S301, it is determined whether the monitoring target has left the monitoring range of the monitoring device group while no available monitoring device can be added to the group.
In step S302, when the monitoring target has departed from the monitoring range of the monitoring device group and no available monitoring device can be added to the monitoring device group, a motion trajectory of the monitoring target is obtained.
In this step, the cloud server may predict the motion trajectory of the monitoring target by combining a three-dimensional map with information such as the three-dimensional coordinates, speed, acceleration, and traveling direction of the target at the moment it disappeared.
In step S303, an instruction for monitoring the monitoring target is sent to the monitoring device on the motion trajectory.
In this step, an image of the monitoring target may be sent to the monitoring devices on the predicted motion trajectory, so that those devices can continue to monitor the target.
The embodiment of the disclosure also provides a device for monitoring the monitored target, which can be applied to a cloud server. As shown in fig. 4, the apparatus may include:
a determining module 401, configured to determine a comprehensive monitoring priority of a monitoring target;
an updating module 402, configured to update a monitoring device group that monitors the monitoring target based on the comprehensive monitoring priority;
a sending module 403, configured to send an instruction for monitoring the monitoring target to the updated monitoring device group.
According to the above technical solution, the cloud server can update the monitoring device group that monitors the monitoring target based on the comprehensive monitoring priority of the target, and then send an instruction for monitoring the target to the updated group. Thus, after any monitoring device finds a monitoring target, the cloud server can schedule one or more monitoring devices to jointly monitor the found target, and the scheduling requires no manual dispatching, which saves manpower and material resources. In addition, the cloud server can update the monitoring device group in real time according to the comprehensive monitoring priority, so that other available monitoring devices can be added to the group before the target leaves the current monitoring range. This expands the monitoring range of the group and ensures that, wherever the target moves, it remains monitored until it is apprehended or the monitoring task is completed, avoiding the situation where the target is lost and cannot be found again.
In one possible implementation, as shown in fig. 5, the determining module 401 may include:
a monitoring priority obtaining sub-module 401a, configured to obtain a monitoring priority of the monitoring target, where the monitoring priority is associated with the identity of the monitoring target;
a remaining time length obtaining sub-module 401b, configured to obtain the remaining time length before the monitoring target leaves the monitoring range of the monitoring device group;
a comprehensive monitoring priority determining sub-module 401c, configured to determine the comprehensive monitoring priority based on the monitoring priority and the remaining time length.
In a possible implementation, the remaining time length obtaining sub-module 401b may be further configured to:
obtain, for each member of the monitoring device group, the remaining time length before the monitoring target leaves that member's monitorable range; and
take the maximum of the obtained remaining time lengths as the remaining time length before the monitoring target leaves the monitoring range of the monitoring device group.
In a possible implementation, the updating module 402 updating the monitoring device group based on the comprehensive monitoring priority may include adding a monitoring device that meets a first preset condition to the monitoring device group, where the first preset condition includes:
(1) after joining the monitoring device group, the monitoring device can lower the comprehensive monitoring priority of the monitoring target or extend the remaining time length before the target leaves the group's monitoring range (in the case where a monitoring urgency level is determined for the target, joining the group can lower that urgency level); and
(2) one of the following: (i) the comprehensive monitoring priority of the monitoring device's current target is lower than that of the target monitored by the group to be joined, and the device's exit from its current group neither raises the comprehensive monitoring priority of its current target to a level higher than or equal to that of the target monitored by the group to be joined, nor interrupts the monitoring of the current target; or (ii) the comprehensive monitoring priority of the device's current target is higher than that of the target monitored by the group to be joined, and the device's exit from its current group neither raises the comprehensive monitoring priority of its current target nor interrupts the monitoring of that target.
In a possible implementation, the updating module 402 updating the monitoring device group based on the comprehensive monitoring priority may further include removing, from the group, any monitoring device that meets a second preset condition, where the second preset condition includes any one, or a combination, of the following:
(1) removing the monitoring device does not raise the comprehensive monitoring priority of the monitoring target;
(2) the monitoring target has left the monitorable range of the monitoring device;
(3) the current capability of the monitoring device is insufficient to continue monitoring the monitored target.
In a possible implementation manner, as shown in fig. 6, the apparatus according to this embodiment may further include an obtaining module 404:
the determining module 401 is further configured to determine whether the monitoring target has left the monitoring range of the monitoring device group and whether no available monitoring device can be added to the monitoring device group;
the obtaining module 404 is configured to obtain a motion trajectory of the monitoring target when the monitoring target has left the monitoring range of the monitoring device group and no available monitoring device can be added to the monitoring device group;
the sending module 403 is further configured to send an instruction for monitoring the monitoring target to a monitoring device on the motion trajectory.
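The trajectory-based hand-off performed by modules 401, 404, and 403 can be sketched as follows. The `Camera` class, the point-list trajectory, and the `send` callback are illustrative assumptions, not the disclosed interfaces: the sketch simply instructs every device whose monitoring range intersects the predicted trajectory.

```python
import math
from dataclasses import dataclass
from typing import Callable, List, Tuple

Point = Tuple[float, float]

@dataclass
class Camera:
    name: str
    x: float
    y: float
    radius: float  # monitorable range, reduced to a radius for illustration

    def covers(self, p: Point) -> bool:
        return math.hypot(self.x - p[0], self.y - p[1]) <= self.radius

def dispatch_along_trajectory(
    trajectory: List[Point],
    devices: List[Camera],
    send: Callable[[Camera], None],
) -> List[Camera]:
    """Send a monitoring instruction to every device whose monitoring range
    intersects the target's predicted motion trajectory (obtaining module 404
    supplies the trajectory; sending module 403 issues the instruction)."""
    notified = []
    for dev in devices:
        if any(dev.covers(p) for p in trajectory):
            send(dev)            # instruct this device to monitor the target
            notified.append(dev)
    return notified
```

A street camera sitting near a future trajectory point would thus be instructed ahead of the target's arrival, while distant devices are left untouched.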
Specific implementations of the operations performed by each module of the apparatus according to the embodiments of the present disclosure have been described in detail in the method embodiments above and are not repeated here.
Fig. 7 is a block diagram illustrating an apparatus 500 for monitoring a monitoring target according to an exemplary embodiment, where the apparatus 500 may be an electronic device. As shown in fig. 7, the apparatus 500 may include: a processor 501, a memory 502, a multimedia component 503, an input/output (I/O) interface 504, and a communication component 505.
The processor 501 is configured to control the overall operation of the apparatus 500 so as to complete all or part of the steps of the above-described method for monitoring a monitoring target. The memory 502 is configured to store various types of data to support the operation of the apparatus 500, such as instructions for any application or method operating on the apparatus 500, and application-related data such as contact data, messages, pictures, audio, video, and so forth. The memory 502 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk. The multimedia component 503 may include a screen and an audio component. The screen may be, for example, a touch screen, and the audio component is configured to output and/or input audio signals. For example, the audio component may include a microphone for receiving external audio signals; the received audio signal may further be stored in the memory 502 or transmitted through the communication component 505. The audio component also includes at least one speaker for outputting audio signals. The I/O interface 504 provides an interface between the processor 501 and other interface modules such as a keyboard, a mouse, or buttons, where the buttons may be virtual buttons or physical buttons. The communication component 505 is configured for wired or wireless communication between the apparatus 500 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, near field communication (NFC), 2G, 3G, or 4G, or a combination of one or more of them, and accordingly the communication component 505 may include: a Wi-Fi module, a Bluetooth module, and an NFC module.
In an exemplary embodiment, the apparatus 500 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for performing the above-described method for monitoring a monitoring target.
In another exemplary embodiment, a non-transitory computer-readable storage medium including instructions, such as the memory 502 including instructions, is also provided, where the instructions are executable by the processor 501 of the apparatus 500 to perform the above-described method for monitoring a monitoring target. The non-transitory computer-readable storage medium may be, for example, a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
According to yet another embodiment of the present disclosure, a computer program product is provided, where the computer program product includes a computer program executable by a programmable apparatus, and the computer program has code portions for performing the above-described method of monitoring a monitoring target when executed by the programmable apparatus.
According to still another embodiment of the present disclosure, there is provided a non-transitory computer-readable storage medium including one or more programs for executing the above-described method of monitoring a monitoring target.
According to still another embodiment of the present disclosure, there is provided an electronic device including: the non-transitory computer-readable storage medium described above; and one or more processors configured to execute the programs in the non-transitory computer-readable storage medium.
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings. However, the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solutions of the present disclosure within the scope of the technical idea of the present disclosure; these simple modifications all fall within the protection scope of the present disclosure.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner without departing from the scope of the present disclosure. To avoid unnecessary repetition, the various possible combinations are not separately described in the present disclosure.
In addition, any combination of the various embodiments of the present disclosure may also be made, and such combinations should likewise be regarded as part of the disclosure of the present disclosure, as long as they do not depart from the spirit of the present disclosure.