CN111757046B - Data processing method, device, equipment and machine readable medium - Google Patents


Info

Publication number
CN111757046B
CN111757046B (application number CN201910245758.4A)
Authority
CN
China
Prior art keywords
picture
monitoring
monitoring area
equipment
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910245758.4A
Other languages
Chinese (zh)
Other versions
CN111757046A (en)
Inventor
陈刚先
莫鋆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cainiao Smart Logistics Holding Ltd
Original Assignee
Cainiao Smart Logistics Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cainiao Smart Logistics Holding Ltd filed Critical Cainiao Smart Logistics Holding Ltd
Priority to CN201910245758.4A priority Critical patent/CN111757046B/en
Publication of CN111757046A publication Critical patent/CN111757046A/en
Application granted granted Critical
Publication of CN111757046B publication Critical patent/CN111757046B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/643Communication protocols
    • H04N21/6437Real-time Transport Protocol [RTP]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/647Control signaling between network components and server or clients; Network processes for video distribution between server and clients, e.g. controlling the quality of the video stream, by dropping packets, protecting content from unauthorised alteration within the network, monitoring of network load, bridging between two different networks, e.g. between IP and wireless

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Security & Cryptography (AREA)
  • Alarm Systems (AREA)

Abstract

The embodiment of the application provides a data processing method, a data processing device, equipment and a machine readable medium, wherein the method is applied to first equipment and specifically comprises the following steps: acquiring a video corresponding to a monitoring area from a video source; capturing a video corresponding to the monitoring area to obtain a picture corresponding to the monitoring area; and uploading the picture corresponding to the monitoring area to second equipment. The embodiment of the application can meet the requirement of remote monitoring under the condition of reducing uploading cost.

Description

Data processing method, device, equipment and machine readable medium
Technical Field
The present application relates to the field of monitoring technologies, and in particular, to a data processing method, a data processing apparatus, a device, and a machine-readable medium.
Background
With the rapid development of monitoring technology and network communication technology, network-based monitoring systems are increasingly widely used. For example, the monitoring system of a warehouse storage center is gradually developing towards intelligent and remote monitoring; by monitoring conditions in the warehouse in real time, the requirement of remotely monitoring stored objects anytime and anywhere can be met.
Current remote monitoring processes typically include: receiving a video stream uploaded from a monitoring site and storing the video stream in a centralized manner; then acquiring the video stream from the centralized storage location and playing it back, thereby meeting the remote monitoring requirement.
In practical applications, uploading a video stream usually requires a large amount of upstream bandwidth. Due to operator limitations, current civil (consumer-grade) bandwidth is usually low, such as 4 Mbps, and therefore cannot meet the requirement of remote monitoring. In this case, in order to meet the requirement of remote monitoring, a hardware upgrade is required to upgrade the civil bandwidth to dedicated bandwidth; for example, the bandwidth upgrade can be realized by modifying hardware such as routers and modems. However, upgrading the hardware increases material costs.
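A back-of-envelope calculation illustrates the bandwidth gap described above. The numbers here are illustrative assumptions, not values from the patent: an H.264 camera stream of roughly 2 Mbps, and a JPEG screenshot of roughly 200 KB uploaded every 5 seconds.

```python
def video_upload_mbps(bitrate_mbps: float, num_cameras: int) -> float:
    """Upstream bandwidth needed to stream every camera continuously."""
    return bitrate_mbps * num_cameras


def picture_upload_mbps(jpeg_kb: float, interval_s: float, num_cameras: int) -> float:
    """Average upstream bandwidth for uploading one screenshot per interval."""
    bits_per_shot = jpeg_kb * 1024 * 8      # KB -> bits
    return bits_per_shot / interval_s / 1_000_000 * num_cameras


if __name__ == "__main__":
    cameras = 8
    # Continuous video: 8 cameras x 2 Mbps = 16 Mbps, far above a 4 Mbps uplink.
    print(f"video: {video_upload_mbps(2.0, cameras):.1f} Mbps")
    # Screenshots: 200 KB every 5 s per camera averages about 2.6 Mbps in total.
    print(f"pictures: {picture_upload_mbps(200, 5.0, cameras):.2f} Mbps")
```

Under these assumptions, eight cameras streaming continuously need about 16 Mbps of uplink, while one screenshot per camera every 5 seconds averages roughly 2.6 Mbps, which fits within the 4 Mbps consumer uplink mentioned above.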
Disclosure of Invention
The technical problem to be solved by the embodiments of the present application is to provide a data processing method, which can meet the requirement of remote monitoring while reducing the uploading cost.
Correspondingly, the embodiment of the application also provides a data processing device, equipment and a machine readable medium, which are used for ensuring the realization and the application of the method.
In order to solve the above problem, an embodiment of the present application discloses a data processing method, which is applied to a first device, and the method includes:
acquiring a video corresponding to a monitoring area from a video source;
capturing a video corresponding to the monitoring area to obtain a picture corresponding to the monitoring area;
and uploading the picture corresponding to the monitoring area to second equipment.
In order to solve the above problem, an embodiment of the present application discloses a data processing method, which is applied to a second device, and the method includes:
receiving a picture corresponding to a monitoring area from first equipment; the picture corresponding to the monitoring area is obtained according to the video screenshot corresponding to the monitoring area;
and storing the picture corresponding to the monitoring area.
In order to solve the above problem, an embodiment of the present application discloses a data processing method, which is applied to a third device, and the method includes:
receiving a picture corresponding to the monitoring area from the second equipment; the picture corresponding to the monitoring area is obtained according to the video screenshot corresponding to the monitoring area;
and processing the picture corresponding to the monitoring area.
On the other hand, the embodiment of the present application further discloses a data processing apparatus, which is applied to a first device, and the apparatus includes:
the video acquisition module is used for acquiring videos corresponding to the monitoring area from the video source;
the video screenshot module is used for screenshot the video corresponding to the monitoring area to obtain the picture corresponding to the monitoring area; and
and the picture uploading module is used for uploading the picture corresponding to the monitoring area to second equipment.
On the other hand, the embodiment of the present application further discloses a data processing apparatus, which is applied to a second device, and the apparatus includes:
the image receiving module is used for receiving an image corresponding to the monitoring area from the first equipment; the picture corresponding to the monitoring area is obtained according to the video screenshot corresponding to the monitoring area; and
and the picture storage module is used for storing the picture corresponding to the monitoring area.
On the other hand, the embodiment of the present application further discloses a data processing apparatus, which is applied to a third device, and the apparatus includes:
the picture receiving module is used for receiving a picture corresponding to the monitoring area from the second equipment; the picture corresponding to the monitoring area is obtained according to the video screenshot corresponding to the monitoring area; and
and the picture processing module is used for processing the picture corresponding to the monitoring area.
In another aspect, an embodiment of the present application further discloses an apparatus, including:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform one or more of the methods described above.
In yet another aspect, embodiments of the present application disclose one or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform one or more of the methods described above.
Compared with the prior art, the embodiment of the application has the following advantages:
according to the embodiment of the application, the pictures corresponding to the monitoring area are obtained through the first equipment locally arranged in the monitoring area, and the pictures corresponding to the monitoring area are stored in a concentrated mode through the second equipment. According to the embodiment of the application, the remote monitoring of the monitoring area can be realized by the remote equipment such as the second equipment by checking the picture corresponding to the monitoring area, and the uploading cost of the picture can be less than that of the video, so that the remote monitoring requirement can be realized under the condition of reducing the uploading cost.
Drawings
FIG. 1 is a flow chart of steps of a first embodiment of a data processing method of the present application;
fig. 2 is a schematic architecture diagram of a first device according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating steps of a second embodiment of a data processing method according to the present application;
fig. 4 is a schematic architecture diagram of a monitoring application in a first device according to an embodiment of the present application;
FIG. 5 is a flowchart illustrating the steps of a third embodiment of a data processing method according to the present application;
FIG. 6 is a flowchart illustrating the steps of a fourth embodiment of a data processing method according to the present application;
FIG. 7 is a flowchart illustrating the steps of a fifth embodiment of a data processing method according to the present application;
FIG. 8 is a block diagram of a second device according to an embodiment of the present application;
FIG. 9 is a flowchart illustrating the steps of a sixth embodiment of a data processing method according to the present application;
FIG. 10 is a block diagram of a data processing apparatus according to an embodiment of the present application;
FIG. 11 is a block diagram of a data processing apparatus according to an embodiment of the present application;
FIG. 12 is a block diagram of a data processing apparatus according to an embodiment of the present application; and
fig. 13 is a schematic structural diagram of an apparatus provided in an embodiment of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description.
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, and not all embodiments. All other embodiments that can be derived from the embodiments given herein by a person of ordinary skill in the art are intended to be within the scope of the present disclosure.
While the concepts of the present application are susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the description above is not intended to limit the application to the particular forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the application.
Reference in the specification to "one embodiment," "an embodiment," "a particular embodiment," or the like, means that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may or may not necessarily include that particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, where a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described. In addition, it should be understood that an item in a list of the form "at least one of A, B and C" can mean: (A); (B); (C); (A and B); (A and C); (B and C); or (A, B and C). Likewise, an item listed in the form "at least one of A, B, or C" can mean: (A); (B); (C); (A and B); (A and C); (B and C); or (A, B and C).
In some cases, the disclosed embodiments may be implemented as hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be executed by one or more processors. A machine-readable storage medium may be implemented as a storage device, mechanism, or other physical structure (e.g., a volatile or non-volatile memory, a media disc, or another physical structure) for storing or transmitting information in a form readable by a machine.
In the drawings, some structural or methodical features may be shown in a particular arrangement and/or ordering. However, such specific arrangement and/or ordering is not necessarily required. Rather, in some embodiments, such features may be arranged in ways and/or orders different from those shown in the figures. Moreover, the inclusion of a structural or methodical feature in a particular figure does not imply that such a feature is required in all embodiments; in some embodiments, it may not be included, or it may be combined with other features.
The embodiment of the application can be applied to monitoring scenes. The monitoring scene can be used for realizing remote monitoring of the monitored object in the monitoring area. The monitoring area may refer to an area where a monitoring object is located. The monitoring object may include: at least one of goods, people, and vehicles.
For example, in a logistics monitoring scenario, the monitoring objects may include: goods, so that the stacking condition of the goods can be monitored. Alternatively, in the logistics monitoring scenario, the monitoring objects may include: goods and vehicles, so that cargo handling can be monitored.
As another example, in an office monitoring scenario, the monitoring objects may include: people, so that attendance can be monitored.
It can be understood that the logistics monitoring scene and the office monitoring scene are only used as optional embodiments of the monitoring scene, and actually, a person skilled in the art can apply the embodiment of the present application to a required monitoring scene according to the actual application requirements, and the embodiment of the present application does not limit the specific monitoring scene.
According to the embodiment of the application, the first device is arranged locally in the monitoring area, and the second device is arranged remotely from the monitoring area. The second device may be located in the cloud. The cloud may refer to the end opposite to the local end, which can provide cloud services over the Internet based on cloud technology. Cloud technology is a hosting technology that integrates resources such as hardware, software, and networks to realize the computation, storage, processing, and sharing of data.
The method comprises the steps that a first device obtains a video corresponding to a monitoring area from a video source, captures the video corresponding to the monitoring area to obtain a picture corresponding to the monitoring area, and uploads the picture corresponding to the monitoring area to a second device.
And the second equipment receives the picture corresponding to the monitoring area from the first equipment and stores the picture corresponding to the monitoring area.
According to the embodiment of the application, the pictures corresponding to the monitoring area are obtained through the first device locally arranged in the monitoring area, and the pictures corresponding to the monitoring area are stored in a centralized manner by the second device. Because remote monitoring of the monitoring area can be realized by viewing the pictures corresponding to the monitoring area, and the uploading cost of pictures can be less than that of video, the embodiment of the application can meet the remote monitoring requirement while reducing the uploading cost.
Method embodiment one
Referring to fig. 1, a flowchart illustrating steps of a first embodiment of a data processing method according to the present application is shown, and the method is applied to a first device, and specifically may include the following steps:
Step 101, acquiring a video corresponding to a monitoring area from a video source;
step 102, capturing the video corresponding to the monitoring area to obtain a picture corresponding to the monitoring area; and
step 103, uploading the picture corresponding to the monitoring area to the second device.
The first device may be located locally in the monitored area. The first device may be a device having processing capabilities. The CPU architectures supported by the first device may include: ARM (Advanced RISC Machine), x86, MIPS (Microprocessor without Interlocked Pipelined Stages), and the like.
In an alternative embodiment of the present application, the first device may be an edge computing device. The edge computing device, also called an end device, is a server deployed in a monitoring room local to the monitoring area, and the server has a monitoring application installed thereon, and the monitoring application can be used to execute at least one step of the method shown in fig. 1.
In step 101, a video source is available to provide video. The video source may have a video capture function. The video sources may include: cameras, video cameras, etc. The video source may be located within the surveillance area.
In an optional embodiment of the present application, the video corresponding to the monitoring area may be obtained from the video source through an RTSP (Real Time Streaming Protocol) address. Of course, RTSP is only an optional embodiment of a video transport protocol; in practice, a person skilled in the art may use other transport protocols as required, such as RTP (Real-time Transport Protocol), RTCP (RTP Control Protocol), and the like.
In this embodiment, the number of paths of the video source may be used to represent the number of video source paths connected to the first device. The number of paths of the video source may be equal to 1 or greater than 1.
In step 102, a video screenshot technology may be used to take screenshots of the video corresponding to the monitored area to obtain pictures corresponding to the monitored area. The video screenshot technology can continuously capture pictures from the video corresponding to the monitoring area according to a screenshot frequency. The screenshot frequency can be determined by those skilled in the art according to the actual application requirements; for example, the screenshot interval may be 1 s, 5 s, 10 s, and the like.
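The interval-based screenshot logic above can be sketched as selecting every N-th frame of the video. In practice the frames would come from an RTSP stream (e.g. opened via OpenCV's `cv2.VideoCapture`); the helper below is an illustrative sketch, not a function from the patent:

```python
def frames_to_capture(fps: float, interval_s: float, duration_s: float) -> list[int]:
    """Return the frame indices to save so that roughly one picture
    is captured per `interval_s` seconds of video."""
    step = max(1, round(fps * interval_s))  # frames between screenshots
    total = int(fps * duration_s)           # total frames in the clip
    return list(range(0, total, step))
```

For a 25 fps stream and a 5 s screenshot interval, every 125th frame is kept, i.e. six screenshots from a 30 s clip.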
In step 103, the picture corresponding to the monitoring area may be uploaded to the second device by using a data transmission channel between the first device and the second device. The data transmission channel may be a channel of a public network, such as the Internet.
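Since the data transmission channel runs over the public network, the upload in step 103 would normally need to tolerate transient failures. A hedged sketch of such retry logic follows; `send` is a hypothetical caller-supplied callable (for example, wrapping an HTTP POST to the second device), not an API from the patent:

```python
import time


def upload_with_retry(send, picture: bytes, retries: int = 3, backoff_s: float = 1.0):
    """Push one picture over the data channel, retrying on network failure."""
    last_err = None
    for attempt in range(retries):
        try:
            return send(picture)
        except OSError as err:                     # network-level failure
            last_err = err
            time.sleep(backoff_s * (attempt + 1))  # linear backoff
    raise last_err
```

The caller supplies the transport; the wrapper only adds bounded retries with backoff, so a brief public-network hiccup does not lose a screenshot.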
Optionally, a management channel between the first device and the second device may be utilized to implement management of the first device by the second device. The management may include: remote login to the first device by the second device, updating of the monitoring application, data monitoring, hardware resource information monitoring, and the like. The management channel may be a channel of a private network.
Optionally, the management channel may specifically include: a first channel and a second channel. The second channel can be a backup of the first channel, so that the first device can still be managed through the second channel when the first channel is unavailable; this can, to a certain extent, avoid the high cost of on-site operation and maintenance.
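The failover between the first and second channels can be sketched as a priority list of health-checked channels; the channel names and the health-check callables below are illustrative assumptions, not details from the patent:

```python
def first_available_channel(channels) -> str:
    """Return the name of the first healthy channel, in priority order.

    `channels` is an ordered list of (name, is_up) pairs, where `is_up`
    is a zero-argument callable reporting channel health.
    """
    for name, is_up in channels:
        if is_up():
            return name
    raise ConnectionError("no management channel available")
```

With the first channel listed ahead of the second, the backup is chosen only when the first channel's health check fails.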
Alternatively, the first channel may be a VPN (Virtual Private Network) channel. The principle of a VPN is to encapsulate a data communication tunnel over the public network using encryption technology; VPNs are widely used in enterprise networks. A VPN gateway realizes remote access through encryption of data packets and translation of packet destination addresses. Compared with a traditional wide area network, a VPN can reduce operational costs and the connection costs of remote users. In addition, the fixed communication cost of a VPN helps enterprises better understand their own operating expenses.
According to the embodiment of the application, a VPN network is established over the public network to realize interconnection between the second device and the first device, and operation and maintenance of the monitoring application on the first device are carried out through the private encrypted channel corresponding to the VPN. Operation and maintenance of the monitoring application may include: updating of the monitoring application, data monitoring, hardware resource information monitoring, and the like.
Alternatively, the update of the monitoring application may be a hot update. A hot update may refer to dynamically issuing code, which enables developers to fix bugs and release features without releasing a new version of the application.
The data involved in data monitoring may include: the video corresponding to the monitoring area, the pictures corresponding to the monitoring area, and the like.
The hardware resources may include: computing resources and storage resources. The computing resources may include: a CPU. Storage resources typically include internal memory (RAM, random access memory) and external memory (also referred to as secondary storage). External memory may be managed by standard I/O (Input/Output), while internal memory may be accessed by the CPU directly over the system bus.
Optionally, the hardware resource information may specifically include at least one of the following information: CPU usage information, memory usage information, disk information, and network bandwidth information.
When the hardware resource usage of the first device reaches a preset threshold, a hardware resource fault is easily caused. For example, excessive CPU usage may cause faults such as an overly slow network or overly slow disk reads and writes; excessive memory usage may cause memory overflow; excessive disk usage may cause disk read/write errors; and excessive network bandwidth usage may cause network congestion.
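The threshold check described above can be sketched as a pure function over utilization ratios. The resource names and the 0-to-1 utilization scale are illustrative assumptions, not specified by the patent (in practice the ratios might be gathered with a library such as psutil):

```python
def resources_over_threshold(usage: dict, thresholds: dict) -> list[str]:
    """Return the names of resources at or above their preset threshold.

    Both dicts map a resource name to a utilization ratio in [0, 1];
    resources without a configured threshold are never reported.
    """
    return [name for name, used in usage.items()
            if used >= thresholds.get(name, 1.0)]
```

The second device could run such a check over the reported hardware resource information and raise an alert before a resource fault actually occurs.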
Alternatively, the second channel may be a proxy channel. The proxy channel may be a backup of the VPN channel and may replace the first channel when the VPN channel is unavailable.
In an optional embodiment of the present application, the first device may implement a function of a monitoring application using a container technology, where the monitoring application is configured to perform the method of the embodiment of the present application, an image of the monitoring application is stored in a private repository of the container, and a key of the private repository is provided by the second device.
Container technology is a virtualization technology that runs applications in a sandbox. Its functions and operating mechanism are similar to those of a virtual machine, but it is lighter-weight and uses resources more efficiently. By deploying containers in an operating system, container technology packages an application program and its dependencies into a portable container image, and realizes the functions provided by the application program through the isolated environment provided by the container image.
The containers corresponding to different application programs are isolated from each other: each container has its own file system, processes in different containers do not affect each other, and computing resources can be partitioned. Containers can be deployed quickly, and because they are decoupled from the underlying infrastructure and the host file system, they can be migrated between different clouds and different operating system versions. A container occupies few resources and deploys quickly; an application program can be packaged into a container image, and the one-to-one relationship between application program and container gives containers a great advantage.
According to the embodiment of the application, the mirror image of the monitoring application program is stored in the private warehouse of the container, and the key of the private warehouse is provided by the second device, so that the safety of the monitoring application program can be improved.
Referring to fig. 2, a schematic architecture diagram of a first device according to an embodiment of the present application is shown, which may specifically include: an application layer 201, a container layer 202, a network layer 203, and an operating system layer 204.
The application layer 201 may include: an application, such as a monitoring application.
The container layer 202 may use resource orchestration technology and container technology to improve the availability of the first device and the application program, so that the application program and the first device can recover by themselves after a failure.
The container technology may include: Docker, Pouch, and the like. The resource orchestration technology can realize container management, supports cluster management of containers across host devices, and enables different host devices to back each other up. A host device may refer to the device in which a container is located.
The network layer 203 is configured to establish a data transmission channel and a management channel between the first device and the second device. The management channel may include: a first channel and a second channel backed up from each other.
The operating system layer 204 may be used to run an operating system. The operating system may include: Linux, Windows, and the like. The Linux operating system may include: CentOS (Community Enterprise Operating System), and the like.
To sum up, in the data processing method according to the embodiment of the present application, the pictures corresponding to the monitoring area are obtained through the first device locally disposed in the monitoring area and sent to the second device, so that the second device stores the pictures corresponding to the monitoring area in a centralized manner. Because remote devices such as the second device can realize remote monitoring of the monitoring area by viewing the pictures corresponding to the monitoring area, and the uploading cost of pictures can be less than that of video, the remote monitoring requirement can be met while reducing the uploading cost. The above remote devices may further include: a third device, a fourth device, and the like.
Method example II
Referring to fig. 3, a flowchart illustrating steps of a second embodiment of the data processing method of the present application is shown, which may specifically include the following steps:
301, acquiring a video corresponding to a monitoring area from a video source;
step 302, capturing a video corresponding to the monitoring area to obtain a picture corresponding to the monitoring area;
and step 303, uploading the picture corresponding to the monitoring area to the second device.
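The screenshot flow of steps 301-303 can be sketched as a frame-sampling loop. Everything here is hypothetical for illustration: a real first device would decode frames from an RTSP/RTMP video source and upload JPEG pictures to the second device, whereas this sketch uses integer frame indices and a stub uploader.

```python
# Hypothetical sketch of steps 301-303: pull "frames" from a video source,
# keep one frame per screenshot interval, and hand it to an uploader.
# The video source and uploader are stubs; a real first device would decode
# a video stream and POST JPEG pictures to the second device.

def select_screenshot_frames(total_frames, fps, screenshots_per_second):
    """Return the frame indices to capture for the given screenshot frequency."""
    if screenshots_per_second <= 0 or screenshots_per_second > fps:
        raise ValueError("screenshot frequency must be in (0, fps]")
    step = fps / screenshots_per_second  # frames between captures
    indices, next_capture = [], 0.0
    for i in range(total_frames):
        if i >= next_capture:
            indices.append(i)
            next_capture += step
    return indices

def run_screenshot_task(frames, fps, screenshots_per_second, upload):
    """Steps 302/303: screenshot the video and upload each captured picture."""
    for i in select_screenshot_frames(len(frames), fps, screenshots_per_second):
        upload(frames[i])

uploaded = []
run_screenshot_task(frames=list(range(100)), fps=25,
                    screenshots_per_second=1, upload=uploaded.append)
# 100 frames at 25 fps is 4 seconds of video, so 4 screenshots are uploaded
```

The screenshot frequency parameter directly controls how many pictures are produced per second of video, which is why (as noted below) it dominates the hardware resources a screenshot task occupies.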
Compared with the first method embodiment shown in fig. 1, the method of the embodiment of the present application may further include:
and step 304, acquiring a real-time video stream corresponding to the monitoring area from a video source according to the received live broadcast instruction, and uploading the real-time video stream to the second device.
The embodiment of the application can support live broadcast of the video stream corresponding to the monitoring area. Specifically, the first device may receive a live broadcast instruction sent by the second device and send the real-time video stream corresponding to the monitoring area to the second device according to the live broadcast instruction, so that the second device can distribute the real-time video stream and a user within a preset area range can watch the live real-time video stream corresponding to the monitoring area. For example, the second device may distribute the real-time video stream corresponding to the monitoring area through a CDN (Content Delivery Network).
In this embodiment, the load task of the first device may include: and (4) screenshot tasks, or screenshot tasks and live tasks.
The screenshot task is used for acquiring the video corresponding to the monitoring area from a video source, taking screenshots of the video to obtain pictures corresponding to the monitoring area, and uploading the pictures to the second device.
The live broadcast task is used for acquiring the real-time video stream corresponding to the monitoring area from the video source according to a received live broadcast instruction, and uploading the real-time video stream to the second device.
Optionally, the parameters of the screenshot task may include: the screenshot frequency, and/or the number of video source paths. Generally, the higher the screenshot frequency, the more hardware resources are occupied. Similarly, the more video source paths there are, the more hardware resources are occupied.
Optionally, the information of the load task may include: identification of the monitored area, etc. For the live broadcast task, live broadcast can be performed on the monitoring area corresponding to the identifier. For the screenshot task, screenshot can be performed on the monitoring area corresponding to the identifier.
In this embodiment of the present application, optionally, the hardware resource information of the first device is matched against the information of the load tasks corresponding to the first device, which can reduce peak congestion of the network bandwidth and overheating of the first device, and thereby further reduce faults of the first device.
In this embodiment, the communication between the first device and the external device may consume a network bandwidth, and the external device may include: a video source or a second device, etc.
The embodiment of the application can meet the large-scale live broadcast requirement. Specifically, the second device may schedule the plurality of first devices according to the hardware resource information of the first devices and/or the load task information corresponding to the first devices, so as to allocate the load task to the best-adapted first device. In practical application, a plurality of first devices may be disposed in one monitoring area, and the plurality of first devices may support a large-scale load task and may improve the balance of the large-scale load task.
Optionally, in the embodiment of the application, the candidate first device matched with the identifier may be obtained according to the identifier of the monitoring area carried in the load task, and the target first device corresponding to the load task may be determined from the candidate first devices according to the hardware resource information of the candidate first device.
According to an embodiment, the first device with the most hardware resources may be determined from the candidate first devices as the target first device. For example, if first device A and first device B are both matched with the identifier of the monitored area, but the hardware resource information of first device A is greater than that of first device B, first device A may be selected as the target first device.
According to another embodiment, if the hardware resource information of a first device reaches a preset threshold and the load task of the first device includes a screenshot task, the parameters of the screenshot task may be updated so that the hardware resource information of the first device does not exceed the preset threshold. For example, the screenshot frequency of the screenshot task, and/or the number of video source paths, may be reduced.
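The parameter update above can be sketched as a throttling loop. The resource-cost model below is invented for illustration; a real first device would read actual CPU, memory, or bandwidth figures rather than estimate them:

```python
# Hypothetical sketch of the screenshot-task parameter update: while the
# device's estimated resource usage exceeds a preset threshold, halve the
# screenshot frequency first, then drop video-source paths.

def estimate_usage(freq, paths, cost_per_capture=2.0):
    # Simplified model: usage grows with frequency and number of source paths.
    return freq * paths * cost_per_capture

def throttle_task(freq, paths, threshold, min_freq=1, min_paths=1):
    """Reduce screenshot frequency first, then source paths, until usage fits."""
    while estimate_usage(freq, paths) > threshold:
        if freq > min_freq:
            freq = max(min_freq, freq // 2)
        elif paths > min_paths:
            paths -= 1
        else:
            break  # cannot reduce further
    return freq, paths

freq, paths = throttle_task(freq=8, paths=4, threshold=20)
```

Reducing frequency before dropping paths preserves coverage of all monitored video sources at a coarser time granularity, which matches the example in the text (reduce screenshot frequency and/or the number of paths).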
Referring to fig. 4, a schematic diagram of an architecture of a monitoring application in a first device according to an embodiment of the present application is shown, which may specifically include: an atomic function layer 401, a task orchestration layer 402, a task execution layer 403, and a security layer 404.
Among other things, the atomic functional layer 401 may be used to implement atomic functions, which may refer to non-separable functions.
The atomic functions of the embodiments of the present application may include at least one of the following functions:
the resolution updating function is used for updating the resolution of the picture corresponding to the monitoring area;
a screenshot frequency updating function for updating the screenshot frequency of the screenshot task;
a hardware reporting function, configured to report hardware resource information;
the video pulling function is used for pulling the video corresponding to the monitoring area from the video source;
a decoding function for decoding the pulled video; alternatively, the decoding function may be implemented using a digital video compression format.
The screenshot function is used for capturing the picture corresponding to the monitoring area from the decoded video;
The picture uploading function is used for uploading pictures corresponding to the monitoring area to the second equipment;
a video stream pushing function for pushing the real-time video stream to the second device; optionally, a Real-Time communication Protocol, such as RTMP (Real Time Messaging Protocol), may be used to push the Real-Time video stream to the second device.
The task pre-detection function is used for detecting load tasks in advance.
And the task orchestration layer 402 is used for orchestrating the load tasks.
Orchestration of the task orchestration layer 402 may include: task Pipeline (Pipeline) creation, task off-peak, task increment, parallel task Pipeline scheduling, and the like.
A Pipeline is like a production line in a factory: the pipeline is responsible for the positions at which workers (Valves) are deployed, and each Valve is responsible for a different operation on the production line. Completing one run of the production line requires two steps: 1. transporting the raw material to the workers; 2. each worker completing the part it is responsible for.
Task off-peak scheduling is used for making the first device avoid task peak periods by updating the parameters of the load task, that is, for keeping the hardware resource information of the first device from exceeding a preset threshold.
The task execution layer 403 may be used to execute the load task.
The functions of the task execution layer 403 may include: heartbeat detection, to improve the effectiveness of the connection between the first device and the second device. The task execution layer 403 may include a thread pool. A thread pool is a form of multi-threaded processing in which tasks are added to a queue and are then executed automatically by the pool's worker threads as they become available.
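The thread-pool idea in the task execution layer can be illustrated with Python's standard library; the load tasks here are stand-ins for the screenshot and live tasks described above:

```python
# Minimal illustration of a thread pool executing load tasks: tasks are
# queued and picked up automatically by a fixed set of worker threads.
from concurrent.futures import ThreadPoolExecutor

def load_task(region_id):
    # Stand-in for a screenshot or live task for one monitoring area.
    return f"done:{region_id}"

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(load_task, ["area-1", "area-2", "area-3"]))
```

`pool.map` preserves input order in its results even though the tasks may complete on different worker threads at different times.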
The security layer 404 is used to improve the security of the connection between the first device and the second device.
The functions of the security layer 404 may include: connection authentication and channel encryption.
The connection authentication may be used for performing connection authentication in a process of establishing a connection between the first device and the second device. Specifically, an authentication request may be sent to the second device, and after the authentication request passes, the connection between the first device and the second device may be established.
Alternatively, connection authentication may be performed based on an asymmetric key. Optionally, the authentication request may include: the identifier corresponding to the monitoring area. Optionally, the number of SSH (Secure Shell) connection errors between the first device and the second device may be limited to a preset number, which can reduce illegal connections to some extent. Optionally, in a case where the number of SSH connection errors exceeds the preset number, the account corresponding to the first device may be locked.
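The lockout rule above can be sketched as a simple failed-attempt counter. The credential check is a stub: a real deployment would verify an SSH key or asymmetric-key signature rather than a boolean flag.

```python
# Hedged sketch of the connection-error lockout: after a preset number of
# failed attempts, the account for the first device is locked and further
# attempts are refused.

class ConnectionGuard:
    def __init__(self, max_errors=3):
        self.max_errors = max_errors
        self.errors = {}       # device account -> failed attempt count
        self.locked = set()

    def attempt(self, account, credential_ok):
        if account in self.locked:
            return "locked"
        if credential_ok:
            self.errors[account] = 0   # success resets the counter
            return "connected"
        self.errors[account] = self.errors.get(account, 0) + 1
        if self.errors[account] >= self.max_errors:
            self.locked.add(account)
            return "locked"
        return "rejected"

guard = ConnectionGuard(max_errors=3)
results = [guard.attempt("device-1", False) for _ in range(3)]
# three failures in a row lock the account; even a valid credential is
# refused afterwards until the account is unlocked out of band
```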
Alternatively, the management channel between the first device and the second device may be an encrypted channel. The encrypted channel may be based on the Cipher Block Chaining (CBC) mode.
In this embodiment of the application, optionally, the monitoring application instance establishes a connection with the second device after being authenticated by the asymmetric key system, and performs command interaction and data interaction with the second device based on the session.
To sum up, in the data processing method according to the embodiment of the present application, the first device may receive a live broadcast instruction sent by the second device and send the real-time video stream corresponding to the monitoring area to the second device according to the live broadcast instruction, so that the second device can distribute the real-time video stream and a user within a preset area range can watch the live real-time video stream corresponding to the monitoring area.
Method example III
Referring to fig. 5, a flowchart illustrating steps of a third embodiment of the data processing method in the present application is shown, and is applied to a second device, where the method specifically includes the following steps:
step 501, receiving a picture corresponding to a monitoring area from first equipment; the picture corresponding to the monitoring area can be obtained according to the video screenshot corresponding to the monitoring area;
and step 502, storing the picture corresponding to the monitoring area.
The second device may be a cloud-based device. The second device may receive, from the first device, the pictures corresponding to the monitoring area based on the management channel with the first device, and store the pictures corresponding to the monitoring area in a centralized manner.
Optionally, an Object Storage Service (OSS) technology may be used to store the pictures corresponding to the monitored area in a centralized manner, and it may be understood that the embodiment of the present application does not limit a specific Storage technology.
In the embodiment of the application, a user can check the picture corresponding to the monitoring area through the first preset interface so as to obtain the situation in the remote monitoring area.
Optionally, in order to improve the Access security of the picture corresponding to the monitoring area, an RBAC (Role-Based Access Control) technology may be adopted to Control the picture accessed by the user.
In summary, according to the data processing method in the embodiment of the present application, the second device in the cloud stores the pictures corresponding to the monitoring area in a centralized manner, so that a remote device such as the second device can realize remote monitoring of the monitoring area by viewing the pictures; and because the upload cost of pictures can be lower than the upload cost of video, the remote monitoring requirement can be met while reducing the upload cost.
Method example four
Referring to fig. 6, a flowchart illustrating a fourth step of the data processing method according to the embodiment of the present application is shown, and is applied to a second device, where the method specifically includes the following steps:
step 601, receiving a picture corresponding to a monitoring area from first equipment; the picture corresponding to the monitoring area can be obtained according to the video screenshot corresponding to the monitoring area;
and step 602, storing the picture corresponding to the monitoring area.
Compared with the third method embodiment shown in fig. 5, the method of this embodiment may further include:
step 603, determining service information corresponding to the monitoring area according to the picture corresponding to the monitoring area;
and step 604, sending the service information corresponding to the monitoring area to a third device.
The picture corresponding to the monitoring area is analyzed to obtain the service information corresponding to the monitoring area, and this service information, as the analysis result corresponding to the picture, can enable the user corresponding to the third device to learn the situation in the remote monitoring area. In the embodiment of the application, the user can view the service information corresponding to the monitoring area through the second preset interface to learn the situation in the remote monitoring area.
A service refers to a transaction that a service provider proposes to meet the needs of a user. Those skilled in the art can determine the service according to the actual application requirements. For example, in the field of logistics monitoring, the monitored objects may include: vehicles or goods, the services may include: vehicle handling, or cargo accumulation. As another example, in an office monitoring area, the monitoring objects may include: personas, services may include: person attendance, etc.
In an optional embodiment of the present application, the step 603 of determining the service information corresponding to the monitoring area may specifically include: determining monitoring object information corresponding to the picture; and if the monitoring object information corresponding to the picture accords with the service condition, outputting service prompt information corresponding to the service condition.
Optionally, the monitoring object may include: at least one of goods, people, and vehicles. Of course, a person skilled in the art may determine a specific monitored object according to an actual application requirement, and the specific monitored object is not limited in the embodiment of the present application.
In the embodiment of the present application, the monitored-object information corresponding to the picture is determined first, and then, in a case where the monitored-object information corresponding to the picture meets a service condition, the service prompt information corresponding to that service condition is output, so as to provide service-level prompt information to the user.
The service conditions and service prompt information can be determined by those skilled in the art according to the actual application requirements. Optionally, the service condition may be used to constrain a threshold corresponding to the monitored object information. For example, the traffic conditions may include: the cargo accumulation condition reaches a first preset value, or the vehicle loading and unloading condition reaches a second preset value, or the staff attendance condition reaches a third preset value, and the like.
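The service-condition check can be sketched as a threshold table. The field names, thresholds, and prompt strings below are invented for illustration; in practice they would be configured per monitoring area according to the actual service requirements:

```python
# Hypothetical sketch of step 603's rule: compare the monitored-object
# information against service conditions and emit the matching prompts.

SERVICE_CONDITIONS = [
    # (field, preset value, service prompt)
    ("cargo_accumulation", 0.8, "cargo accumulation reaches the first preset value"),
    ("vehicle_loading",    0.9, "vehicle loading/unloading reaches the second preset value"),
    ("staff_attendance",   0.5, "staff attendance reaches the third preset value"),
]

def service_prompts(object_info):
    """Return the prompt for every service condition the picture's info meets."""
    prompts = []
    for field, threshold, prompt in SERVICE_CONDITIONS:
        if object_info.get(field, 0) >= threshold:
            prompts.append(prompt)
    return prompts

prompts = service_prompts({"cargo_accumulation": 0.85, "vehicle_loading": 0.4})
```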
Of course, the above-mentioned output service prompt information is only an optional embodiment, and actually, the monitoring object information may be directly output.
In this embodiment of the application, optionally, the determining the monitoring object information corresponding to the picture may specifically include: determining monitoring object information corresponding to the picture according to a mapping relation between the picture and the monitoring object information; the mapping relationship can be obtained according to the image sample and the labeled monitoring object information corresponding to the image sample.
According to one embodiment, the mapping relationship between the picture and the monitored object information may be characterized by a mapping table. In the mapping relationship, the image can be characterized by the picture characteristics. In the mapping relationship, the monitoring object information may include: information of one monitored object or information of a plurality of monitored objects. In this case, the mapping table may be searched according to the picture corresponding to the monitoring area, so as to obtain the monitoring object information corresponding to the picture.
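The mapping-table embodiment can be sketched as a feature extractor plus a lookup. The feature here (a coarse brightness bucket) and the table entries are hypothetical stand-ins for the color/texture/shape features and labelled samples described in the text:

```python
# Minimal sketch of the mapping-table embodiment: a picture is reduced to a
# feature key, and the table maps feature keys to monitored-object info
# obtained from labelled picture samples.

def extract_feature(picture):
    # Stand-in for real picture features: a coarse bucket of mean pixel value.
    mean = sum(picture) / len(picture)
    return "dark" if mean < 128 else "bright"

MAPPING_TABLE = {
    # feature key -> monitored-object info (from labelled picture samples)
    "dark":   {"vehicles": 0, "persons": 1},
    "bright": {"vehicles": 2, "persons": 5},
}

def lookup_object_info(picture):
    """Search the mapping table by the picture's feature key."""
    return MAPPING_TABLE.get(extract_feature(picture))

info = lookup_object_info([200, 210, 190, 250])  # a bright picture
```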
According to an embodiment, monitoring the object information may include: the number of the monitoring objects in the picture, the density of the monitoring objects in the picture in the monitoring area corresponding to the picture and the like.
According to another embodiment, monitoring the object information may include: the relationship information between the first monitoring object and the second monitoring object in the picture, such as whether people exist on an office station or whether goods exist on a vehicle, and the like.
The picture features may include: at least one of a color feature, a texture feature, a shape feature, and a spatial relationship feature.
The color feature is a global feature that describes the surface properties of the logistics object corresponding to the picture or a picture region; the texture feature is also a global feature that describes the surface properties of the logistics object corresponding to the picture or a picture region; the shape features come in two types, contour features and region features, where the contour features of a picture mainly concern the outer boundary of the logistics object and the region features relate to the whole shape region; the spatial relationship feature refers to the spatial positions or relative directional relationships among multiple targets segmented from the picture, and these relationships can be divided into connection/adjacency relationships, overlap/occlusion relationships, inclusion/containment relationships, and the like. Optionally, the picture features may also be determined using a convolutional neural network. The embodiment of the present application does not limit the specific picture features.
According to another embodiment, the mapping relationship between the picture and the monitored-object information may be characterized by a data analyzer. Correspondingly, the method may further include: training on the training data to obtain a data analyzer; the data analyzer can be used for representing the mapping relationship between the picture and the monitored-object information; the training data may include: picture samples and the monitored-object information corresponding to the picture samples, where the monitored-object information corresponding to a picture sample may be determined by annotation.
In an alternative embodiment of the present application, a mathematical model may be trained on the training data to obtain a data analyzer, and the data analyzer may characterize the mapping relationship between the input data (pictures) and the output data (monitored-object information), so that the monitored-object information can be determined from an input picture.
A mathematical model is a scientific or engineering model constructed with mathematical logic and mathematical language: for the characteristics or quantitative dependencies of a certain object system, it is a mathematical structure expressed, generally or approximately, in mathematical language, and that structure is a relational structure described by means of mathematical symbols. The mathematical model may be one or a set of algebraic, differential, integral, or statistical equations, or a combination thereof, by which the interrelationships or causal relationships between system variables are described quantitatively or qualitatively. Besides models described by equations, there are also models described by other mathematical tools, such as algebra, geometry, topology, and mathematical logic. A mathematical model describes the behavior and characteristics of a system rather than its actual structure. The mathematical model may be trained using methods such as machine learning and deep learning; machine learning methods may include: linear regression, decision trees, random forests, etc., and deep learning methods may include: Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU), and so on.
In this embodiment of the application, optionally, the picture may correspond to a first feature, and the data analyzer may correspond to a second feature, and then the determining of the monitored object information corresponding to the picture may specifically include: determining a target data analyzer corresponding to the picture according to the first characteristic corresponding to the picture and the second characteristic corresponding to the data analyzer; and determining the monitored object information corresponding to the picture by using the target data analyzer.
The embodiment of the application can determine the target data analyzer from among a plurality of data analyzers based on matching between the first feature and the second feature. The second feature may include: the characteristics of the pictures that a data analyzer is good at processing. The first feature may include: the category of the monitored object, such as cargo, people, or vehicles.
Optionally, one data analyzer may correspond to a category of one monitored object, and then, the results output by the multiple data analyzers may be fused to obtain the monitored object information corresponding to the picture. For example, a first data analyzer corresponds to a vehicle, a second data analyzer corresponds to a person, a third data analyzer corresponds to a cargo, and so on.
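The per-category fusion described above can be sketched as follows. The analyzers here count tokens in a list that stands in for a picture; a real system would feed image data to trained models, one per monitored-object category:

```python
# Hypothetical sketch of fusing per-category data analyzers: each analyzer
# handles one monitored-object category (vehicle, person, cargo), and their
# outputs are merged into one result per picture.

def vehicle_analyzer(picture):
    return {"vehicles": picture.count("car")}

def person_analyzer(picture):
    return {"persons": picture.count("person")}

def cargo_analyzer(picture):
    return {"cargo": picture.count("box")}

ANALYZERS = [vehicle_analyzer, person_analyzer, cargo_analyzer]

def analyze_picture(picture):
    """Run every category analyzer and fuse their outputs into one dict."""
    fused = {}
    for analyzer in ANALYZERS:
        fused.update(analyzer(picture))
    return fused

# Pictures stand in as token lists for illustration only.
info = analyze_picture(["car", "person", "person", "box"])
```

Because the analyzers write disjoint keys, fusion is a simple dictionary merge; overlapping outputs would need an explicit reconciliation rule.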
According to the embodiment of the application, the data analyzer can be used for processing the plurality of pictures matched with the data analyzer in batch, so that the processing efficiency of the pictures can be improved.
In an alternative embodiment of the present application, it is assumed that the service requirement is the vehicle staying time. Determining the monitored-object information corresponding to the picture may then specifically include: determining a start time in a case where a vehicle is present in the picture; determining a departure time in a case where the vehicle leaves in the picture; and determining the vehicle staying time according to the start time and the departure time.
In another alternative embodiment of the present application, it is assumed that the service requirement is loading and unloading efficiency information. Determining the monitored-object information corresponding to the picture may then specifically include: determining a loading/unloading start time in a case where both a vehicle and a person are present in the picture; determining a loading/unloading end time in a case where the vehicle leaves in the picture; and determining the loading and unloading efficiency information according to the loading/unloading start time and end time.
In practical application, whether the vehicle exists in the picture can be judged by utilizing a first data analyzer corresponding to the vehicle; the second data analyzer corresponding to the person can be used to determine whether the person exists in the picture.
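The two computations above can be sketched over a timestamped sequence of per-picture detections. The detector outputs are stubbed booleans; in practice they would come from the vehicle and person data analyzers mentioned in the text:

```python
# Sketch of the vehicle-staying-time and loading/unloading-duration rules,
# assuming each picture carries a timestamp plus detector flags.

def vehicle_staying_time(frames):
    """frames: list of (timestamp, vehicle_present). Returns staying seconds."""
    start = end = None
    for ts, vehicle in frames:
        if vehicle and start is None:
            start = ts                 # vehicle first appears
        if vehicle:
            end = ts                   # last time the vehicle was seen
    return (end - start) if start is not None else 0

def handling_duration(frames):
    """frames: list of (timestamp, vehicle_present, person_present).
    Loading starts when both are seen; it ends when the vehicle leaves."""
    start = end = None
    for ts, vehicle, person in frames:
        if vehicle and person and start is None:
            start = ts
        if start is not None and not vehicle and end is None:
            end = ts
    return (end - start) if start is not None and end is not None else 0

stay = vehicle_staying_time([(0, False), (10, True), (20, True), (30, False)])
work = handling_duration([(0, True, False), (10, True, True), (40, False, False)])
```

Note that the time resolution of these results is bounded by the screenshot frequency: sampling one picture per minute can only resolve staying times to the minute.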
It is understood that the above information on the vehicle staying time and the loading and unloading efficiency is only used as an alternative example of the service requirement, and in fact, a person skilled in the art can determine the required service requirement according to the actual application requirement, and the embodiment of the present application does not impose a limitation on the specific service requirement.
To sum up, the data processing method according to the embodiment of the present application analyzes the picture corresponding to the monitored area to obtain the service information corresponding to the monitored area, and the service information can enable the user corresponding to the third device to obtain the situation in the remote monitored area as the analysis result corresponding to the picture.
Method example five
Referring to fig. 7, a flowchart of a fifth step of an embodiment of the data processing method in the present application is shown, and is applied to a second device, where the method specifically includes the following steps:
step 701, receiving a picture corresponding to a monitoring area from first equipment; the picture corresponding to the monitoring area can be obtained according to the video screenshot corresponding to the monitoring area;
and step 702, storing the picture corresponding to the monitoring area.
Compared with the third method embodiment shown in fig. 5, the method of this embodiment may further include:
step 703, issuing a live broadcast instruction to the first device;
step 704, receiving a real-time video stream corresponding to the monitoring area uploaded by the first device;
step 705, distributing the link of the real-time video stream corresponding to the monitoring area to a third device.
The embodiment of the application can support live broadcast of the video stream corresponding to the monitoring area. Specifically, the second device may issue a live broadcast instruction to the first device, receive the real-time video stream corresponding to the monitoring area from the first device, and distribute a link of the real-time video stream to the third device, so that the user corresponding to the third device can watch the live real-time video stream corresponding to the monitoring area within a preset area range. For example, the second device may distribute the real-time video stream corresponding to the monitoring area through the CDN.
In the embodiment of the application, a user can trigger a live broadcast request through a third preset interface; and the second device may issue a live command to the first device in response to the live request. The live broadcast request or the live broadcast instruction may carry an identifier of the monitoring area.
In an optional embodiment of the application, issuing the live broadcast instruction to the first device may specifically include: determining a target first device from a plurality of first devices corresponding to the live broadcast instruction according to the feature information corresponding to the first devices, where the feature information may include: hardware resource information and/or load task information; and issuing the live broadcast instruction to the target first device.
In this embodiment of the application, one live broadcast instruction may correspond to a plurality of first devices, and in this embodiment of the application, a plurality of first devices corresponding to the live broadcast instruction may be scheduled according to feature information corresponding to the first devices, so as to improve a balance degree between loads of the plurality of first devices. In practical application, a plurality of first devices may be disposed in one monitoring area, and the plurality of first devices may support a large-scale load task and may improve the balance of the large-scale load task.
According to an embodiment, the first device with the most hardware resources may be determined from the plurality of first devices corresponding to the live broadcast instruction as the target first device. For example, if first devices A and B are both matched with the identifier of the monitoring area corresponding to the live broadcast instruction, but the hardware resource information of first device A is greater than that of first device B, first device A may be selected as the target first device.
According to another embodiment, the first device with the lightest load tasks may be determined from the plurality of first devices corresponding to the live broadcast instruction as the target first device. For example, if first devices C and D are both matched with the identifier of the monitoring area corresponding to the live broadcast instruction, but the load task of first device C only includes a screenshot task while the load task of first device D includes a live broadcast task, first device C may be selected as the target first device.
According to another embodiment, if the hardware resource information of a first device reaches a preset threshold and the load task of the first device includes a screenshot task, the parameters of the screenshot task may be updated so that the hardware resource information of the first device does not exceed the preset threshold. For example, the screenshot frequency of the screenshot task, and/or the number of video source paths, may be reduced.
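The target-device selection across the embodiments above can be sketched as a filter-then-rank step. The device records and field names are hypothetical; a real scheduler would read live hardware metrics and task registries:

```python
# Hypothetical sketch of target first-device selection: among the first
# devices matching the live instruction's monitoring-area identifier,
# prefer the one with the most free hardware resources, breaking ties by
# the lightest load-task list.

def pick_target_device(devices, area_id):
    """devices: list of dicts with 'area', 'free_resources', 'tasks' keys."""
    candidates = [d for d in devices if d["area"] == area_id]
    if not candidates:
        return None
    return max(candidates,
               key=lambda d: (d["free_resources"], -len(d["tasks"])))

devices = [
    {"name": "A", "area": "dock-1", "free_resources": 70, "tasks": ["shot"]},
    {"name": "B", "area": "dock-1", "free_resources": 40, "tasks": ["shot", "live"]},
    {"name": "C", "area": "dock-2", "free_resources": 90, "tasks": []},
]
target = pick_target_device(devices, "dock-1")
```

Filtering by the monitoring-area identifier first keeps the selection correct; ranking then balances load across the matching devices.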
In this embodiment of the application, a link of the real-time video stream corresponding to the monitoring area may be distributed to the third device, and the link may be used to point to the real-time video stream.
Alternatively, in order to improve the access security of the real-time video stream, the link of the real-time video stream may have a deadline, and the length of the deadline of the link may be determined by those skilled in the art according to the actual application requirements, for example, the length of the deadline of the link may be 10 minutes, and the like. The linking of the real-time video stream may fail after the link deadline is exceeded.
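A time-limited link can be sketched with an expiry timestamp plus a signature over the path and expiry, so a tampered or expired link fails verification. This is only an illustrative scheme built on HMAC from Python's standard library; real CDNs have their own signed-URL mechanisms, and the key and path here are hypothetical:

```python
# Hedged sketch of a link with a deadline: the URL carries an expiry
# timestamp and an HMAC signature over path+expiry, so the link fails after
# the deadline and cannot be extended by editing the timestamp.
import hashlib
import hmac

SECRET = b"hypothetical-signing-key"

def sign_link(path, expires_at):
    msg = f"{path}:{expires_at}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{path}?expires={expires_at}&sig={sig}"

def verify_link(link, now):
    path, query = link.split("?", 1)
    params = dict(kv.split("=") for kv in query.split("&"))
    expires_at = int(params["expires"])
    msg = f"{path}:{expires_at}".encode()
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, params["sig"]) and now <= expires_at

link = sign_link("/live/dock-1", expires_at=1000)
valid_now = verify_link(link, now=900)    # within the deadline
valid_late = verify_link(link, now=1601)  # past the deadline: link fails
```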
To sum up, in the data processing method according to the embodiment of the present application, the second device may issue a live broadcast instruction to the first device, receive the real-time video stream corresponding to the monitoring area from the first device, and distribute a link of the real-time video stream to the third device, so that the user corresponding to the third device can watch the live real-time video stream corresponding to the monitoring area within a preset area range. For example, the second device may distribute the real-time video stream corresponding to the monitoring area through the CDN.
Referring to fig. 8, an architecture diagram of a second device in the embodiment of the present application is shown, which may specifically include: a connection layer 801, an authentication session layer 802, a task scheduling layer 803, a task processing layer 804, and a management layer 805.
Wherein the connection layer 801 may establish a connection with the first device. The connection may be a TCP (Transmission Control Protocol) connection. Optionally, the connection may use: BIO (synchronous blocking input/output), NIO (non-blocking input/output), and the like.
Optionally, the above connection may be used for serialized transmission of data or commands. Serialization refers to the process of converting a data structure or object into a binary string; deserialization refers to the inverse process of converting such a binary string back into a data structure or object. In the embodiments of the present application, the data structure or object may correspond to a picture or a real-time video stream corresponding to the monitored area.
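A minimal sketch of one possible length-prefixed framing for such serialized transmission; the two big-endian 32-bit fields are illustrative, not the connection layer's actual wire format:

```python
import struct

def serialize_picture(area_id: int, jpeg_bytes: bytes) -> bytes:
    """Frame one picture as a binary string: area id, payload length, payload."""
    return struct.pack(">II", area_id, len(jpeg_bytes)) + jpeg_bytes

def deserialize_picture(frame: bytes):
    """Inverse of serialize_picture: recover the area id and raw payload."""
    area_id, length = struct.unpack(">II", frame[:8])
    payload = frame[8:8 + length]
    assert len(payload) == length, "truncated frame"
    return area_id, payload
```

The explicit length prefix lets the receiver split a TCP byte stream back into whole pictures without any delimiter scanning.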
The authentication session layer 802 may authenticate the first device and establish a session after authentication is passed to communicate data or commands based on the session. Techniques employed by authentication session layer 802 may include: asymmetric encryption, health check, meta-information comparison, session establishment and authentication. The health check may refer to checking hardware resource information corresponding to the first device. The meta-information comparison may verify an identifier of the monitoring area corresponding to the first device.
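As one hedged illustration of establishing a session after authentication passes, the sketch below uses a symmetric HMAC challenge-response as a stand-in for the asymmetric encryption mentioned above; `DEVICE_KEYS` and the session format are invented for the example:

```python
import hashlib
import hmac
import os

DEVICE_KEYS = {"dev-01": b"per-device-key"}  # hypothetical registry on the second device

def issue_challenge() -> bytes:
    """Fresh random challenge sent to the first device."""
    return os.urandom(16)

def answer_challenge(device_key: bytes, challenge: bytes) -> str:
    """First device proves key possession by MACing the challenge."""
    return hmac.new(device_key, challenge, hashlib.sha256).hexdigest()

def authenticate(device_id, challenge, response):
    """Establish a session only if the response matches the device's key."""
    key = DEVICE_KEYS.get(device_id)
    if key is None:
        return None
    expected = hmac.new(key, challenge, hashlib.sha256).hexdigest()
    if hmac.compare_digest(expected, response):
        return {"device_id": device_id, "session": os.urandom(8).hex()}
    return None
```

Subsequent data or commands would then carry the session identifier rather than re-running the handshake.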
The task scheduling layer 803 may schedule the load tasks. The processing procedure of the task scheduling layer 803 may include: hardware resource information analysis, task peak staggering, task recovery, task increment, task pre-detection, first device selection, load balancing, and the like. Task peak staggering may keep the first device away from the peak period of tasks, that is, keep the hardware resource information of the first device from exceeding the preset threshold. Task recovery may be used to reclaim unfinished load tasks. Task increment may be used to update the load tasks.
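Task peak staggering and load balancing might, under a deliberately simplified cost model, look like the following greedy sketch; the 10-points-per-task cost and the busiest-to-idlest move are assumptions made for illustration only:

```python
def stagger_tasks(devices, peak_threshold=80):
    """Move load tasks off first devices whose hardware usage is peaking.

    `devices` maps a device id to (cpu_percent, task_count); each task is
    assumed to cost 10 utilization points (a hypothetical model). Tasks are
    moved one at a time from the busiest device to the least busy one until
    no device exceeds the threshold.
    """
    devices = {d: list(v) for d, v in devices.items()}
    while True:
        busiest = max(devices, key=lambda d: devices[d][0])
        idlest = min(devices, key=lambda d: devices[d][0])
        if busiest == idlest:
            break  # single device or all equal: nothing to shift
        if devices[busiest][0] <= peak_threshold or devices[busiest][1] == 0:
            break  # no device over threshold, or no task left to move
        devices[busiest][0] -= 10
        devices[busiest][1] -= 1
        devices[idlest][0] += 10
        devices[idlest][1] += 1
    return {d: tuple(v) for d, v in devices.items()}
```

Real hardware resource information (CPU, memory, bandwidth) would replace the single integer score, but the stopping condition mirrors the "do not exceed a preset threshold" rule in the text.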
The task processing layer 804 may be configured to process the picture corresponding to the monitored area. The processing procedure of the task processing layer 804 may include: picture reception, picture storage, analysis result notification, picture recognition, and the like. The analysis result notification is used to notify the user of the analysis result corresponding to the picture.
The management layer 805 may be used to manage the first device and the third device and provide picture viewing and live services to the third device.
Optionally, the task processing layer 804 may implement picture recognition by calling a picture recognition interface. Assuming that the picture recognition interface is provided by the analysis service, the task processing layer 804 may send the picture recognition task to the analysis service in a streaming computing manner.
The analysis service may be provided with a plurality of data analyzers; one data analyzer may obtain the best-matched picture recognition tasks from the received picture recognition tasks and process them in batches so as to improve processing performance.
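One way a data analyzer could pull its best-matched recognition tasks and batch them; here "matching" is plain feature equality and `priority` is an invented field, both standing in for whatever similarity measure the analysis service actually uses:

```python
def pick_and_batch(tasks, analyzer_feature, batch_size=4):
    """One data analyzer selects its best-matched tasks and batches them.

    Each task is a dict with a hypothetical `feature` field describing the
    kind of recognition needed (e.g. goods vs. vehicles); matched tasks are
    taken highest-priority-first, up to `batch_size` per batch.
    """
    matched = [t for t in tasks if t["feature"] == analyzer_feature]
    matched.sort(key=lambda t: t.get("priority", 0), reverse=True)
    batch = matched[:batch_size]
    remaining = [t for t in tasks if t not in batch]
    return batch, remaining
```

Batching same-feature pictures lets the analyzer reuse one loaded model across the whole batch, which is where the performance gain comes from.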
Method example six
Referring to fig. 9, a flowchart illustrating steps of a sixth embodiment of the data processing method according to the present application is shown, where the method is applied to a third device, and specifically may include the following steps:
step 901, receiving a picture corresponding to the monitoring area from the second device; the picture corresponding to the monitoring area can be obtained according to the video screenshot corresponding to the monitoring area;
Step 902, processing the picture corresponding to the monitoring area.
The third device may refer to a device used by the user. And the user can perform remote monitoring corresponding to the monitoring area through the third equipment.
The third device may specifically include, but is not limited to: smart phones, tablet computers, electronic book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop portable computers, car-mounted computers, desktop computers, set-top boxes, smart televisions, wearable devices, smart speakers, and the like. It is to be understood that the embodiments of the present application are not limited to the specific devices.
Processing the picture corresponding to the monitoring area may specifically include: displaying, or playing in carousel, the pictures corresponding to the monitoring area. Carousel may refer to the cyclic playing of a plurality of pictures.
In the embodiment of the application, a user can check the picture corresponding to the monitoring area through the first preset interface so as to obtain the situation in the remote monitoring area. Optionally, the third device may send, in response to a trigger request of the user for the first preset interface, a picture viewing request to the second device, so that the second device obtains, according to the picture viewing request, a picture corresponding to the picture viewing request from pictures stored in a centralized manner. Optionally, the picture viewing request may carry an identifier of the first monitoring area, and then step 901 may receive a picture corresponding to the first monitoring area. Optionally, the picture viewing request may carry time information to obtain the first monitoring area and a picture corresponding to the time information.
Optionally, in order to improve the access security of the picture corresponding to the monitored area, the RBAC technology may be adopted to control the picture accessed by the user. Under the condition of adopting the RBAC technology, users with different roles have different authorities, so that different pictures can be viewed. For example, users in different roles can view pictures corresponding to different monitoring areas.
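A minimal RBAC check of the kind described, with a hypothetical role-to-area mapping; real deployments would load roles and permissions from a store rather than a constant:

```python
ROLE_AREAS = {  # hypothetical mapping: role -> monitoring areas it may view
    "branch_manager": {"area-sh"},
    "hq_auditor": {"area-sh", "area-bj"},
}

def can_view(role: str, area_id: str) -> bool:
    """A user may view a picture only if the role grants that monitoring area."""
    return area_id in ROLE_AREAS.get(role, set())
```

Users in different roles thus see pictures of different monitoring areas, as the paragraph above describes.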
In step 902, a plurality of pictures may be carousel according to a carousel frequency set by a user.
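The carousel at a user-set frequency can be sketched as a pure display schedule so it is testable without a real display; the `(time_offset, picture)` representation is an assumption of this example:

```python
import itertools

def carousel(pictures, frequency_hz, duration_s):
    """Cycle through pictures at a user-set frequency for a fixed duration.

    Returns (time_offset_seconds, picture) pairs instead of sleeping and
    drawing, so the playback order can be verified in isolation.
    """
    interval = 1.0 / frequency_hz
    frames = int(duration_s * frequency_hz)
    cycle = itertools.cycle(pictures)
    return [(round(i * interval, 6), next(cycle)) for i in range(frames)]
```

A display loop would consume the schedule, sleeping until each offset and showing the paired picture.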
In an application example of the application, the embodiment of the application can meet the monitoring requirement of a head office on the operation condition of a monitoring area distributed in a branch office.
Specifically, a video source and a first device may be deployed in a monitoring area of a branch company, a monitoring application program runs on the first device, and the monitoring application program converts the video stream into pictures at a configurable time interval (which may be set according to actual business needs, such as 3 seconds, 30 seconds, 1 minute, and the like) and uploads the pictures to the second device of the headquarters through the network to meet the management needs of the headquarters.
Under the condition that the head office needs to trace back the operation condition of a monitoring area of a certain branch office at a certain time T, playback can be carried out according to the previously uploaded pictures, and the carousel frequency corresponding to the playback can be set by the user. Because the network bandwidth occupied by uploading pictures is much lower than that occupied by uploading video streams, and the storage cost of pictures is much lower than that of video streams, the embodiment of the present application can meet the requirement of a user for playing back historical scenes in a certain monitoring area while reducing cost.
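The bandwidth claim can be made concrete with back-of-the-envelope arithmetic; the 200 KB picture size, 30-second interval, and 2 Mbps stream bitrate below are example figures chosen for illustration, not values from the patent:

```python
def daily_upload_bytes(picture_kb, interval_s):
    """Bytes per camera per day when uploading one picture every interval_s."""
    return int(picture_kb * 1024 * (86400 / interval_s))

def daily_stream_bytes(bitrate_kbps):
    """Bytes per camera per day for a continuously uploaded video stream."""
    return int(bitrate_kbps * 1000 / 8 * 86400)
```

Under these example figures, picture upload costs about 0.6 GB per camera per day versus roughly 21.6 GB for the continuous stream, which is the order-of-magnitude saving the paragraph relies on.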
In an optional embodiment of the present application, the method may further include: and receiving service information corresponding to the monitoring area from a second device, wherein the service information can be obtained according to the picture.
In an optional embodiment of the present application, the service information may include: service prompt information, where the service prompt information is used to prompt that a monitored object corresponding to the monitoring area meets a service condition.
In an optional embodiment of the present application, the method may further include: after the user logs in, sending a live broadcast request to second equipment; receiving a link of a real-time video stream corresponding to a monitoring area from the second device; and the real-time video stream corresponding to the monitoring area corresponds to the live broadcast request. Optionally, the live broadcast request may carry an identifier of the second monitoring area, and then a link of the real-time video stream corresponding to the second monitoring area may be received.
In this embodiment of the present application, optionally, the link may correspond to a deadline.
To sum up, in the data processing method of the embodiment of the present application, a picture corresponding to a monitoring area is received from the second device and processed. This can meet the requirement of monitoring the operation condition of a certain monitoring area at a certain time T, and the corresponding carousel frequency may be set by the user. Because the network bandwidth occupied by uploading pictures is much lower than that occupied by uploading video streams, and the storage cost of pictures is much lower than that of video streams, the embodiment of the present application can meet the requirement of a user for playing back historical scenes in a monitoring area while reducing cost.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the embodiments are not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the embodiments. Further, those of skill in the art will recognize that the embodiments described in this specification are presently preferred embodiments and that no particular act is required to implement the embodiments of the disclosure.
The embodiment of the application also provides a data processing device.
Referring to fig. 10, a block diagram of a data processing apparatus according to an embodiment of the present application is shown, where the apparatus may be applied to a first device, and specifically includes the following modules:
a video obtaining module 1001, configured to obtain a video corresponding to a monitored area from a video source;
a video capture module 1002, configured to capture a video corresponding to the monitored area to obtain a picture corresponding to the monitored area; and
a picture uploading module 1003, configured to upload a picture corresponding to the monitoring area to the second device.
Optionally, the apparatus may further include:
a module configured to acquire, according to a received live broadcast instruction, a real-time video stream corresponding to the monitoring area from the video source, and upload the real-time video stream to the second device.
Optionally, the management channel between the first device and the second device may include: a first channel and a second channel.
Optionally, the second channel may be a backup of the first channel, so that the first device can be managed through the second channel if the first channel is not available.
Optionally, the hardware resource information of the first device matches the load task information corresponding to the first device; the load task may include: a screenshot task, or a screenshot task and a live task.
Optionally, the parameters of the screenshot task may include: screenshot frequency, and/or the number of paths of the video source.
Optionally, the first device implements a function of a monitoring application corresponding to the apparatus by using a container technology, an image of the monitoring application is stored in a private repository of the container, and a key of the private repository is provided by the second device.
Referring to fig. 11, a block diagram of a data processing apparatus according to an embodiment of the present application is shown, where the apparatus may be applied to a second device, and specifically includes the following modules:
a picture receiving module 1101, configured to receive a picture corresponding to a monitoring area from a first device; the picture corresponding to the monitoring area is obtained according to the video screenshot corresponding to the monitoring area; and
a picture storage module 1102, configured to store a picture corresponding to the monitoring area.
Optionally, the apparatus may further include:
the service information determining module is used for determining service information corresponding to the monitoring area according to the picture corresponding to the monitoring area;
and the service information sending module is used for sending the service information corresponding to the monitoring area to the third equipment.
Optionally, the service information determining module may include:
the monitoring object information determining module is used for determining monitoring object information corresponding to the picture; and
and the service prompt information output module is used for outputting the service prompt information corresponding to the service condition if the monitored object information corresponding to the picture conforms to the service condition.
Optionally, the monitored object information determining module may include:
the mapping-based determining module is used for determining the monitoring object information corresponding to the picture according to the mapping relation between the picture and the monitoring object information; the mapping relation is obtained according to the picture samples and the marked monitoring object information corresponding to the picture samples.
Optionally, the monitoring object may include: at least one of goods, people, and vehicles.
Optionally, the monitored object information determining module may include:
the starting time determining module is used for determining loading and unloading starting time under the condition that a vehicle exists and a person exists in the picture;
the end time determining module is used for determining the loading and unloading end time under the condition that the vehicle leaves in the picture;
and the efficiency determining module is used for determining loading and unloading efficiency information according to the loading and unloading starting time and the loading and unloading ending time.
Optionally, the monitored object information determining module may include:
the target data analyzer determining module is used for determining a target data analyzer corresponding to the picture according to the first characteristic corresponding to the picture and the second characteristic corresponding to the data analyzer;
and the object determining module is used for determining the monitoring object information corresponding to the picture by using the target data analyzer.
Optionally, the apparatus may further include:
the live broadcasting instruction sending module is used for issuing a live broadcasting instruction to the first equipment;
the real-time video stream receiving module is used for receiving a real-time video stream corresponding to the monitoring area uploaded by the first device;
and the link distribution module is used for distributing the link of the real-time video stream corresponding to the monitoring area to third equipment.
Optionally, the live instruction sending module may include:
the target first equipment determining module is used for determining target first equipment from a plurality of first equipment corresponding to the live broadcast instruction according to the characteristic information corresponding to the first equipment; the characteristic information may include: hardware resource information and/or load task information;
and the instruction issuing module is used for issuing the live broadcasting instruction to the target first equipment.
Referring to fig. 12, a block diagram of an embodiment of a data processing apparatus according to the present application is shown, where the apparatus may be applied to a third device, and specifically may include the following modules:
a picture receiving module 1201, configured to receive a picture corresponding to the monitoring area from the second device; the picture corresponding to the monitoring area is obtained according to the video screenshot corresponding to the monitoring area; and
a picture processing module 1202, configured to process a picture corresponding to the monitoring area.
Optionally, the apparatus may further include:
and the service information receiving module is used for receiving the service information corresponding to the monitoring area from the second equipment, and the service information is obtained according to the picture.
Optionally, the service information may include: service prompt information, where the service prompt information is used to prompt that a monitored object corresponding to the monitoring area meets a service condition.
Optionally, the apparatus may further include:
the live broadcast request sending module is used for sending a live broadcast request to the second equipment after the user logs in;
the link receiving module is used for receiving a link of a real-time video stream corresponding to the monitoring area from the second equipment; and the real-time video stream corresponding to the monitoring area corresponds to the live broadcast request.
Optionally, the link may correspond to a deadline.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Embodiments of the application can be implemented as a system or apparatus employing any suitable hardware and/or software for the desired configuration. Fig. 13 schematically illustrates an example device 1300 that can be used to implement various embodiments described herein.
For one embodiment, fig. 13 illustrates an exemplary apparatus 1300, which apparatus 1300 may comprise: one or more processors 1302, a system control module (chipset) 1304 coupled to at least one of the processors 1302, system memory 1306 coupled to the system control module 1304, non-volatile memory (NVM)/storage 1308 coupled to the system control module 1304, one or more input/output devices 1310 coupled to the system control module 1304, and a network interface 1312 coupled to the system control module 1304. The system memory 1306 may include instructions 1362 executable by the one or more processors 1302.
Processor 1302 may include one or more single-core or multi-core processors, and processor 1302 may include any combination of general-purpose processors or special-purpose processors (e.g., graphics processors, application processors, baseband processors, etc.). In some embodiments, the device 1300 can be a server, a target device, a wireless device, etc., as described in embodiments herein.
In some embodiments, device 1300 may include one or more machine-readable media (e.g., system memory 1306 or NVM/storage 1308) having instructions thereon, and one or more processors 1302 configured, in conjunction with the one or more machine-readable media, to execute the instructions to implement the modules included in the aforementioned apparatus, thereby performing the actions described in embodiments of the present application.
System control module 1304 for one embodiment may include any suitable interface controller to provide any suitable interface to at least one of processors 1302 and/or any suitable device or component in communication with system control module 1304.
System control module 1304 for one embodiment may include one or more memory controllers to provide an interface to system memory 1306. The memory controller may be a hardware module, a software module, and/or a firmware module.
System memory 1306 for one embodiment may be used to load and store data and/or instructions 1362. For one embodiment, system memory 1306 may include any suitable volatile memory, such as suitable DRAM (dynamic random access memory). In some embodiments, system memory 1306 may include: double data rate type four synchronous dynamic random access memory (DDR4 SDRAM).
System control module 1304 for one embodiment may include one or more input/output controllers to provide an interface to NVM/storage 1308 and input/output device(s) 1310.
NVM/storage 1308 for one embodiment may be used to store data and/or instructions 1382. NVM/storage 1308 may include any suitable non-volatile memory (e.g., flash memory, etc.) and/or may include any suitable non-volatile storage device(s), e.g., one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives, etc.
The NVM/storage 1308 may include storage resources that are physically part of the device on which the apparatus 1300 is installed or may be accessible by the device and not necessarily part of the device. For example, the NVM/storage 1308 may be accessed over a network via the network interface 1312 and/or through the input/output devices 1310.
Input/output device(s) 1310 for one embodiment may provide an interface for device 1300 to communicate with any other suitable device, and input/output devices 1310 may include communication components, audio components, sensor components, and so forth.
Network interface 1312 of one embodiment may provide an interface for device 1300 to communicate with one or more networks and/or with any other suitable apparatus, and device 1300 may communicate wirelessly with one or more components of a wireless network according to any of one or more wireless network standards and/or protocols, such as to access a communication standard-based wireless network, such as WiFi, 2G, or 3G, or a combination thereof.
For one embodiment, at least one of the processors 1302 may be packaged together with logic for one or more controllers (e.g., memory controllers) of the system control module 1304. For one embodiment, at least one of processors 1302 may be packaged together with logic for one or more controllers of system control module 1304 to form a System in Package (SiP). For one embodiment, at least one of the processors 1302 may be integrated on the same die as the logic of one or more controllers of the system control module 1304. For one embodiment, at least one of processors 1302 may be integrated on the same chip with logic for one or more controllers of system control module 1304 to form a system on a chip (SoC).
In various embodiments, apparatus 1300 may include, but is not limited to: a computing device such as a desktop computing device or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.). In various embodiments, device 1300 may have more or fewer components and/or different architectures. For example, in some embodiments, device 1300 may include one or more cameras, a keyboard, a Liquid Crystal Display (LCD) screen (including a touch screen display), a non-volatile memory port, multiple antennas, a graphics chip, an Application Specific Integrated Circuit (ASIC), and speakers.
Wherein, if the display includes a touch panel, the display screen may be implemented as a touch screen display to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The present application also provides a non-transitory readable storage medium, where one or more modules (programs) are stored in the storage medium, and when the one or more modules are applied to an apparatus, the apparatus may be caused to execute instructions (instructions) of methods in the present application.
Provided in one example is an apparatus comprising: one or more processors; and one or more machine-readable media having instructions stored thereon, which when executed by the one or more processors, cause the apparatus to perform a method as in embodiments of the present application, and the method may include: the method of fig. 2, fig. 3, fig. 4, fig. 5, fig. 6, fig. 7, fig. 8, or fig. 9.
One or more machine-readable media are also provided in one example, having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform a method as in embodiments of the application, which may include: the method shown in fig. 2 or fig. 3 or fig. 4 or fig. 5 or fig. 6 or fig. 7 or fig. 8 or fig. 9.
The specific manner in which each module performs operations of the apparatus in the above embodiments has been described in detail in the embodiments related to the method, and will not be described in detail here, and reference may be made to part of the description of the method embodiments for relevant points.
The embodiments in the present specification are all described in a progressive manner, and each embodiment focuses on differences from other embodiments, and portions that are the same and similar between the embodiments may be referred to each other.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all changes and modifications that fall within the true scope of the embodiments of the present application.
Finally, it should also be noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing is a detailed description of the data processing method, data processing apparatus, device, and machine-readable medium provided by the present application. Specific examples are applied herein to explain the principles and embodiments of the present application, and the descriptions of the foregoing examples are only used to help understand the method of the present application and its core idea. Meanwhile, for a person skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as a limitation on the present application.

Claims (30)

1. A data processing method, applied to a first device, the method comprising:
acquiring a video corresponding to a monitoring area from a video source;
capturing a video corresponding to the monitoring area to obtain a picture corresponding to the monitoring area; in a case that hardware resource information of the first device reaches a preset threshold and a load task of the first device comprises a screenshot task, updating parameters of the screenshot task of the first device, the updating comprising: reducing the number of paths of video sources corresponding to the first device;
uploading pictures corresponding to the monitoring area to a second device, so that the second device determines monitoring object information corresponding to the pictures and a third device performs carousel on the pictures according to a carousel frequency set by a user; the monitoring object information comprises: a density of monitoring objects in the picture within the monitoring area corresponding to the picture, and/or relationship information between a first monitoring object and a second monitoring object in the picture; the density is used for determining a cargo stacking condition; the relationship information comprises: whether cargo is on a vehicle; and outputting corresponding prompt information in a case that the cargo stacking condition represented by the density reaches a first preset value or a vehicle loading and unloading condition represented by the relationship information reaches a second preset value.
2. The method of claim 1, further comprising:
and according to the received live broadcast instruction, acquiring a real-time video stream corresponding to the monitoring area from a video source, and uploading the real-time video stream to the second equipment.
3. The method of claim 1, wherein managing the channel between the first device and the second device comprises: a first channel and a second channel.
4. The method of claim 3, wherein the second channel is a backup of the first channel to enable management of the first device over the second channel if the first channel is unavailable.
5. The method of claim 1, wherein the hardware resource information of the first device matches the load task information corresponding to the first device; the load task comprises: a screenshot task, or a screenshot task and a live task.
6. The method of claim 5, wherein the parameters of the screenshot task include: screenshot frequency, and/or the number of paths of the video source.
7. The method according to any one of claims 1 to 5, wherein the first device implements the function of a monitoring application using container technology, the monitoring application being configured to perform the method according to the embodiment of the present application, an image of the monitoring application being stored in a private repository of the container, a key of the private repository being provided by the second device.
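The load-based degradation in claims 1, 5, and 6 can be sketched as follows: when hardware resource usage reaches a preset threshold and the device's load tasks include a screenshot task, the screenshot parameters are updated by reducing the number of video source channels and/or the screenshot frequency. The threshold value and halving policy below are illustrative assumptions.

```python
CPU_THRESHOLD = 0.8  # preset threshold (assumed value)

def update_screenshot_task(cpu_usage, load_tasks, params):
    """Return updated screenshot parameters for the first device.

    cpu_usage: hardware resource information as a 0..1 fraction.
    load_tasks: set of load task names, e.g. {"screenshot", "live"}.
    params: dict with "video_channels" and "screenshot_hz".
    """
    if cpu_usage >= CPU_THRESHOLD and "screenshot" in load_tasks:
        params = dict(params)  # leave the caller's dict untouched
        # Reduce the number of video source channels (claim 1) ...
        params["video_channels"] = max(1, params["video_channels"] - 1)
        # ... and/or the screenshot frequency (claim 6).
        params["screenshot_hz"] = params["screenshot_hz"] / 2
    return params

params = {"video_channels": 4, "screenshot_hz": 2.0}
updated = update_screenshot_task(0.9, {"screenshot", "live"}, params)
```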
8. A data processing method, applied to a second device, the method comprising:
receiving a picture corresponding to a monitoring area from a first device; the picture corresponding to the monitoring area is obtained from a screenshot of the video corresponding to the monitoring area; in a case where the hardware resource information of the first device reaches a preset threshold and the load tasks of the first device include a screenshot task, a parameter of the screenshot task of the first device is updated; the updating comprises: reducing the number of video source channels corresponding to the first device;
storing the picture corresponding to the monitoring area;
determining monitoring object information corresponding to the picture; the monitoring object information comprises: a density of monitoring objects in the picture within the monitoring area corresponding to the picture, and/or relationship information between a first monitoring object and a second monitoring object in the picture; the density is used for determining a cargo accumulation condition; the relationship information comprises: whether cargo is on a vehicle;
outputting corresponding prompt information in a case where the cargo accumulation condition represented by the density reaches a first preset value or the vehicle loading/unloading condition represented by the relationship information reaches a second preset value;
and sending the picture corresponding to the monitoring area to a third device, so that the third device plays the pictures in carousel at the carousel frequency set by the user.
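The second device's analysis in claim 8 can be sketched as follows: the density of monitoring objects stands for the cargo accumulation condition, the first/second object relationship ("is cargo on the vehicle") stands for the loading/unloading condition, and a prompt is emitted when either reaches its preset value. The thresholds, bounding-box input, and prompt strings are illustrative assumptions.

```python
DENSITY_LIMIT = 0.6       # first preset value (assumed)
LOADING_DONE_RATIO = 1.0  # second preset value (assumed)

def analyze_picture(object_boxes, area, cargo_on_vehicle):
    """Return (density, prompts) for one monitoring-area picture.

    object_boxes: (width, height) of each detected monitoring object.
    area: area of the monitoring region covered by the picture.
    cargo_on_vehicle: fraction of expected cargo detected on the vehicle.
    """
    covered = sum(w * h for (w, h) in object_boxes)
    density = covered / area
    prompts = []
    if density >= DENSITY_LIMIT:
        prompts.append("cargo accumulation exceeds preset value")
    if cargo_on_vehicle >= LOADING_DONE_RATIO:
        prompts.append("vehicle fully loaded")
    return density, prompts

density, prompts = analyze_picture([(3, 4), (5, 6)], area=60.0, cargo_on_vehicle=1.0)
```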
9. The method of claim 8, further comprising:
determining service information corresponding to the monitoring area according to the picture corresponding to the monitoring area;
and sending the service information corresponding to the monitoring area to a third device.
10. The method according to claim 9, wherein determining the service information corresponding to the monitoring area comprises:
determining monitoring object information corresponding to the picture;
and outputting, in a case where the monitoring object information corresponding to the picture meets a service condition, service prompt information corresponding to the service condition.
11. The method according to claim 10, wherein determining the monitoring object information corresponding to the picture comprises:
determining the monitoring object information corresponding to the picture according to a mapping relationship between pictures and monitoring object information; the mapping relationship is obtained from picture samples and the labeled monitoring object information corresponding to the picture samples.
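Claim 11 describes a mapping from pictures to monitoring object information learned from labeled picture samples. A 1-nearest-neighbor lookup over simple feature vectors is used below as a stand-in for that learned mapping; the feature vectors and labels are illustrative assumptions, not the patent's model.

```python
def fit_mapping(samples):
    """samples: list of (feature_vector, monitoring_object_label) pairs."""
    return list(samples)

def predict(mapping, feature_vector):
    """Return the label of the nearest labeled picture sample."""
    def dist(a, b):
        # Squared Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(mapping, key=lambda sample: dist(sample[0], feature_vector))[1]

mapping = fit_mapping([
    ((0.9, 0.1), "cargo"),    # labeled picture sample
    ((0.1, 0.9), "vehicle"),  # labeled picture sample
])
label = predict(mapping, (0.8, 0.2))  # closest to the "cargo" sample
```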
12. The method of claim 10, wherein the monitoring objects comprise: at least one of cargo, persons, and vehicles.
13. The method according to claim 10, wherein determining the monitoring object information corresponding to the picture comprises:
determining a loading/unloading start time in a case where both a vehicle and a person are present in the picture;
determining a loading/unloading end time in a case where the vehicle leaves in the picture;
and determining loading/unloading efficiency information according to the loading/unloading start time and the loading/unloading end time.
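The timing logic of claim 13 can be sketched as a scan over time-ordered pictures: the start time is the first picture in which both a vehicle and a person appear, the end time is the first later picture in which the vehicle has left, and efficiency information is derived from the two. The frame record format and the duration-based efficiency measure are assumptions.

```python
def loading_efficiency(frames):
    """frames: time-ordered list of (timestamp, has_vehicle, has_person)."""
    start = end = None
    for ts, has_vehicle, has_person in frames:
        if start is None and has_vehicle and has_person:
            start = ts  # loading/unloading start time
        elif start is not None and not has_vehicle:
            end = ts    # loading/unloading end time (vehicle left)
            break
    if start is None or end is None:
        return None     # loading/unloading not (yet) observed
    return end - start  # duration as a simple efficiency measure

duration = loading_efficiency([
    (0, False, False),
    (10, True, True),    # vehicle and person present -> start
    (40, True, True),
    (55, False, False),  # vehicle left -> end
])
```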
14. The method according to claim 10, wherein determining the monitoring object information corresponding to the picture comprises:
determining a target data analyzer corresponding to the picture according to a first feature corresponding to the picture and second features corresponding to data analyzers;
and determining the monitoring object information corresponding to the picture using the target data analyzer.
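Claim 14 selects a target data analyzer by matching a first feature of the picture against second features of the available analyzers. In the sketch below, each "feature" is a set of scene tags and the analyzer with the largest overlap wins; the tag sets and analyzer names are illustrative assumptions.

```python
# Second features: one tag set per available data analyzer (assumed names).
ANALYZERS = {
    "cargo_density": {"cargo", "warehouse"},
    "vehicle_loading": {"vehicle", "cargo", "dock"},
    "person_counter": {"person"},
}

def select_analyzer(picture_tags):
    """Pick the analyzer whose second feature best overlaps the picture's first feature."""
    return max(ANALYZERS, key=lambda name: len(ANALYZERS[name] & picture_tags))

chosen = select_analyzer({"vehicle", "cargo"})  # overlaps "vehicle_loading" on two tags
```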
15. The method of any one of claims 8 to 14, further comprising:
issuing a live broadcast instruction to the first device;
receiving a real-time video stream corresponding to the monitoring area uploaded by the first device;
and distributing a link to the real-time video stream corresponding to the monitoring area to a third device.
16. The method of claim 15, wherein issuing the live broadcast instruction to the first device comprises:
determining a target first device from a plurality of first devices corresponding to the live broadcast instruction according to feature information corresponding to the first devices; the feature information comprises: hardware resource information and/or load task information;
and issuing the live broadcast instruction to the target first device.
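Claim 16's selection of the target first device from its feature information can be sketched as a scoring pass over candidate devices; favoring the least-loaded device, and the 0.1 per-task weight, are illustrative assumptions.

```python
def pick_live_device(devices):
    """devices: list of dicts with 'name', 'cpu' (0..1 usage) and 'tasks' (set)."""
    def score(dev):
        # Lower hardware resource usage and fewer load tasks make a
        # device preferable for taking on the live broadcast task.
        return dev["cpu"] + 0.1 * len(dev["tasks"])
    return min(devices, key=score)["name"]

target = pick_live_device([
    {"name": "dev-a", "cpu": 0.9, "tasks": {"screenshot", "live"}},
    {"name": "dev-b", "cpu": 0.3, "tasks": {"screenshot"}},
])
```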
17. A data processing method, applied to a third device, the method comprising:
receiving a picture corresponding to a monitoring area from a second device; the picture corresponding to the monitoring area is obtained by a first device from a screenshot of the video corresponding to the monitoring area; in a case where the hardware resource information of the first device reaches a preset threshold and the load tasks of the first device include a screenshot task, a parameter of the screenshot task of the first device is updated; the updating comprises: reducing the number of video source channels corresponding to the first device; the second device determines monitoring object information corresponding to the picture; the monitoring object information comprises: a density of monitoring objects in the picture within the monitoring area corresponding to the picture, and/or relationship information between a first monitoring object and a second monitoring object in the picture; the density is used for determining a cargo accumulation condition; the relationship information comprises: whether cargo is on a vehicle; and corresponding prompt information is output in a case where the cargo accumulation condition represented by the density reaches a first preset value or the vehicle loading/unloading condition represented by the relationship information reaches a second preset value;
and processing the picture corresponding to the monitoring area, comprising: playing the plurality of pictures in carousel at a carousel frequency set by a user.
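The third device's carousel in claim 17 cycles pictures at a user-set carousel frequency. The scheduling function below, which returns the picture shown at a given time, is an illustrative assumption about how such a carousel could be driven.

```python
def picture_at(pictures, carousel_hz, t):
    """Return the picture shown at time t (seconds) for a given carousel frequency (switches/second)."""
    index = int(t * carousel_hz) % len(pictures)
    return pictures[index]

pics = ["area-1.jpg", "area-2.jpg", "area-3.jpg"]
shown = picture_at(pics, carousel_hz=0.5, t=5.0)  # one switch every 2 seconds
```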
18. The method of claim 17, further comprising:
receiving, from the second device, service information corresponding to the monitoring area, wherein the service information is obtained according to the picture.
19. The method of claim 18, wherein the service information comprises: service prompt information for prompting that a monitoring object corresponding to the monitoring area meets a service condition.
20. The method of claim 17, further comprising:
sending a live broadcast request to the second device after a user logs in;
and receiving, from the second device, a link to a real-time video stream corresponding to the monitoring area; the real-time video stream corresponding to the monitoring area corresponds to the live broadcast request.
21. The method of claim 20, wherein the link corresponds to a deadline.
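Claim 21 gives the distributed stream link a deadline. One common way to realize an expiring link is an HMAC-signed URL, sketched below; the shared secret, URL layout, and expiry encoding are illustrative assumptions, not details from the patent.

```python
import hashlib
import hmac

SECRET = b"demo-secret"  # assumed shared secret between second and third device

def make_link(stream_id, expires_at):
    """Build a stream link that is only valid until the given deadline."""
    sig = hmac.new(SECRET, f"{stream_id}:{expires_at}".encode(), hashlib.sha256).hexdigest()
    return f"/live/{stream_id}?exp={expires_at}&sig={sig}"

def link_valid(link, now):
    """Check the signature and the deadline of a link produced by make_link."""
    path, _, query = link.partition("?")
    stream_id = path.rsplit("/", 1)[-1]
    params = dict(kv.split("=") for kv in query.split("&"))
    expires_at = int(params["exp"])
    expected = hmac.new(SECRET, f"{stream_id}:{expires_at}".encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, params["sig"]) and now < expires_at

link = make_link("area-7", expires_at=1000)
```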
22. A data processing apparatus, applied to a first device, the apparatus comprising:
a video acquisition module, configured to acquire a video corresponding to a monitoring area from a video source;
a video screenshot module, configured to take a screenshot of the video corresponding to the monitoring area to obtain a picture corresponding to the monitoring area, a third device playing the pictures in carousel at a carousel frequency set by a user; in a case where the hardware resource information of the first device reaches a preset threshold and the load tasks of the first device include a screenshot task, a parameter of the screenshot task of the first device is updated; the updating comprises: reducing the number of video source channels corresponding to the first device;
and
a picture uploading module, configured to upload the picture corresponding to the monitoring area to a second device, so that the second device determines monitoring object information corresponding to the picture; the monitoring object information comprises: a density of monitoring objects in the picture within the monitoring area corresponding to the picture, and/or relationship information between a first monitoring object and a second monitoring object in the picture; the density is used for determining a cargo accumulation condition; the relationship information comprises: whether cargo is on a vehicle; and corresponding prompt information is output in a case where the cargo accumulation condition represented by the density reaches a first preset value or the vehicle loading/unloading condition represented by the relationship information reaches a second preset value.
23. A data processing apparatus, applied to a second device, the apparatus comprising:
a picture receiving module, configured to receive a picture corresponding to a monitoring area from a first device, and to send the picture corresponding to the monitoring area to a third device so that the third device plays the pictures in carousel at a carousel frequency set by a user; the picture corresponding to the monitoring area is obtained from a screenshot of the video corresponding to the monitoring area; in a case where the hardware resource information of the first device reaches a preset threshold and the load tasks of the first device include a screenshot task, a parameter of the screenshot task of the first device is updated; the updating comprises: reducing the screenshot frequency of the screenshot task corresponding to the first device and/or the number of video source channels;
and
a picture storage module, configured to store the picture corresponding to the monitoring area;
an object information determining module, configured to determine monitoring object information corresponding to the picture; the monitoring object information comprises: a density of monitoring objects in the picture within the monitoring area corresponding to the picture, and/or relationship information between a first monitoring object and a second monitoring object in the picture; the density is used for determining a cargo accumulation condition; the relationship information comprises: whether cargo is on a vehicle; and corresponding prompt information is output in a case where the cargo accumulation condition represented by the density reaches a first preset value or the vehicle loading/unloading condition represented by the relationship information reaches a second preset value.
24. A data processing apparatus, applied to a third device, the apparatus comprising:
a picture receiving module, configured to receive a picture corresponding to a monitoring area from a second device; the picture corresponding to the monitoring area is obtained by a first device from a screenshot of the video corresponding to the monitoring area; in a case where the hardware resource information of the first device reaches a preset threshold and the load tasks of the first device include a screenshot task, a parameter of the screenshot task of the first device is updated; the updating comprises: reducing the number of video source channels corresponding to the first device; the second device determines monitoring object information corresponding to the picture; the monitoring object information comprises: a density of monitoring objects in the picture within the monitoring area corresponding to the picture, and/or relationship information between a first monitoring object and a second monitoring object in the picture; the density is used for determining a cargo accumulation condition; the relationship information comprises: whether cargo is on a vehicle; and corresponding prompt information is output in a case where the cargo accumulation condition represented by the density reaches a first preset value or the vehicle loading/unloading condition represented by the relationship information reaches a second preset value;
and
a picture processing module, configured to process the picture corresponding to the monitoring area, comprising: playing the plurality of pictures in carousel at a carousel frequency set by a user.
25. An electronic device, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the electronic device to perform the method recited by one or more of claims 1-7.
26. One or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an electronic device to perform the method recited by one or more of claims 1-7.
27. An electronic device, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the electronic device to perform the method recited by one or more of claims 8-16.
28. One or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an electronic device to perform the method recited by one or more of claims 8-16.
29. An electronic device, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon, which when executed by the one or more processors, cause the electronic device to perform the method recited by one or more of claims 17-21.
30. One or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an electronic device to perform the method recited by one or more of claims 17-21.
CN201910245758.4A 2019-03-28 2019-03-28 Data processing method, device, equipment and machine readable medium Active CN111757046B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910245758.4A CN111757046B (en) 2019-03-28 2019-03-28 Data processing method, device, equipment and machine readable medium


Publications (2)

Publication Number Publication Date
CN111757046A CN111757046A (en) 2020-10-09
CN111757046B true CN111757046B (en) 2022-09-30

Family

ID=72672017

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910245758.4A Active CN111757046B (en) 2019-03-28 2019-03-28 Data processing method, device, equipment and machine readable medium

Country Status (1)

Country Link
CN (1) CN111757046B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116095414A (en) * 2022-10-12 2023-05-09 京东科技信息技术有限公司 Method and device for acquiring video screenshot

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106603553A (en) * 2016-12-29 2017-04-26 大连陆海科技股份有限公司 Boat-coast video snapshot transmission system and transmission method
CN107566786A (en) * 2017-08-11 2018-01-09 深圳英飞拓科技股份有限公司 A kind of method, apparatus and terminal device for obtaining monitor video
CN108307147A (en) * 2017-12-28 2018-07-20 天地融科技股份有限公司 A kind of method and system carrying out security control using safety equipment
CN109168041A (en) * 2018-09-26 2019-01-08 深圳壹账通智能科技有限公司 A kind of mobile terminal monitored method and system

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6711461B2 (en) * 2001-03-20 2004-03-23 Lockheed Martin Corporation Object and method for accessing of articles for reliable knowledge of article positions
CN100542255C (en) * 2006-09-26 2009-09-16 腾讯科技(深圳)有限公司 A kind of network TV monitoring system and method
JP2010152728A (en) * 2008-12-25 2010-07-08 Seiko Precision Inc Management system, management method, program, management device and on-vehicle machine
US20100191809A1 (en) * 2009-01-23 2010-07-29 Alcatel-Lucent Usa Inc. Capability to capture and share image displayed from multicast video
CN103546766B (en) * 2013-11-14 2017-02-08 腾讯科技(成都)有限公司 Video processing method, related equipment and communication system
CN105744215B (en) * 2014-12-09 2019-08-06 视联动力信息技术股份有限公司 A kind of interactive mode monitoring system
CN107403249A (en) * 2016-05-19 2017-11-28 阿里巴巴集团控股有限公司 Article control method, device, intelligent storage equipment and operating system
CN206194068U (en) * 2016-11-18 2017-05-24 广州图卫科技股份有限公司 Forest fire prevention early warning system
CN108182607A (en) * 2018-01-29 2018-06-19 河北三川科技有限公司 The advertisement monitoring system and method uploaded based on terminal sectional drawing
CN108733821A (en) * 2018-05-22 2018-11-02 武汉微创光电股份有限公司 A kind of distribution of monitor video sectional drawing and methods of exhibiting and system



Similar Documents

Publication Publication Date Title
US11290537B1 (en) Discovery of device capabilities
US20210326128A1 (en) Edge Computing Platform
US11501881B2 (en) Apparatus and method for deploying a mobile device as a data source in an IoT system
US11233826B2 (en) System and method of microservice-based application deployment with automating authorization configuration
KR101797416B1 (en) Method and device for processing request
US11765123B1 (en) Receiving a data object at a device
US10635687B2 (en) Delivering a data object to a device
US11356537B2 (en) Self-learning connected-device network
JP7453426B2 (en) Network management systems, methods, devices and electronic equipment
US10958536B2 (en) Data management policies for internet of things components
Xiong et al. Design and implementation of a prototype cloud video surveillance system
US20140207942A1 (en) Network element diagnostic evaluation
CN111757046B (en) Data processing method, device, equipment and machine readable medium
Limna et al. A flexible and scalable component-based system architecture for video surveillance as a service, running on infrastructure as a service
CN114610442A (en) One-stop cloud migration system, method, equipment and storage medium
US11636679B2 (en) Apparatus and method for detecting suspicious content
US20190082230A1 (en) Controlling internet of things (iot) devices and aggregating media content through a common device
US8774599B2 (en) Method for transcoding and playing back video files based on grid technology in devices having limited computing power
US20120265879A1 (en) Managing servicability of cloud computing resources
US20230280997A1 (en) Automated process and system update scheduling in a computer network
US20190036880A1 (en) Automated firewall-compliant customer support resolution provisioning system
EP3688588B1 (en) Receiving a data object at a device
Perez et al. An experimental publish-subscribe monitoring assessment to Beyond 5G networks
US11922161B2 (en) Scheduling a pausable automated process in a computer network
US11792135B2 (en) Automated process scheduling in a computer network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant