CN112702612A - Method, device and system for dynamically reading video - Google Patents

Method, device and system for dynamically reading video

Info

Publication number
CN112702612A
CN112702612A (application number CN201911009331.0A)
Authority
CN
China
Prior art keywords
video address
video
camera
address
failure
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911009331.0A
Other languages
Chinese (zh)
Inventor
刘川江
章儒洪
徐晓晨
曹建
杨健
司良君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Shundian Technology Co ltd
Chengdu Yuanben Innovation Technology Co ltd
Original Assignee
Chengdu Shundian Technology Co ltd
Chengdu Yuanben Innovation Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Shundian Technology Co ltd and Chengdu Yuanben Innovation Technology Co ltd
Priority to CN201911009331.0A
Publication of CN112702612A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 Server components or server architectures
    • H04N 21/218 Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 43/00 Arrangements for monitoring or testing data switching networks
    • H04L 43/08 Monitoring or testing based on specific metrics, e.g. QoS, energy consumption or environmental parameters
    • H04L 43/0805 Monitoring or testing based on specific metrics by checking availability
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N 21/65 Transmission of management data between client and server
    • H04N 21/658 Transmission by the client directed to the server

Landscapes

  • Engineering & Computer Science
  • Signal Processing
  • Multimedia
  • Databases & Information Systems
  • Environmental & Geological Engineering
  • Computer Networks & Wireless Communication
  • Information Transfer Between Computers

Abstract

The application provides a method, a device, and a system for dynamically reading video. The method comprises the following steps: acquiring camera-related information, which includes camera point-location information, the video address corresponding to the camera, and the video service provider information corresponding to that video address; determining the video address type based on the camera point-location information, the types being irregular-failure video address, timed-failure video address, and permanently valid video address; updating the video address according to its type so that a valid address is maintained; and reading the video based on the valid video address.

Description

Method, device and system for dynamically reading video
Technical Field
The present invention relates to the field of video technologies, and in particular, to a method, an apparatus, and a system for dynamically reading a video.
Background
Food safety is currently a prominent problem, particularly for subjects such as school canteens. Third-party supervision platforms monitor these subjects through video in order to achieve real-time supervision. However, different subjects may choose different video service providers, and different providers supply live-stream addresses in different ways (addresses may fail irregularly, fail at fixed times, or remain permanently valid), which can interrupt the live video service. To ensure that users enjoy a continuous and stable live service, a third-party supervision platform needs a method for dynamically reading video.
Disclosure of Invention
One aspect of the present application relates to a method of dynamically reading video, the method comprising: acquiring camera-related information, which includes camera point-location information, the video address corresponding to the camera, and the video service provider information corresponding to the video address; determining the video address type based on the camera point-location information, the types being irregular-failure video address, timed-failure video address, and permanently valid video address; updating the video address according to its type so that a valid address is maintained; and reading the video based on the valid video address.
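The four summarized steps can be sketched as follows. This is a minimal illustration under assumptions, not the patented implementation: all names here (`AddressType`, `classify_address`, `read_video`, the `expiry_policy` field, and the caller-supplied `refresh` function) are hypothetical.

```python
from enum import Enum

class AddressType(Enum):
    IRREGULAR = "fails at unpredictable times"
    TIMED = "fails after a fixed validity period"
    PERMANENT = "never fails"

def classify_address(point_info: dict) -> AddressType:
    # Hypothetical rule: assume the point-location record carries an
    # "expiry_policy" field filled in from the provider information.
    return AddressType[point_info["expiry_policy"]]

def read_video(camera_info: dict, refresh) -> str:
    """Return a currently valid video address for one camera.

    `refresh(address, provider)` is a caller-supplied function that
    obtains a fresh address from the provider (see Figs. 7-8)."""
    addr_type = classify_address(camera_info["point_info"])
    address = camera_info["video_address"]
    if addr_type is not AddressType.PERMANENT:
        # only non-permanent addresses ever need updating
        address = refresh(address, camera_info["provider"])
    return address
```

A permanently valid address is returned unchanged; the other two types are routed through the update step before the video is read.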
In some embodiments, the camera point-location information includes: the camera number, the area where the camera is located, and the ID of the subject to which the camera belongs.
In some embodiments, the method further comprises determining, based on a network detection command, whether the video address has failed: if the network of the server hosting the video address is normal, the address has not failed; otherwise, it has failed.
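One of the detection methods listed below, actually connecting to the video address, could be sketched like this. It is an assumption-laden illustration rather than the patent's implementation: the function name and the default RTSP port 554 are choices made here, not details from the text.

```python
import socket
from urllib.parse import urlparse

def address_is_stale(video_address: str, timeout: float = 3.0) -> bool:
    """Return True when the address should be treated as failed.

    Implements the 'actually connect to the video address' check: if
    the server behind the address accepts a TCP connection, its network
    is normal and the address is treated as still valid."""
    parsed = urlparse(video_address)
    host = parsed.hostname
    port = parsed.port or 554  # assumed default RTSP port
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return False  # server reachable: not failed
    except OSError:
        return True       # unreachable: treat as failed
```

A ping- or traceroute-based check would follow the same pattern, only with a different probe in the middle.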
In some embodiments, the network detection command comprises: a PING command, a network management protocol command (e.g., SNMP), traceroute, or an actual connection attempt to the video address.
In some embodiments, updating an irregular-failure video address comprises: configuring an update time; at the update time, determining, based on the network detection command, whether the irregular-failure video address has failed; if it has failed, requesting a valid address based on the video service provider information corresponding to the failed address and overwriting the failed address with the valid one; and if it has not failed, performing no update.
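A minimal sketch of one pass of this update, assuming the staleness check and the provider request are supplied by the caller; every name here is illustrative:

```python
def update_irregular_address(camera: dict, is_stale, request_new) -> dict:
    """One pass of the Fig. 7 update: overwrite the irregular-failure
    address only when the network check reports that it has failed.

    `is_stale(address)` is a network-detection check (e.g. ping or a
    connection attempt); `request_new(provider)` asks that provider for
    a fresh, valid address. Both are caller-supplied assumptions."""
    if is_stale(camera["video_address"]):
        # failed: fetch a valid address and overwrite the failed one
        camera["video_address"] = request_new(camera["provider"])
    # else: the address still works, so nothing is updated
    return camera
```

A scheduler (for example, a job fired at each configured update time) would call this function repeatedly.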
In some embodiments, updating a timed-failure video address comprises: obtaining the validity period of the timed-failure video address; within the validity period, requesting a new valid address based on the video service provider information corresponding to the timed-failure address; storing the new valid address; and replacing the timed-out address with the stored new valid address.
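A hedged sketch of this timed update follows; the five-minute refresh margin and the `(address, expires_at)` return shape of `request_new` are assumptions made for illustration, not details from the text.

```python
from datetime import datetime, timedelta

REFRESH_MARGIN = timedelta(minutes=5)  # assumed safety margin

def update_timed_address(camera: dict, request_new, now: datetime) -> dict:
    """Fig. 8 sketch: within the validity period, request and store a
    replacement address; once the old address times out, swap it in.

    `request_new(provider)` must return `(address, expires_at)`."""
    if now >= camera["expires_at"] - REFRESH_MARGIN and "pending_address" not in camera:
        # still inside the validity period: fetch and store a replacement
        fresh_addr, fresh_expiry = request_new(camera["provider"])
        camera["pending_address"] = fresh_addr
        camera["pending_expires_at"] = fresh_expiry
    if now >= camera["expires_at"] and "pending_address" in camera:
        # the old address has timed out: replace it with the stored one
        camera["video_address"] = camera.pop("pending_address")
        camera["expires_at"] = camera.pop("pending_expires_at")
    return camera
```

Because the replacement is fetched before expiry, the stream can switch addresses without an observable gap.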
Another aspect of the present application relates to a device for dynamically reading video, comprising: an acquisition module for acquiring camera-related information, which includes camera point-location information, the video address corresponding to the camera, and the video service provider information corresponding to the video address; a determining module for determining the video address type based on the camera point-location information, the types being irregular-failure video address, timed-failure video address, and permanently valid video address; an updating module for updating the video address according to its type so that a valid address is maintained; and a reading module for reading the video based on the valid video address.
In some embodiments, the camera point-location information includes: the camera number, the area where the camera is located, and the ID of the subject to which the camera belongs.
In some embodiments, the updating module updates an irregular-failure video address through: a configuration unit for configuring an update time; a judging unit for determining, at the update time and based on a network detection command, whether the irregular-failure video address has failed; an acquiring unit for requesting, when the address has failed, a valid address based on the video service provider information corresponding to the failed address; and an updating unit for overwriting the failed address with the valid one. If the address has not failed, no update is performed.
In some embodiments, the network detection command comprises: a PING command, a network management protocol command (e.g., SNMP), traceroute, or an actual connection attempt to the video address.
In some embodiments, the updating module updates a timed-failure video address by: determining the validity period of the timed-failure video address; an acquiring unit requesting, within the validity period, a new valid address based on the video service provider information corresponding to the timed-failure address; storing the new valid address; and overwriting the timed-out address with the stored new valid address.
Yet another aspect of the present application relates to a system for dynamically reading video, comprising: a memory including instructions; and a processor configured to execute the instructions to: acquire camera-related information, which includes camera point-location information, the video address corresponding to the camera, and the video service provider information corresponding to the video address; determine the video address type based on the camera point-location information, the types being irregular-failure video address, timed-failure video address, and permanently valid video address; update the video address according to its type so that a valid address is maintained; and read the video based on the valid video address.
Yet another aspect of the application relates to a non-transitory computer-readable storage medium storing instructions that, when executed by one or more processors, cause the processors to perform a method of dynamically reading video, the method comprising: acquiring camera-related information, which includes camera point-location information, the video address corresponding to the camera, and the video service provider information corresponding to the video address; determining the video address type based on the camera point-location information, the types being irregular-failure video address, timed-failure video address, and permanently valid video address; updating the video address according to its type so that a valid address is maintained; and reading the video based on the valid video address.
Additional features of the present application will be set forth in part in the description which follows. Additional features of some aspects of the present application will be apparent to those of ordinary skill in the art from the following description and accompanying drawings, or from the production or operation of the embodiments. The features of the present application may be realized and attained by means of the instruments, methods, and combinations set forth in the detailed examples discussed below.
Drawings
The present application will be further described in conjunction with the exemplary embodiments. These exemplary embodiments will be described in detail by means of the accompanying drawings. These embodiments are not intended to be limiting, and like reference numerals refer to like structures throughout these embodiments.
Fig. 1 is a schematic diagram of an exemplary dynamic read video system, shown in accordance with some embodiments of the present application.
FIG. 2 is a schematic diagram of exemplary hardware and software components of a computing device shown in accordance with some embodiments of the present application.
Fig. 3 is a diagram illustrating exemplary hardware and/or software components of a mobile device on which a terminal may be implemented according to some embodiments of the present application.
FIG. 4 is an exemplary block diagram of a processing engine shown in accordance with some embodiments of the present application.
FIG. 5 is an exemplary block diagram of an update module shown in accordance with some embodiments of the present application.
Fig. 6 is an exemplary flow diagram of a method of dynamically reading video, according to some embodiments of the present application.
Fig. 7 is an exemplary flow diagram illustrating a method of updating an irregular-failure video address according to some embodiments of the present application.
Fig. 8 is an exemplary flow diagram of a method of updating a timed-failure video address according to some embodiments of the present application.
Detailed Description
In order to more clearly explain the technical solutions of the embodiments of the present application, the drawings that are required to be used in the description of the embodiments will be briefly introduced below. It is obvious that the drawings described below are only examples or embodiments of the present application, and that for a person skilled in the art, the present application can also be applied to other similar scenarios without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
As used in this application and the appended claims, the singular forms "a," "an," and/or "the" include plural referents unless the context clearly dictates otherwise. It should be understood that the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the addition of at least one other feature, integer, step, operation, element, and/or component.
It should be appreciated that the terms "system," "engine," "unit," "module," and/or "block," as used herein, are one means for distinguishing between different components, elements, parts, or assemblies at different levels. However, these terms may be replaced by other expressions that achieve the same purpose. Although various references are made herein to certain modules in embodiments according to the present application, any number of different modules may be used and run on the client and/or server. These modules are intended to be illustrative and not to limit the scope of the present application. Different modules may be used in different aspects of the systems and/or methods.
These and other features of the present application, as well as the operation and functions of related elements, will become more apparent from the following description of the drawings, which form a part of this specification. It is to be understood, however, that the drawings are designed solely for purposes of illustration and description and are not intended to limit the scope of the application. It should be understood that the drawings are not to scale. Flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the operations of the flow diagrams are not necessarily performed exactly in order; rather, the operations may be performed in reverse order or simultaneously. Also, at least one other operation may be added to a flowchart, and at least one operation may be removed from a flowchart.
Fig. 1 is a schematic block diagram of an exemplary dynamic reading video system 100 shown in accordance with some embodiments of the present application. As shown in fig. 1, dynamic reading video system 100 may include a server 110, a network 120, at least one user terminal 130, a database 140, and at least one video service provider 150.
In some embodiments, the server 110 may be a single server or a group of servers. The server group may be centralized or distributed. In some embodiments, the server 110 may be local or remote. For example, server 110 may connect to user terminals 130, databases 140, and/or video service providers 150 via network 120 to access stored information and/or data. In some embodiments, the server 110 may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof. In some embodiments, server 110 may be implemented on a computing device 200 having at least one of the components illustrated in FIG. 2 herein.
In some embodiments, server 110 may include a processing engine 111. The processing engine 111 may process information and data related to dynamic video reading to perform at least one function described herein. In some embodiments, processing engine 111 may include at least one processing device (e.g., a single-core processing device or a multi-core processor). By way of example only, the processing engine 111 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, or the like, or any combination thereof.
Network 120 may facilitate the exchange of information and/or data. In some embodiments, at least one component of the dynamic reading video system 100 (e.g., the server 110, the user terminal 130, the database 140, the video service provider 150, etc.) may send information and/or data to other components in the dynamic reading video system 100 over the network 120. In some embodiments, the network 120 may be any form of wired or wireless network, or any combination thereof. By way of example only, network 120 may include a wired network, a fiber optic network, a telecommunications network, an intranet, the internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or the like, or any combination thereof. In some embodiments, network 120 may include at least one network access point. For example, the network 120 may include wired or wireless network access points, such as base stations and/or internet exchange points 120-1, 120-2, etc. At least one component of dynamic reading video system 100 may be connected to network 120 to exchange data and/or information.
In some embodiments, the user may be the owner of the user terminal 130, or a person other than the owner. For example, user A, the owner of user terminal 130, may use user terminal 130 to send a service request for user B, or to receive services, information, and/or instructions from server 110. In some embodiments, "user" and "user terminal" may be used interchangeably. In some embodiments, the service request may be charged or free of charge.
In some embodiments, the user terminal 130 may include a personal computer 131, a mobile device 132, a tablet computer 133, and the like, or any combination thereof. In some embodiments, the mobile device 132 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, and the like, or any combination thereof. In some embodiments, the smart home devices may include smart televisions, smart detection devices, and the like, or any combination thereof. In some embodiments, the wearable device may include a smart bracelet, smart glasses, a smart watch, a smart backpack, a smart accessory, or the like, or any combination thereof. In some embodiments, the smart mobile device may include a smart phone, a personal digital assistant (PDA), a gaming device, a navigation device, or the like, or any combination thereof. In some embodiments, the virtual reality device and/or the augmented reality device may include a virtual reality helmet, virtual reality glasses, virtual reality eyewear, an augmented reality helmet, augmented reality glasses, augmented reality eyewear, and the like, or any combination thereof. For example, a virtual reality device and/or an augmented reality device may include: Google Glass, Oculus Rift, Gear VR, etc.
In some embodiments, database 140 may store data and/or instructions obtained from user terminal 130 and/or video service provider 150. For example, the database 140 may store camera information (including camera ID, camera area code, video address corresponding to camera ID, video service provider information corresponding to video address), video address (including valid video address, invalid video address), and the like. As another example, database 140 may store data and/or instructions that server 110 may execute or use to perform the example methods described herein.
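For illustration only, one row of the camera information described above might be modeled as follows; the field names are hypothetical, and only the listed contents come from the text.

```python
from dataclasses import dataclass

@dataclass
class CameraRecord:
    """One camera entry in database 140; field names are illustrative."""
    camera_id: str              # camera ID / camera number
    area_code: str              # camera area code
    subject_id: str             # ID of the subject (e.g. a canteen) it belongs to
    video_address: str          # video address corresponding to the camera ID
    provider: str               # video service provider for that address
    address_valid: bool = True  # whether the stored address is still usable
```

The `address_valid` flag captures the valid/invalid address distinction the database keeps; a real schema would likely also record the address type and validity period.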
In some embodiments, database 140 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), or the like, or any combination thereof. Exemplary mass storage devices may include magnetic disks, optical disks, solid state disks, and the like. Exemplary removable memory may include flash drives, floppy disks, optical disks, memory cards, compact disks, magnetic tape, and the like. Exemplary volatile read-write memory may include random access memory (RAM). Exemplary RAM may include dynamic random access memory (DRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), static random access memory (SRAM), thyristor random access memory (T-RAM), zero-capacitor random access memory (Z-RAM), and the like. Exemplary read-only memories may include mask read-only memory (MROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM), digital versatile disc read-only memory, and the like. In some embodiments, database 140 may be implemented on a cloud platform. For illustration only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
The above description is for illustrative purposes only and does not limit the scope of the present application. Many alternatives, modifications, and variations will be apparent to those skilled in the art. The features, structures, methods, and other features of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments. For example, the database 140 may be a data store comprising a cloud computing platform, which may be a public cloud, a private cloud, a community cloud, a hybrid cloud, and the like. However, such variations and modifications do not depart from the scope of the present application.
Fig. 2 is a schematic diagram of exemplary hardware and/or software components of a computing device 200 shown in accordance with some embodiments of the present application. As shown in fig. 2, computing device 200 may include a processor 210, a memory 220, input/output (I/O) ports 230, and communication ports 240.
Processor 210 may execute computer instructions (e.g., program code) and perform the functions of processing engine 111 according to the techniques described herein. The computer instructions may include, for example, routines, programs, objects, components, data structures, procedures, modules, and functions that perform particular functions described herein. For example, processor 210 may process data obtained from server 110, user terminal 130, and/or any other component of dynamic reading video system 100. In some embodiments, processor 210 may include at least one hardware processor, such as a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field-programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device (PLD), any circuit or processor capable of performing at least one function, or the like, or any combination thereof.
For illustration only, FIG. 2 depicts only one processor in the computing device 200. It should be noted, however, that the computing device 200 may include multiple processors, and thus operations and/or method steps described herein as being performed by one processor may also be performed by multiple processors, collectively or independently. For example, if in the present application, the processors of computing device 200 perform process A and process B, it should be understood that process A and process B may also be performed jointly or independently by two or more different processors in computing device 200 (e.g., a first processor performs process A and a second processor performs process B; or a first processor and a second processor perform processes A and B together).
The memory 220 may store data/information obtained from the server 110, the user terminal 130, the database 140, and/or the video service provider 150. In some embodiments, memory 220 may include, but is not limited to, mass storage, removable storage, volatile read-write memory, read-only memory (ROM), or the like, or any combination thereof. Exemplary mass storage may include magnetic disks, optical disks, solid state drives, and the like. Exemplary removable storage may include flash memory, floppy disks, optical disks, memory cards, compact disks, magnetic tape, and the like. Exemplary volatile read-write memory may include random access memory (RAM). Exemplary RAM may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM), and the like. Exemplary ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disk ROM (CD-ROM), and digital versatile disk ROM, among others. In some embodiments, memory 220 may store at least one program and/or instructions to perform the example methods described herein.
I/O 230 may input and/or output signals, data, information, and the like. In some embodiments, I/O 230 may enable interaction with processing engine 111. In some embodiments, I/O 230 may include an input device and an output device. Exemplary input devices may include a keyboard, a mouse, a touch screen, a microphone, etc., or any combination thereof. Exemplary output devices may include a display device, speakers, a printer, a projector, etc., or any combination thereof. Exemplary display devices may include liquid crystal displays (LCDs), light emitting diode (LED) based displays, flat panel displays, curved screens, television devices, cathode ray tubes (CRTs), touch screens, and the like, or any combination thereof.
The communication port 240 may be connected to a network (e.g., network 120) to facilitate data communication. The communication port 240 may establish a connection between the processing engine 111 and the server 110, the user terminal 130, the database 140, or the video service provider 150. The connection may be a wired connection, a wireless connection, any other communication connection capable of enabling data transmission and/or reception, and/or any combination of these connections. The wired connection may include, for example, an electrical cable, an optical cable, a telephone line, etc., or any combination thereof. The wireless connection may comprise a Bluetooth™ connection, a Wi-Fi™ connection, a WiMAX™ connection, a WLAN connection, a ZigBee connection, a mobile network connection (e.g., 3G, 4G, 5G, 6G, etc.), etc., or any combination thereof. In some embodiments, the communication port 240 may be (or include) a standardized communication port, such as RS232, RS485, and the like. In some embodiments, the communication port 240 may be a specially designed communication port.
Fig. 3 is an exemplary hardware and/or software diagram of a mobile device on which a terminal may be implemented, according to some embodiments of the present application. The user terminal 130 may be implemented on the mobile device 300. As shown in fig. 3, mobile device 300 may include communication ports 310, a display 320, a graphics processing unit (GPU) 330, a central processing unit (CPU) 340, input/output (I/O) 350, memory 360, and storage 390. The CPU may include interface circuitry and processing circuitry similar to processor 210. In some embodiments, any other suitable component, including but not limited to a system bus or a controller (not shown), may also be included in the mobile device 300. In some embodiments, a mobile operating system 370 (e.g., iOS™, Android™, Windows Phone™, etc.) and at least one application 380 may be loaded from storage 390 into memory 360 for execution by CPU 340. The application 380 may include a browser or any other suitable mobile application for receiving and presenting information related to a service. User interaction with the information stream may be accomplished through I/O device 350 and provided to processing engine 111 and/or other components of dynamic reading video system 100 via a network.
To implement the various modules, units, and functions thereof described herein, a computer hardware platform may be used as a hardware platform for at least one of the elements described herein (e.g., the dynamic reading video system 100 and/or other components of the dynamic reading video system 100 described with respect to fig. 1). The hardware elements, operating systems, and programming languages of such computers are conventional in nature, and it is assumed that one of ordinary skill in the art is familiar enough with these techniques to adapt them for use with the dynamic video reading methods, video address updating methods, and the like described herein. A computer containing user interface elements may be used as a personal computer (PC) or other type of workstation or terminal device, suitably programmed, or may be used as a server. It is believed that one of ordinary skill in the art will be familiar with the structure, programming, and general operation of such computer devices, and accordingly the drawings should be self-explanatory.
It will be appreciated by those of ordinary skill in the art that the elements of the dynamic read video system 100 may operate through electrical and/or electromagnetic signals. For example, when the user terminal 130 needs to process a task such as making a determination, identifying, or selecting an object, the user terminal 130 may operate logic circuits in its processor to complete the task. When the user terminal 130 issues a read video request to the server 110, the processor of the user terminal 130 may generate an electrical signal encoding the request. The processor of the user terminal 130 may then send the electrical signal to an output port. If the user terminal 130 communicates with the server 110 over a wired network, the output port may be physically connected to a cable, which further transmits the electrical signal to an input port of the server 110. If the user terminal 130 communicates with the server 110 over a wireless network, the output port of the user terminal 130 may be at least one antenna that converts the electrical signal to an electromagnetic signal. Within an electronic device (e.g., the user terminal 130 and/or the server 110), when its processor processes an instruction, issues an instruction, and/or performs an action, the instruction and/or action is carried out by electrical signals. For example, when the processor retrieves or saves data from a storage medium, it may send electrical signals to a read/write device of the storage medium, which may read or write structured data in the storage medium. The structured data may be sent to the processor in the form of electrical signals over a bus of the electronic device. Herein, an electrical signal may refer to one electrical signal, a series of electrical signals, and/or at least two discrete electrical signals.
FIG. 4 is a block diagram of an exemplary processing engine according to some embodiments of the present application. As shown in FIG. 4, the processing engine may include: an acquisition module 410, a determination module 420, an update module 430, and a reading module 440.
The acquisition module 410 is configured to acquire camera related information, wherein the camera related information includes camera point location information, a video address corresponding to the camera, and video service provider information (e.g., a video service provider ID) corresponding to the video address.
A determination module 420 configured to determine a video address type based on the camera point location information, wherein the video address type includes: a sporadic failure video address, a timed failure video address, and a permanently valid video address. In some embodiments, whether the video address is valid may further be tested based on a network detection command. The network detection command test may include: a PING command, a network management protocol command, traceroute, actually connecting to the video address, etc.
An update module 430 configured to update the valid video address based on the video address type. For more description of the update module 430, reference may be made to fig. 5, which is not repeated herein.
A reading module 440 configured to read the video based on the valid video address. In some embodiments, the program that reads the video address may include a video player, a web player, a television player, and the like.
The modules may be connected to or communicate with each other via a wired or wireless connection. The wired connection may include a metal cable, an optical cable, a hybrid cable, etc., or any combination thereof. The wireless connection may include a local area network (LAN), a wide area network (WAN), Bluetooth™, a ZigBee™ network, near field communication (NFC), etc., or any combination thereof. Two or more of the modules may be combined into a single module, and any one module may be split into two or more modules.
FIG. 5 is an exemplary block diagram of the update module 430 shown according to some embodiments of the present application. As shown in fig. 5, the update module 430 may include: a judging unit 510, a configuring unit 520, an obtaining unit 530, a storing unit 540, and an updating unit 550.
A judging unit 510 configured to judge whether the sporadic failure video address and/or the timed failure video address has failed. In some embodiments, whether the timed failure video address has failed may be determined based on the valid time of the timed failure video address. In other embodiments, whether the sporadic failure video address and/or the timed failure video address is valid may also be tested based on a network detection command. The network detection command test may include a PING command, a network management protocol command, traceroute, actually connecting to the video address, etc.
A configuration unit 520 configured to configure the update time of the sporadic failure video address. In some embodiments, the update time may specify that the sporadic failure video address is updated at an interval (e.g., seconds, minutes, hours, days, months, quarters, years, etc.), such as every 2 min. In some embodiments, the update manner may include: retention, overwriting, replacement, etc.
An obtaining unit 530 configured to request a new valid video address based on the video service provider information (e.g., a video service provider ID) corresponding to the video address. For example, an API provided by the video service provider may be requested based on the corresponding video service provider ID to obtain a corresponding new and/or valid video address.
A storage unit 540 configured to store the new valid timed failure video address.
An updating unit 550 configured to update the valid video address. In some embodiments, the update manner may include retention, overwriting, replacement, etc.
The units may be connected to or communicate with each other via a wired or wireless connection. The wired connection may include a metal cable, an optical cable, a hybrid cable, etc., or any combination thereof. The wireless connection may include a local area network (LAN), a wide area network (WAN), Bluetooth™, a ZigBee™ network, near field communication (NFC), etc., or any combination thereof. Two or more units may be combined into a single unit, and any one unit may be split into two or more units.
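As a concrete illustration of the update time handled by the configuration unit 520, the configured interval could be realized with a simple background timer. A minimal sketch under stated assumptions: `update_fn` is a hypothetical callback that refreshes the failed addresses, and the 120 s default mirrors the 2 min example interval in the text.

```python
import threading

def schedule_periodic_update(update_fn, interval_s=120.0):
    """Run `update_fn` every `interval_s` seconds on a daemon timer.

    `update_fn` is a hypothetical callback that refreshes the
    sporadic failure video addresses; interval_s=120.0 mirrors the
    2 min example interval in the text.
    """
    def tick():
        update_fn()                                       # one update pass
        schedule_periodic_update(update_fn, interval_s)   # re-arm the timer

    timer = threading.Timer(interval_s, tick)
    timer.daemon = True   # do not keep the process alive for the timer
    timer.start()
    return timer
```

A real system would likely use a job scheduler or event loop instead; the sketch only shows the interval semantics.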
Fig. 6 is an exemplary flowchart of a method of dynamically reading video, according to some embodiments of the present application. In some embodiments, at least one operation of the flow 500 shown in FIG. 6 for dynamically reading video may be implemented in the dynamic read video system 100 shown in FIG. 1. For example, the flow 500 shown in FIG. 6 may be stored in the database 140 in the form of instructions and invoked and/or executed by the processing engine 111 (e.g., the processor 210 of the computing apparatus 200 shown in FIG. 2, or the CPU 340 of the mobile device 300 shown in FIG. 3). The operations of the flow 500 described below are intended to be illustrative. In some embodiments, the flow 500 may be accomplished with at least one additional operation not described. Further, the order of the operations of the flow shown in FIG. 6 and described below is not intended to be limiting.
Step 610: acquire camera related information. Step 610 may be implemented by the acquisition module 410. The camera related information includes the camera point location information and the video service provider information (e.g., a video service provider ID) corresponding to the camera. In some embodiments, the camera point location information may include a camera number, the region where the camera is located, and the subject ID of the subject to which the camera belongs. In some embodiments, the camera number may include, but is not limited to, any combination of numbers, characters, letters, words, etc. In some embodiments, the region where the camera is located may indicate any combination of the country, province, city, district, county, street, etc., where the camera is located. The region where the camera is located may be identified by a region code, which may include, but is not limited to, any combination of numbers, characters, letters, words, etc. In some embodiments, the subject may include, but is not limited to: food management places such as catering shops, small workshops, small business stores, stall vendors, school canteens (including those of kindergartens, primary and secondary schools, colleges and universities, etc.), food distributors, etc. In some embodiments, the subject ID of the subject to which the camera belongs may include, but is not limited to, any combination of numbers, characters, words, letters, etc. In some embodiments, the subject ID may indicate information such as the position, number, and orientation of the camera within the subject.
Step 620: determine the video address type based on the camera point location information. Step 620 may be implemented by the determination module 420. In some embodiments, the video address types may include: a permanently valid video address, a timed failure video address, and a sporadic failure video address. In some embodiments, the video service providers selected by the subjects (e.g., school canteens) are regional, and different video service providers provide different types of video addresses, so the video address type can be determined based on the camera point location information (e.g., the region where the camera is located). For example, if the camera belongs to a first primary school in Zhengzhou City, the video address type corresponding to the camera is a sporadic failure video address. As another example, if the camera belongs to a seventh middle school in Chengdu City, the video address type corresponding to the camera is a timed failure video address. As another example, if the camera belongs to a third school in Shenzhen City, the video address type corresponding to the camera is a permanently valid video address.
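The region-based type determination described above amounts to a lookup against provider regionality. A hedged sketch follows; the type constants, the region names, and the mapping itself are illustrative assumptions, not part of the disclosed system.

```python
# Illustrative type constants (assumed names, not from the patent).
SPORADIC_FAILURE = "sporadic_failure"
TIMED_FAILURE = "timed_failure"
PERMANENTLY_VALID = "permanently_valid"

# Assumed provider regionality: each region's video service provider
# issues one kind of video address (mapping is hypothetical).
REGION_TO_ADDRESS_TYPE = {
    "zhengzhou": SPORADIC_FAILURE,
    "chengdu": TIMED_FAILURE,
    "shenzhen": PERMANENTLY_VALID,
}

def determine_address_type(camera_info: dict) -> str:
    """Determine the video address type from camera point location info.

    Falls back to the most conservative type (sporadic failure) for
    regions not in the mapping, so unknown addresses are re-checked.
    """
    region = camera_info["region"].lower()
    return REGION_TO_ADDRESS_TYPE.get(region, SPORADIC_FAILURE)
```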
In some embodiments, whether the video address is valid may further be tested based on a network detection command. In some embodiments, the network detection command may include, but is not limited to: a PING command, a network management protocol command, traceroute, and actually connecting to the video address. If the network connection of the video address server is normal, the video address has not failed; otherwise, the video address has failed.
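One way the "actually connect to the video address" detection method could look is sketched below. The scheme handling and default ports are assumptions; a PING or network-management-protocol probe could equally be substituted.

```python
import socket
from urllib.parse import urlparse

def video_address_reachable(video_url: str, timeout: float = 3.0) -> bool:
    """Test whether the server behind a video address accepts TCP connections.

    A minimal stand-in for the 'actually connect the video address'
    detection method; the default-port choices are assumptions.
    """
    parsed = urlparse(video_url)
    host = parsed.hostname
    if host is None:
        return False  # not a parseable address: treat as failed
    port = parsed.port or (443 if parsed.scheme in ("https", "rtsps") else 80)
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True   # connection accepted: address has not failed
    except OSError:
        return False      # refused / unreachable / timed out: failed
```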
Step 630: update the valid video address based on the video address type. Step 630 may be implemented by the update module 430. In some embodiments, the failure video address types may include: a timed failure video address and a sporadic failure video address. A timed failure video address refers to a video address whose live video content is not interrupted as long as its valid time has not elapsed. A sporadic failure video address refers to a one-time video address whose live video content may be interrupted at any time. For more description of updating a failed timed failure video address and a failed sporadic failure video address, reference may be made to FIG. 7 and FIG. 8, which are not repeated herein.
Step 640: read the video based on the valid video address. Step 640 may be implemented by the reading module 440. In some embodiments, if the video address type corresponding to the camera is a permanently valid video address, the permanently valid video address corresponding to the camera stored in the database 140 can be read directly. In some embodiments, if the video address type corresponding to the camera is a timed failure video address, the valid video address corresponding to the camera stored in the database 140 can be read directly. In some embodiments, if the video address type corresponding to the camera is a sporadic failure video address, a new valid sporadic failure video address corresponding to the camera can be read directly. In some embodiments, the program that reads the video address may include a video player, a web player, a television player, etc.
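The type-dependent read path of step 640 can be summarized as a small dispatch. In this sketch, the dict-style `db` and the `request_new_address` callable are hypothetical stand-ins for the database 140 and the provider API request.

```python
def resolve_readable_address(camera_id, address_type, db, request_new_address):
    """Return a video address that can be handed to a player.

    `db` maps camera IDs to stored addresses and `request_new_address`
    stands in for the provider-API call; both interfaces are assumed.
    """
    if address_type in ("permanently_valid", "timed_failure"):
        # Permanently valid and still-valid timed addresses are read
        # straight from the stored records.
        return db[camera_id]
    # Sporadic failure addresses: fetch a fresh valid address directly.
    return request_new_address(camera_id)
```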
It should be noted that the above description of the flow 500 is for illustrative purposes only and is not intended to limit the scope of the present application. Various changes and modifications may be made by those skilled in the art under the guidance of the present application without departing from its scope. In some embodiments, the order of at least one pair of operations in the exemplary flow 500 may be adjusted. For example, the order of steps 610 and 620 may be interchanged.
Fig. 7 is an exemplary flowchart illustrating a method of updating a sporadic failure video address according to some embodiments of the present application. In some embodiments, at least one operation of the flow 600 shown in FIG. 7 may be implemented in the dynamic read video system 100 shown in FIG. 1. For example, the flow 600 shown in FIG. 7 may be stored in the database 140 in the form of instructions and invoked and/or executed by the processing engine 111 (e.g., the processor 210 of the computing apparatus 200 shown in FIG. 2, or the CPU 340 of the mobile device 300 shown in FIG. 3). The operations of the flow 600 described below are intended to be illustrative. In some embodiments, the flow 600 may be accomplished with at least one additional operation not described. Further, the order of the operations of the flow shown in FIG. 7 and described below is not intended to be limiting.
Step 710: determine, based on the camera point location information, that the video address is a sporadic failure video address. In some embodiments, the video service providers selected by the subjects (e.g., school canteens) are regional, and different video service providers provide different types of video addresses, so the video address type can be determined based on the camera point location information (e.g., the region where the camera is located). For example, if the camera belongs to a first primary school in Zhengzhou City, the video address type corresponding to the camera is a sporadic failure video address.
Step 720: configure the update time. Step 720 may be implemented by the configuration unit 520. In some embodiments, the update time may specify that the sporadic failure video address is updated at an interval (e.g., seconds, minutes, hours, days, months, quarters, years, etc.), such as every 2 s or every 2 min.
Step 730: judge whether the sporadic failure video address has failed. Step 730 may be implemented by the judging unit 510. In some embodiments, a network detection command may be used to test whether the sporadic failure video address is valid. In some embodiments, the network detection command may include, but is not limited to: a PING command, a network management protocol command, traceroute, and actually connecting to the video address. If the network connection of the video address server is normal, the sporadic failure video address has not failed; otherwise, the sporadic failure video address has failed.
In some embodiments, if the sporadic failure video address has not failed, step 760 is executed: the current sporadic failure video address, which has not failed, is retained and not updated.
In some embodiments, if the sporadic failure video address has failed, step 740 is executed to perform the update.
Step 740: request a new valid sporadic failure video address based on the video service provider information (e.g., a video service provider ID) corresponding to the failed sporadic failure video address. Step 740 may be implemented by the obtaining unit 530. For example, an API provided by the video service provider may be requested based on the corresponding video service provider ID to obtain a corresponding valid sporadic failure video address.
Step 750: overwrite or replace the failed sporadic failure video address with the valid sporadic failure video address. Step 750 may be implemented by the updating unit 550.
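Steps 730 through 760 together form one iteration of the update flow, which might be sketched as follows. Here `is_valid` and `request_new_address` are hypothetical stand-ins for the network detection command and the provider API request, and the dict-style `db` stands in for the address store.

```python
def update_sporadic_address(camera_id, db, is_valid, request_new_address):
    """One iteration of the sporadic failure update flow.

    `is_valid` (network detection command), `request_new_address`
    (provider API), and the dict-style `db` are assumed interfaces.
    """
    current = db[camera_id]
    if is_valid(current):
        return current                            # keep, do not update
    fresh = request_new_address(camera_id)        # request a valid address
    db[camera_id] = fresh                         # overwrite the failed one
    return fresh
```

Run at the configured interval (e.g., every 2 min), this keeps a failed one-time address from lingering in the store.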
It should be noted that the above description of the flow 600 is for illustrative purposes only and is not intended to limit the scope of the present application. Various changes and modifications may be made by those skilled in the art under the guidance of the present application without departing from its scope. In some embodiments, the order of at least one pair of operations in the exemplary flow 600 may be adjusted. For example, the order of steps 710 and 720 may be interchanged.
Fig. 8 is an exemplary flowchart of a method of updating a timed failure video address according to some embodiments of the present application. In some embodiments, at least one operation of the flow 700 shown in FIG. 8 may be implemented in the dynamic read video system 100 shown in FIG. 1. For example, the flow 700 shown in FIG. 8 may be stored in the database 140 in the form of instructions and invoked and/or executed by the processing engine 111 (e.g., the processor 210 of the computing apparatus 200 shown in FIG. 2, or the CPU 340 of the mobile device 300 shown in FIG. 3). The operations of the flow 700 described below are intended to be illustrative. In some embodiments, the flow 700 may be accomplished with at least one additional operation not described. Further, the order of the operations of the flow shown in FIG. 8 and described below is not intended to be limiting.
Step 810: determine, based on the camera point location information, that the video address is a timed failure video address. In some embodiments, the video service providers selected by the subjects (e.g., school canteens) are regional, and different video service providers provide different types of video addresses, so the video address type can be determined based on the camera point location information (e.g., the region where the camera is located). For example, if the camera belongs to a seventh middle school in Chengdu City, the video address type corresponding to the camera is a timed failure video address.
Step 820: determine the valid time of the timed failure video address. In some embodiments, the valid time of the timed failure video address may be determined based on a configuration file provided by the video service provider. In some embodiments, the valid time may be in seconds, minutes, hours, days, months, quarters, years, etc.
Step 830: within the valid time, request a new valid timed failure video address based on the video service provider corresponding to the timed failure video address. Step 830 may be implemented by the obtaining unit 530. In some embodiments, requesting within the valid time may refer to requesting during a period shortly before the valid time elapses, such as 2 s or 2 min before expiry. In some embodiments, the corresponding new valid timed failure video address may be obtained based on the corresponding video service provider information (e.g., a video service provider ID), such as by requesting an API provided by the video service provider.
Step 840: store the new valid timed failure video address. Step 840 may be implemented by the storage unit 540.
Step 850: replace or overwrite the timed failure video address with the stored new valid timed failure video address. Step 850 may be implemented by the updating unit 550. In some embodiments, the stored new valid timed failure video address may replace or overwrite the old timed failure video address, manually and/or automatically, within the valid time. In other embodiments, the stored new valid timed failure video address may replace or overwrite the old address, manually and/or automatically, after a period of time (e.g., 2 s, 2 min, etc.) beyond the valid time. In other embodiments, it may also be determined whether the timed failure video address has failed, and if it has failed, the stored new valid timed failure video address is used, manually and/or automatically, to replace or overwrite it.
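The proactive refresh described above could be sketched as a margin-based check against the valid time. The timestamp representation, the 120 s margin (mirroring the "2 min before expiry" example), and the callables are all assumptions.

```python
import time

def refresh_timed_address(camera_id, db, expiry_ts, request_new_address,
                          margin_s=120.0, now=time.time):
    """Refresh a timed failure address shortly before it expires.

    `expiry_ts` is an assumed UNIX timestamp for the end of the valid
    time; `request_new_address` stands in for the provider API; the
    120 s margin mirrors the '2 min before expiry' example in the text.
    """
    if now() >= expiry_ts - margin_s:
        fresh = request_new_address(camera_id)  # request while still valid
        db[camera_id] = fresh                   # store and replace old address
        return fresh
    return db[camera_id]                        # still comfortably valid
```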
It should be noted that the above description of the flow 700 is for illustrative purposes only and is not intended to limit the scope of the present application. Various changes and modifications may be made by those skilled in the art based on the description herein without departing from the scope of the present application. In some embodiments, the order of one or more operations in the exemplary flow 700 may be adjusted. For example, the order of steps 810 and 820 may be interchanged. As another example, an operation of determining whether the timed failure video address has failed may be added before step 850.
Having thus described the basic concepts, it will be apparent to those skilled in the art that the foregoing disclosure is by way of example only, and is not intended to limit the present application. Various modifications, improvements and adaptations may occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be within the spirit and scope of the exemplary embodiments of this application.
Also, this application uses specific terminology to describe embodiments of the application. For example, "one embodiment," "an embodiment," and/or "some embodiments" means a certain feature, structure, or characteristic described in connection with at least one embodiment of the application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Moreover, those of ordinary skill in the art will understand that aspects of the present application may be illustrated and described in terms of several patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, various aspects of the present application may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present application may be embodied as a computer program product comprising computer readable program code embodied in one or more computer readable media.
A computer readable signal medium may comprise a propagated data signal with computer program code embodied therein, for example, on a baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code on a computer readable signal medium may be propagated over any suitable medium, including radio, electrical cable, fiber optic cable, RF, etc., or any combination thereof.
Computer program code required for the operation of various portions of the present application may be written in any one or more programming languages, including object oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may run entirely on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or connected to an external computer (for example, through the Internet), or used in a cloud computing environment, or provided as a service such as software as a service (SaaS).
Additionally, unless explicitly stated in the claims, the order of processing elements or sequences, the use of numbers or letters, or the use of other names is not intended to limit the order of the processes and methods of the present application. While the foregoing disclosure discusses, by way of example, various presently contemplated embodiments of the invention, it is to be understood that such detail is solely for that purpose, and that the appended claims are not limited to the disclosed embodiments but are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by being installed in a hardware device, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the present application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as requiring more features than are expressly recited in the claims. Rather, claimed subject matter may lie in less than all features of a single disclosed embodiment.
In some embodiments, numbers describing quantities or properties used in the description and claims of the present application should be understood as being modified in some instances by the terms "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ a general digit-preserving rounding approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the application are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable.
All patents, patent applications, patent application publications, and other materials mentioned in this application, such as articles, books, specifications, publications, documents, and things, are hereby incorporated by reference in their entirety for all purposes, except for any prosecution file history associated with them, any of them that is inconsistent or in conflict with this document, or any of them that may have a limiting effect on the broadest scope of the claims now or later associated with this document. By way of example, if there is any inconsistency or conflict between the description, definition, and/or use of a term associated with any of the incorporated materials and that associated with this document, the description, definition, and/or use of the term in this document shall prevail.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations may also fall within the scope of the present application. Thus, by way of example and not limitation, alternative configurations of the embodiments of the present application may be considered consistent with the teachings of the present application. Accordingly, the embodiments of the present application are not limited to only those embodiments explicitly described and depicted herein.

Claims (13)

1. A method for dynamically reading video, comprising:
acquiring camera related information, wherein the camera related information comprises camera point location information, a video address corresponding to a camera and video service provider information corresponding to the video address;
determining a video address type based on the camera point location information, wherein the video address type comprises: a sporadic failure video address, a timed failure video address, and a permanently valid video address;
updating the valid video address based on the video address type; and
reading the video based on the valid video address.
2. The method of claim 1, wherein the camera point location information comprises: a camera number, a region where the camera is located, and a subject ID of a subject to which the camera belongs.
3. The method of claim 1, further comprising:
determining whether the video address has failed based on a network detection command,
wherein if the network connection of the video address server is normal, the video address has not failed,
and otherwise, the video address has failed.
4. The method of claim 3, wherein the network detection command comprises: a PING command, a network management protocol command, traceroute, or actually connecting to the video address.
5. The method of claim 1, wherein updating the sporadic failure video address comprises:
configuring an update time;
determining, within the update time, whether the sporadic failure video address has failed based on the network detection command;
if the sporadic failure video address has failed, requesting a valid sporadic failure video address based on the video service provider information corresponding to the sporadic failure video address, and overwriting the failed sporadic failure video address with the valid sporadic failure video address; and
if the sporadic failure video address has not failed, not updating it.
6. The method of claim 1, wherein updating the timed failure video address comprises:
determining a valid time of the timed failure video address;
within the valid time, requesting a new valid timed failure video address based on the video service provider information corresponding to the timed failure video address;
storing the new valid timed failure video address; and
replacing the timed failure video address with the stored new valid timed failure video address.
7. An apparatus for dynamically reading video, comprising:
an acquisition module configured to acquire camera related information, wherein the camera related information comprises camera point location information, a video address corresponding to a camera, and video service provider information corresponding to the video address;
a determination module configured to determine a video address type based on the camera point location information, wherein the video address type comprises: a sporadic failure video address, a timed failure video address, and a permanently valid video address;
an update module configured to update the valid video address based on the video address type; and
a reading module configured to read the video based on the valid video address.
8. The apparatus of claim 7, wherein the camera point location information comprises: a camera number, a region where the camera is located, and a subject ID of a subject to which the camera belongs.
9. The apparatus of claim 7, wherein the update module updating the sporadic failure video address comprises:
a configuration unit configured to configure an update time;
a judging unit configured to judge, within the update time, whether the sporadic failure video address has failed based on a network detection command;
an obtaining unit configured to, if the sporadic failure video address has failed, request a valid sporadic failure video address based on the video service provider information corresponding to the sporadic failure video address; and
an updating unit configured to overwrite the failed sporadic failure video address with the valid sporadic failure video address, wherein if the sporadic failure video address has not failed, it is not updated.
10. The apparatus of claim 9, wherein the network detection command comprises: a PING command, a network management protocol command, a traceroute command, or an actual connection to the video address.
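One of the detection options named in claim 10, actually connecting to the video address, could be sketched as a plain TCP connect to the address's host and port. This is an illustration only; the other options (PING, traceroute, network management commands) would invoke the corresponding external tools instead, and the default-port mapping below is an assumption.

```python
import socket
from urllib.parse import urlparse

# Sketch of one detection command from claim 10: "actually connect to
# the video address" - a TCP connect to the address's host and port.
def address_reachable(video_address, timeout=2.0):
    """Return True if a TCP connection to the address succeeds."""
    parsed = urlparse(video_address)
    # Fall back to common defaults when the URL carries no explicit port
    # (554 is the standard RTSP port; the mapping is an assumption here).
    port = parsed.port or {"rtsp": 554, "http": 80, "https": 443}.get(parsed.scheme, 554)
    try:
        with socket.create_connection((parsed.hostname, port), timeout=timeout):
            return True
    except OSError:
        return False
```

A caller would use it as, e.g., `address_reachable("rtsp://203.0.113.5/stream")` (hypothetical address); a `False` result triggers the refresh path of claim 9.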
11. The apparatus of claim 7, wherein, for updating the timed-expiry video address, the update module comprises:
a determining unit configured to determine the valid period of the timed-expiry video address;
an acquiring unit configured to request, within the valid period, a new valid video address from the video service provider corresponding to the timed-expiry video address;
a storage unit configured to store the new valid video address; and
an updating unit configured to replace the timed-expiry video address with the stored new valid video address.
12. A dynamic video reading system, comprising:
a memory including instructions; and
a processor configured to execute the instructions to:
acquire camera-related information, the camera-related information comprising camera point-location information, a video address corresponding to a camera, and video service provider information corresponding to the video address;
determine a video address type based on the camera point-location information, the video address type comprising: an irregularly-expiring video address, a timed-expiry video address, and a permanently valid video address;
update the valid video address based on the video address type; and
read a video based on the valid video address.
13. A non-transitory computer-readable storage medium storing instructions that, when executed by one or more processors, cause the one or more processors to perform a method of dynamically reading video, the method comprising:
acquiring camera-related information, the camera-related information comprising camera point-location information, a video address corresponding to a camera, and video service provider information corresponding to the video address;
determining a video address type based on the camera point-location information, the video address type comprising: an irregularly-expiring video address, a timed-expiry video address, and a permanently valid video address;
updating the valid video address based on the video address type; and
reading a video based on the valid video address.
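The overall claimed flow (acquire camera information, classify the address type, refresh as needed, then read) could be sketched as follows. The type names and all callbacks are illustrative placeholders, not terminology or code from the patent:

```python
# Sketch of the overall claimed flow: classify each camera's video
# address by type, refresh it accordingly, then read from the valid
# address. Permanently valid addresses need no update step.
IRREGULAR, TIMED, PERMANENT = "irregular", "timed", "permanent"

def read_video(camera_info, classify, updaters, read):
    """camera_info carries the point-location info, video address, and
    provider info described in the claims."""
    address = camera_info["video_address"]
    kind = classify(camera_info["point_location"])   # determine address type
    if kind != PERMANENT:                            # refresh expiring addresses
        address = updaters[kind](address)
    return read(address)

frames = read_video(
    {"video_address": "rtsp://cam.example/a", "point_location": {"area": "gate-1"}},
    classify=lambda loc: IRREGULAR,
    updaters={IRREGULAR: lambda a: a + "?refreshed", TIMED: lambda a: a},
    read=lambda a: f"reading {a}",
)
print(frames)  # reading rtsp://cam.example/a?refreshed
```

In a full system the two `updaters` entries would implement the claim 5 and claim 6 refresh strategies respectively.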
CN201911009331.0A 2019-10-23 2019-10-23 Method, device and system for dynamically reading video Pending CN112702612A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911009331.0A CN112702612A (en) 2019-10-23 2019-10-23 Method, device and system for dynamically reading video

Publications (1)

Publication Number Publication Date
CN112702612A true CN112702612A (en) 2021-04-23

Family

ID=75505244

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911009331.0A Pending CN112702612A (en) 2019-10-23 2019-10-23 Method, device and system for dynamically reading video

Country Status (1)

Country Link
CN (1) CN112702612A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5761741A (en) * 1994-03-24 1998-06-02 Discovision Associates Technique for addressing a partial word and concurrently providing a substitution field
CN103067514A (en) * 2012-12-29 2013-04-24 深圳先进技术研究院 Cloud computing resource optimization method and cloud computing resource optimization system used for video monitoring and analysis system
US8463915B1 (en) * 2010-09-17 2013-06-11 Google Inc. Method for reducing DNS resolution delay
CN103546830A (en) * 2013-10-28 2014-01-29 Tcl集团股份有限公司 Method and system for processing video address failure
CN104518955A (en) * 2013-09-27 2015-04-15 广州市千钧网络科技有限公司 Video uploading method and system
CN106293791A (en) * 2015-05-29 2017-01-04 四川效率源信息安全技术有限责任公司 Data extraction method based on Dahua embedded security device
EP3127298A1 (en) * 2014-03-31 2017-02-08 Google, Inc. Specifying a mac address based on location
CN107666592A (en) * 2017-08-18 2018-02-06 深圳市艾特智能科技有限公司 Camera head monitor method, system, storage medium and computer equipment


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JIANG Shiguang et al.: "Design of a 4G-based real-time transmission system for locomotive monitoring video", Science and Technology Innovation Herald *

Similar Documents

Publication Publication Date Title
US11362923B2 (en) Techniques for infrastructure analysis of internet-based activity
DE202017105760U1 (en) Decomposition of dynamic graphical user interfaces
US11064053B2 (en) Method, apparatus and system for processing data
AU2018282441A1 (en) Systems and methods for determining an optimal strategy
CN107770146B (en) User data authority control method and device
CN110990090A (en) Dynamic wallpaper display method, device and computer readable medium
US20200133951A1 (en) Systems and methods for data storage and data query
CN109376192A (en) User retention analysis method, apparatus, electronic device and storage medium
CN109343863B (en) Interface configuration method and system for HDFS (Hadoop distributed File System) permission
US9892199B2 (en) Specialized virtual personal assistant setup
CN108810144A (en) Data transmission method, server and storage medium
CN114021016A (en) Data recommendation method, device, equipment and storage medium
CN111859077A (en) Data processing method, device, system and computer readable storage medium
CN112702612A (en) Method, device and system for dynamically reading video
US20200125616A1 (en) Systems and methods for data processing related to an online to offline service
CN109600604B (en) Contrast testing method, device and computer readable storage medium
JP2020123323A (en) Method, apparatus, device, and storage medium for providing visual representation of set of objects
US20210026981A1 (en) Methods and apparatuses for processing data requests and data protection
WO2023035559A1 (en) Method and apparatus for implementing relationship graph, electronic device and storage medium
CN113705363B (en) Method and system for identifying uplink signals of specific satellites
CN112598342B (en) Alarm control method, device, equipment and medium for data display equipment
KR20160103444A (en) Method for image processing and electronic device supporting thereof
KR102457006B1 (en) Apparatus and method for providing information of electronic device
CN111010449B (en) Image information output method, system, device, medium, and electronic apparatus
US10890988B2 (en) Hierarchical menu for application transition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210423