WO2019076076A1 - Analog camera, server, monitoring system, and data transmission and processing methods - Google Patents

Analog camera, server, monitoring system, and data transmission and processing methods

Info

Publication number
WO2019076076A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
image
video stream
target
stream data
Prior art date
Application number
PCT/CN2018/092146
Other languages
English (en)
French (fr)
Inventor
陈黎明 (Chen Liming)
顾昕宇 (Gu Xinyu)
Original Assignee
杭州海康威视数字技术股份有限公司 (Hangzhou Hikvision Digital Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201710985838.4A external-priority patent/CN109698933B/zh
Priority claimed from CN201710984275.7A external-priority patent/CN109698932B/zh
Priority claimed from CN201710985839.9A external-priority patent/CN109698895A/zh
Priority claimed from CN201710985130.9A external-priority patent/CN109698900B/zh
Priority claimed from CN201721357044.5U external-priority patent/CN207766402U/zh
Priority claimed from CN201710985836.5A external-priority patent/CN109698923B/zh
Application filed by Hangzhou Hikvision Digital Technology Co., Ltd. (杭州海康威视数字技术股份有限公司)
Publication of WO2019076076A1 publication Critical patent/WO2019076076A1/zh

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present application relates to the field of video surveillance technologies, and in particular, to an analog camera, a server, a monitoring system, and a data transmission and processing method.
  • an analog camera is usually provided to collect images of the scene, so that abnormal events such as robbery, traffic accidents, and the like can be processed in time.
  • the existing analog camera only has an image acquisition function, and does not have an image analysis function.
  • the analog camera can only send the collected image to the server, and the server performs analysis and processing.
  • the application provides an analog camera, a server, a monitoring system, and a data transmission and processing method to implement analysis and processing on the collected image.
  • an embodiment of the present application provides an analog camera, including: an image acquisition chip, an image analysis chip, an integrated chip, and a transmitting chip, where the image acquisition chip is connected to both the image analysis chip and the integrated chip, the image analysis chip is connected to the integrated chip, and the integrated chip is connected to the transmitting chip;
  • the image acquisition chip is configured to collect digital video stream data
  • the image analysis chip is configured to analyze the digital video stream data, identify a target existing in the digital video stream data, extract attribute and/or location information of the target, and send the attribute and/or location information of the target to the integrated chip;
  • the integrated chip is configured to insert the attribute and/or location information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data, and send the mixed data to the transmitting chip;
  • the transmitting chip is configured to convert the mixed data or the digital video stream data in the mixed data into analog data to obtain converted mixed data; and send the converted mixed data.
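The four-chip dataflow described above (acquire, analyze, insert, convert-and-send) can be sketched as a minimal simulation. All names, data shapes, and the sample target are invented for illustration; they are not from the patent:

```python
from dataclasses import dataclass

@dataclass
class Target:
    # Illustrative attribute/location payload for one detected target
    attributes: dict
    location: tuple  # (x, y, w, h) within the frame

def image_analysis_chip(frames):
    """Toy stand-in for the analysis chip: one detected target per frame."""
    return [Target(attributes={"type": "face"}, location=(0, 0, 32, 32))
            for _ in frames]

def integrated_chip(frames, targets):
    """Pair target data with the video stream to form 'mixed data'."""
    return list(zip(frames, targets))

def transmitting_chip(mixed):
    """Stand-in for digital-to-analog conversion of the mixed data."""
    return [("analog", frame, target) for frame, target in mixed]

frames = ["frame0", "frame1"]          # image acquisition chip output
targets = image_analysis_chip(frames)  # analysis chip output
mixed = integrated_chip(frames, targets)
converted = transmitting_chip(mixed)
```

The point of the sketch is the wiring: the acquisition output feeds both the analysis chip and the integrated chip, and only the integrated chip's mixed output reaches the transmitting stage.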
  • the image acquisition chip is further configured to: copy the digital video stream data to obtain two copies of the digital video stream data; send one copy to the image analysis chip and the other copy to the integrated chip;
  • the integrated chip can be specifically used to:
  • the image acquisition chip is further configured to send the digital video stream data to the integrated chip;
  • the integrated chip is further configured to: capture an image in the digital video stream data, and send the captured image to the image analysis chip;
  • the image analysis chip is specifically configured to:
  • Receiving the image sent by the integrated chip; analyzing the image, identifying a target existing in the image, and extracting attribute and/or location information of the target; and sending the attribute and/or location information of the target to the integrated chip.
  • the integrated chip is further configured to: compress the captured image to obtain a compressed image; and insert the compressed image and the attribute and/or location information of the target into the digital video stream data according to the preset insertion mode to obtain mixed data.
  • the integrated chip can also be used to:
  • the sending chip is specifically configured to:
  • the integrated chip comprises a plurality of integrally arranged chips, and the integrated chip comprises the transmitting chip.
  • the integrated chip comprises a plurality of integrally arranged chips, and the integrated chip comprises an image processing chip and an insertion chip;
  • the image processing chip is configured to perform color and/or brightness processing on the digital video stream data
  • the inserting chip is configured to insert the attribute and/or position information of the target into the processed digital video stream data according to a preset insertion mode to obtain mixed data.
  • the embodiment of the present application further provides a data transmission method, which is applied to an analog camera, and the method includes:
  • the method may further include:
  • the inserting the attribute and/or the location information of the target into the digital video stream data according to the preset insertion mode to obtain the mixed data may include:
  • the captured image, the attribute of the target, and/or the location information are inserted into the digital video stream data according to a preset insertion mode to obtain mixed data.
  • the method may further include:
  • Inserting, by the preset insertion mode, the captured image, the attribute of the target, and/or the location information into the digital video stream data to obtain mixed data including:
  • the compressed image, the attribute of the target, and/or the location information are inserted into the digital video stream data according to a preset insertion mode to obtain mixed data.
  • the method further includes:
  • the analyzing the digital video stream data includes:
  • Capturing an image carrying the target from the digital video stream data includes:
  • An image carrying the target is captured in another digital video stream data of the two digital video stream data.
  • the attribute and/or location information of the target is inserted into the digital video stream data according to a preset insertion mode to obtain mixed data, including:
  • the attribute and/or location information of the target is inserted into the blanking area of the digital video stream data to obtain the mixed data.
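The blanking-area mode above can be illustrated with a toy frame model. The frame layout, field names, and JSON serialization below are invented for the sketch; the patent does not specify an encoding, and real analog frames interleave blanking intervals per line and field:

```python
import json

def insert_into_blanking(frame: dict, target_info: dict) -> dict:
    """Place serialized target data in the frame's otherwise-unused
    blanking area, leaving the active (pixel) payload untouched."""
    payload = json.dumps(target_info).encode("utf-8")
    mixed = dict(frame)
    mixed["blanking"] = payload
    return mixed

def extract_from_blanking(mixed: dict) -> dict:
    """Receiver side: recover the target data without touching pixels."""
    return json.loads(mixed["blanking"].decode("utf-8"))

frame = {"active": b"\x00" * 16, "blanking": b""}
mixed = insert_into_blanking(frame, {"plate": "ABC123", "box": [4, 4, 8, 8]})
assert mixed["active"] == frame["active"]  # video payload unchanged
```

The design point is that the blanking interval carries no picture content, so metadata can ride there without displacing video stream data.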
  • the converting the mixed data or the digital video stream data in the mixed data into analog data to obtain the converted mixed data includes:
  • the data of the preset format is digital-to-analog converted to obtain the converted mixed data.
  • the converting the mixed data or the digital video stream data in the mixed data into analog data to obtain the converted mixed data includes:
  • Image analysis data in the blended data is converted to low frequency digital data, the image analysis data including attribute and/or location information of the target.
  • the sending the converted mixed data includes:
  • the second image frame is an image frame that carries video stream data and does not carry target data, where the target data includes attribute and/or location information of the target;
  • the sending the converted mixed data includes:
  • the target data includes attribute and/or location information of the target
  • the target data is used as data carried in the effective image area of the first image frame, and the target data is sent by using a data transmission manner of the effective image area;
  • sending a second image frame, where the second image frame is an image frame carrying video stream data and not carrying target data, and wherein the first image frame and the second image frame occupy the same transmission channel.
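The first-/second-frame arrangement can be sketched as interleaving two frame kinds on one channel. The frame records and the `interval` parameter are invented for the sketch; the patent does not fix how often a target-data frame appears:

```python
def build_stream(video_frames, target_payloads, interval=4):
    """Emit a first-type frame (target data carried in its effective
    image area) after every `interval` second-type (video-only) frames.
    Both kinds share the same ordered stream, i.e. the same channel."""
    stream, payloads = [], iter(target_payloads)
    for i, vf in enumerate(video_frames, start=1):
        stream.append({"kind": "second", "video": vf})
        if i % interval == 0:
            payload = next(payloads, None)
            if payload is not None:
                stream.append({"kind": "first", "target_data": payload})
    return stream

stream = build_stream([f"v{i}" for i in range(8)], ["t0", "t1"])
```

Because both frame kinds travel in one ordered sequence, no second cable or channel is needed for the target data.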
  • before the sending of the converted mixed data, the method further includes:
  • the coaxial data is used as the data carried in the blanking area of the first type of image frame in which the target data is located, and the coaxial data is transmitted by using a data transmission manner of the blanking area.
  • the coaxial data further includes:
  • the data in the effective image area of the image frame is the coaxial data identification of the target data.
  • the sending the converted mixed data includes:
  • before the sending of the converted mixed data, the method further includes:
  • the coaxial data is used as data of the image frame in which the video stream data and the target data are located, and the coaxial data is transmitted according to the third position by using a data transmission manner of a blanking area.
  • the sending the converted mixed data includes:
  • the target data is transmitted in accordance with the second location.
  • the embodiment of the present application further provides a data processing method, which is applied to a server in a monitoring system, where the monitoring system further includes an analog camera, and the analog camera is coaxially connected with the server;
  • the method includes:
  • the separated video stream data and the image analysis data are separately processed.
  • the method may further include:
  • determining whether the to-be-processed analog camera has an image analysis function; if yes, performing the step of separating the video stream data and the image analysis data from the received data;
  • analyzing the received video stream data to obtain image analysis data, and separately processing the received video stream data and the analyzed image analysis data; wherein the image analysis data includes attribute and/or location information of the target, and the target is a target existing in the video stream data.
  • before the receiving of the data sent by the analog camera to be processed, the method further includes:
  • Determining whether the to-be-processed analog camera has an image analysis function includes:
  • the separating of the video stream data and the image analysis data from the received data includes:
  • the data in the image area is read, and the video stream data is extracted.
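The server-side separation step can be sketched as follows. The dict-based frame model and key names are invented for illustration; in the patent the analysis data would be located by its known position (e.g. the blanking area), not by a dictionary key:

```python
def separate(mixed_frames):
    """Split received mixed data into video stream data and image
    analysis data, which are then processed separately."""
    video, analysis = [], []
    for frame in mixed_frames:
        video.append(frame["active"])        # read the image-area data
        if frame.get("analysis") is not None:
            analysis.append(frame["analysis"])
    return video, analysis

received = [
    {"active": "frame0", "analysis": {"box": [1, 2, 3, 4]}},
    {"active": "frame1", "analysis": None},  # video-only frame
]
video, analysis = separate(received)
```

After separation, the video stream can go to display or storage while the analysis data goes to alarm or retrieval logic, matching the "separately processed" wording above.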
  • an embodiment of the present application further provides an analog camera, including a processor and a memory;
  • a memory for storing a computer program
  • the processor, when executing the program stored in the memory, implements any of the above data transmission methods applied to the analog camera side.
  • the embodiment of the present application further provides a server, including a processor and a memory;
  • a memory for storing a computer program
  • the processor, when executing the program stored in the memory, implements any of the above data processing methods applied to the server side.
  • the embodiment of the present application further provides a monitoring system, including: any of the foregoing analog cameras and servers, where
  • the analog camera transmits the converted mixed data to the server.
  • the server includes: a sensing signal receiving chip, an image signal receiving chip, and a signal processing chip, wherein the sensing signal receiving chip and the image signal receiving chip are respectively connected to the signal processing chip;
  • the sensing signal receiving chip is configured to receive a sensing signal sent by the wireless sensor, and send the sensing signal to the signal processing chip;
  • the image signal receiving chip is configured to receive an image signal sent by an analog camera, and send the image signal to the signal processing chip;
  • the signal processing chip is configured to perform correlation processing on the sensing signal and the image signal.
  • the analog camera includes an image analysis chip; the image analysis chip analyzes the collected digital video stream data, identifies a target existing in the digital video stream data, extracts attribute and/or location information of the target, and sends the attribute and/or location information of the target to the integrated chip; the integrated chip inserts the attribute and/or location information of the target into the digital video stream data according to the preset insertion mode to obtain mixed data. It can be seen that the analog camera in this solution realizes analysis and processing of the collected images.
  • FIG. 1 is a schematic diagram of a first structure of an analog camera provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a second structure of an analog camera provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a third structure of an analog camera provided by an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of a data processing method applied to an analog camera side according to an embodiment of the present application
  • FIG. 5 is a schematic diagram of a first flow of a data transmission method applied to an analog camera side according to an embodiment of the present application
  • FIG. 6 is a schematic diagram of a first structure of an image frame
  • FIGS. 7a to 7d are schematic diagrams showing several arrangement forms of two image frames provided by an embodiment of the present application.
  • FIG. 7e is a schematic diagram of a transmission frame for two image frames according to an embodiment of the present application.
  • 7f is a schematic diagram of a method for transmitting video stream data and target data
  • FIG. 8 is a second schematic flowchart of a data transmission method applied to an analog camera side according to an embodiment of the present disclosure
  • FIG. 9 is a schematic diagram of a second structure of an image frame according to an embodiment of the present disclosure.
  • FIG. 9b is a schematic diagram of a transmission frame for an image frame according to an embodiment of the present application.
  • FIG. 10 is a third schematic flowchart of a data transmission method applied to an analog camera side according to an embodiment of the present disclosure.
  • FIG. 11 is a schematic diagram of a third structure of an image frame according to an embodiment of the present application.
  • FIG. 12 is a first schematic flowchart of a data processing method applied to a server side according to an embodiment of the present disclosure
  • FIG. 13 is a schematic diagram of a second process of a data processing method applied to a server side according to an embodiment of the present disclosure
  • FIG. 14 is a third schematic flowchart of a data processing method applied to a server side according to an embodiment of the present disclosure
  • FIG. 15 is a schematic structural diagram of a monitoring system according to an embodiment of the present application.
  • FIG. 16 is a schematic diagram of a first structure of a server according to an embodiment of the present application.
  • FIG. 17 is a schematic diagram of a second structure of a server according to an embodiment of the present application.
  • FIG. 18 is a schematic diagram of a third structure of a server according to an embodiment of the present application.
  • FIG. 19 is a schematic diagram of a fourth structure of a server according to an embodiment of the present application.
  • FIG. 20 is a schematic diagram of a fifth structure of a server according to an embodiment of the present application.
  • FIG. 21 is a schematic diagram of a fourth structure of an analog camera according to an embodiment of the present application.
  • FIG. 22 is a schematic diagram of a sixth structure of a server according to an embodiment of the present disclosure.
  • the embodiment of the present application provides an analog camera, a server, a monitoring system, and a data transmission and processing method.
  • the following is a detailed description of an analog camera provided by an embodiment of the present application.
  • FIG. 1 is a schematic diagram of a first structure of an analog camera according to an embodiment of the present disclosure, including:
  • an image acquisition chip 100, an image analysis chip 200, an integrated chip 300, and a transmitting chip 400, where the image acquisition chip 100 is connected to both the image analysis chip 200 and the integrated chip 300, the image analysis chip 200 is connected to the integrated chip 300, and the integrated chip 300 is connected to the transmitting chip 400;
  • the image acquisition chip 100 is configured to collect digital video stream data
  • the image analysis chip 200 is configured to analyze the digital video stream data, identify a target existing in the digital video stream data, extract attribute and/or location information of the target, and set the attribute of the target and/or Or location information is sent to the integrated chip 300;
  • the integrated chip 300 is configured to insert the attribute and/or location information of the target into the digital video stream data according to a preset insertion mode, to obtain mixed data, and send the mixed data to the transmitting chip 400;
  • the transmitting chip 400 is configured to convert the mixed data or the digital video stream data in the mixed data into analog data to obtain converted mixed data; and send the converted mixed data.
  • the analog camera includes an image analysis chip, and the image analysis chip analyzes the collected digital video stream data, identifies the target existing in the digital video stream data, and extracts the attribute of the target and/or Position information; and sending the attribute and/or location information of the target to the integrated chip, and the integrated chip inserts the attribute and/or position information of the target into the digital video stream data according to the preset insertion mode to obtain mixed data;
  • the analog camera in the scheme realizes the analysis and processing of the acquired image.
  • the image acquisition chip 100 converts the optical signal into an image digital signal to obtain digital video stream data.
  • the image acquisition chip 100 may be an image sensor or another component; this is not limited herein.
  • the image capture chip 100 may copy the digital video stream data to obtain two pieces of digital video stream data, one of which is sent to the image analysis chip 200, and the other is sent to the integrated chip 300.
  • the image analysis chip 200 analyzes the received digital video stream data and analyzes and identifies the target existing in the digital video stream data.
  • For example, face recognition may be used to identify face areas in the digital video stream data, or license plate recognition may be used to identify license plates in the digital video stream data; the specific analysis manner is not limited.
  • the attributes of the target such as face features, license plate numbers, etc., may be extracted, or the position information of the target in the video image may be determined.
  • the attribute and/or position information of the target analyzed by the image analysis chip 200 is referred to as image analysis data.
  • the image analysis chip 200 transmits the image analysis data to the integrated chip 300.
  • the integrated chip 300 receives both the digital video stream data sent by the image acquisition chip 100 and the image analysis data sent by the image analysis chip 200.
  • the integrated chip 300 inserts the image analysis data into the digital video stream data according to a preset insertion mode to obtain mixed data.
  • There are various preset insertion modes, such as inserting the image analysis data into the blanking area of the digital video stream data; or inserting the digital video stream data in the first half and the image analysis data in the second half; or inserting one frame of image analysis data after every n frames of digital video stream data; the specific mode is not limited.
  • the integrated chip 300 may also capture an image carrying the target from the digital video stream data; the target is the same target as the target pointed to by the attribute and/or location information in the image analysis data.
  • the integrated chip 300 and the image analysis chip 200 can simultaneously process the digital video stream data, such that the image captured by the integrated chip 300 and the image analyzed by the image analysis chip 200 are the same image; therefore, the target recognized by the image analysis chip 200 exists in the image captured by the integrated chip 300.
  • the attribute and/or position information of the target analyzed by the image analysis chip 200 and the image captured by the integrated chip 300 are referred to as image analysis data.
  • the integrated chip 300 inserts the image analysis data into the digital video stream data according to the preset insertion mode, obtains the mixed data, and transmits the mixed data to the transmitting chip 400.
  • the mixed data includes not only the attribute and/or location information of the target but also the image carrying the target, and the information associated with the target carried in the mixed data is more abundant.
  • the image may be compressed to obtain a compressed image, and the compressed image and the attribute and/or location information of the target are inserted into the digital video stream data according to a preset insertion mode to obtain mixed data, and the mixed data is sent to the transmitting chip 400.
  • Because the image is compressed, the mixed data occupies less space, and when the analog camera transmits the mixed data, less data bandwidth is occupied.
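The space saving from compressing the snapshot before insertion can be illustrated with a generic lossless codec. The patent does not specify a codec; zlib and the synthetic "image" bytes below are stand-ins used purely for illustration:

```python
import zlib

# 16 KiB of repetitive synthetic "image" data (illustrative only)
snapshot = bytes(range(256)) * 64

# Compress before inserting into the mixed data: the smaller payload
# means less space in the stream and less transmission bandwidth.
compressed = zlib.compress(snapshot, level=6)
restored = zlib.decompress(compressed)  # receiver recovers the image
```

Lossless compression keeps the captured target image intact for server-side use, while the bandwidth saved scales with how compressible the snapshot is.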
  • As another implementation manner, instead of "the image acquisition chip 100 copies the digital video stream data to obtain two copies, sends one to the image analysis chip 200, and sends the other to the integrated chip 300", the image acquisition chip may not copy the digital video stream data, and may send only one copy of the digital video stream data to the integrated chip 300.
  • the integrated chip 300 can capture an image in the digital video stream data, and send the captured image to the image analysis chip 200; the image analysis chip 200 receives the image, analyzes the image, and identifies the image. A target existing in the image, extracting attribute and/or location information of the target; and transmitting the attribute and/or location information of the target to the integrated chip 300.
  • the integrated chip 300 may insert the attribute and/or location information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data; or the integrated chip 300 may insert the captured image and the attribute and/or location information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data.
  • the captured image may be compressed to obtain a compressed image, and the compressed image and the attribute and/or location information of the target are inserted into the digital video stream data according to a preset insertion mode to obtain mixed data.
  • the mixed data takes up less space, and when the analog camera transmits mixed data, the occupied data bandwidth is smaller.
  • the data inserted by the integrated chip in the digital video stream data may be different.
  • In some embodiments, only the attribute and/or location information of the target is inserted; in other embodiments, the attribute and/or location information of the target and the captured image are inserted; in still other embodiments, the attribute and/or location information of the target and the compressed image are inserted. For ease of description, these inserted data may be referred to as image analysis data.
  • There are various ways to insert the image analysis data into the digital video stream data, such as inserting the image analysis data into the blanking area of the digital video stream data; or inserting the digital video stream data in the first field and the image analysis data in the second field; or inserting one frame of image analysis data after every n frames of digital video stream data; the specific way is not limited.
  • the integrated chip may further arrange the image analysis data inserted into the blanking area according to a preset arrangement manner to obtain the arranged data;
  • the arranged data is converted into data in a preset format;
  • the digital video stream data is converted into data in the same format.
  • For example, the image analysis data may be structurally arranged: information such as the data size and data type of the image analysis data is determined and added as header information to the front of the image analysis data; the image analysis data with the header information added is the arranged data. There are many ways to perform the structured arrangement, and the specifics are not limited.
  • The arranged data is then converted into data in the BT.1120 format, and the digital video stream data is also converted into data in the BT.1120 format.
  • the two types of data may be converted into data in other formats, which are not limited.
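The structured arrangement described above (size and type prepended as a header) can be sketched as a simple binary header. The header layout, marker byte, and type code below are invented for illustration; the patent does not fix a concrete format:

```python
import struct

MAGIC = 0xA5          # hypothetical marker byte for arranged data
TYPE_ANALYSIS = 1     # hypothetical data-type code

def arrange(payload: bytes, data_type: int = TYPE_ANALYSIS) -> bytes:
    """Prepend header info (marker, data type, payload size) to the
    image analysis data, yielding the 'arranged data'."""
    return struct.pack(">BBI", MAGIC, data_type, len(payload)) + payload

def parse(arranged: bytes):
    """Recover (data_type, payload) from arranged data."""
    magic, data_type, size = struct.unpack(">BBI", arranged[:6])
    assert magic == MAGIC, "not arranged data"
    return data_type, arranged[6:6 + size]

arranged = arrange(b'{"plate":"ABC123"}')
dtype, payload = parse(arranged)
```

A fixed-size header like this lets the receiver locate and delimit the analysis payload inside the stream without guessing its length.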
  • the transmitting chip 400 may convert the mixed data into analog mixed data and transmit the analog mixed data.
  • That is, both the digital video stream data and the image analysis data are digital-to-analog converted, and the converted analog data is transmitted.
  • the transmitting chip 400 may convert the digital video stream data in the mixed data into analog digital video stream data; convert the image analysis data in the mixed data into low frequency digital data; and send the analog digital video stream data and The low frequency digital data, the image analysis data including attribute and/or location information of the target.
  • the image analysis data may include: "attribute and/or location information of the target", or "attribute and/or location information of the target, and the captured image", or "attribute and/or location information of the target, and the compressed image".
  • the digital video stream data is digital-to-analog converted, and the image analysis data is not digital-to-analog converted, so that data loss caused by digital-to-analog conversion and analog-to-digital conversion of the image analysis data can be avoided, or Other unknown errors.
  • the image analysis data is converted into low-frequency digital data for transmission, and the energy consumption is small.
  • the integrated chip 300 may include a plurality of integrally arranged chips. For example, the transmitting chip 400 may be integrally disposed in the integrated chip 300; that is, the integrated chip 300 includes the transmitting chip 400.
  • Alternatively, the transmitting chip 400 may be configured separately; this is not limited herein.
  • the integrated chip 300 includes an image processing chip 500 and an insertion chip 600.
  • An image processing chip 500 configured to perform color and/or brightness processing on the digital video stream data
  • the insertion chip 600 is configured to insert the attribute and/or location information of the target into the processed digital video stream data according to a preset insertion mode to obtain mixed data.
  • the image processing chip 500 may be an ISP (Image Signal Processing) chip, or may be an MCU (Microcontroller Unit).
  • the image processing chip 500 performs color and/or brightness processing on the digital video stream data, which may include: AEC (Automatic Exposure Control), AGC (Automatic Gain Control), AWB (Automatic White Balance), color correction, and the like; this is not limited.
  • Other chips can also be integrated in the integrated chip; this is not limited. By integrating multiple chips, the space occupied inside the analog camera can be reduced, making the analog camera more compact and smaller.
  • the embodiment of the present application further provides a data processing method on the analog camera side. As shown in FIG. 4, the method includes:
  • S401 Collect digital video stream data.
  • S402 Analyze the digital video stream data, identify a target existing in the digital video stream data, and extract attribute and/or location information of the target.
  • S403 Insert the attribute and/or location information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data.
  • S404 Convert the mixed data or the digital video stream data in the mixed data into analog data to obtain converted mixed data.
  • S405 Send the converted mixed data.
  • the analog camera analyzes the collected digital video stream data, identifies the target existing in the digital video stream data, and extracts the attribute and/or position information of the target; according to the preset insertion mode, The attribute and/or position information of the target is inserted into the digital video stream data to obtain mixed data; it can be seen that the analog camera in the solution realizes the analysis and processing of the collected image.
  • The embodiment shown in FIG. 4 is described in detail below:
  • S401 Collect digital video stream data.
  • An analog camera is usually provided with an image acquisition chip, and the image acquisition chip can convert the optical signal into an image digital signal to obtain digital video stream data.
  • S402 Analyze the digital video stream data, identify a target existing in the digital video stream data, and extract attribute and/or location information of the target.
  • For example, face recognition may be used to identify face areas in the digital video stream data, or license plate recognition may be used to identify license plates in the digital video stream data; the specific analysis manner is not limited.
  • the attributes of the target such as face features, license plate numbers, etc., may be extracted, or the position information of the target in the video image may be determined.
  • S403 Insert the attribute and/or location information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data.
• An image carrying the target may also be captured from the digital video stream data; this target is the same target as the one pointed to by the attribute and/or position information in the image analysis data.
  • the collected digital video stream data can be copied to obtain two pieces of digital video stream data.
  • S402 includes: analyzing one of the two digital video stream data to identify a target existing in the digital video stream data, and extracting attribute and/or location information of the target.
  • an image carrying the target is captured in another digital video stream data of the two digital video stream data.
  • S403 includes: inserting the captured image, the attribute of the target, and/or location information into the digital video stream data according to a preset insertion mode to obtain mixed data.
  • the mixed data includes not only the attribute and/or location information of the target but also the image carrying the target, and the information associated with the target carried in the mixed data is richer.
  • the captured image may be compressed to obtain a compressed image.
• S403 includes: inserting, according to the preset insertion mode, the compressed image and the attribute and/or location information of the target into the digital video stream data to obtain mixed data.
  • the image is compressed, and the mixed data occupies less space.
  • the analog camera transmits mixed data, the occupied data bandwidth is smaller.
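The compression step above can be sketched as follows. zlib is used purely for illustration, since the text does not name a codec, and the stand-in pixel buffer is an assumption:

```python
# Illustrative sketch of compressing the captured target image before
# insertion, so the mixed data occupies less space and less bandwidth.
# zlib and the stand-in pixel buffer are assumptions, not the patent's codec.
import zlib

def compress_image(raw: bytes) -> bytes:
    # Level 6 is zlib's default trade-off between speed and ratio.
    return zlib.compress(raw, level=6)

raw = b'\x80' * 10_000   # stand-in for raw target-image pixels
small = compress_image(raw)
```

A highly uniform buffer like this compresses well; real target images would more plausibly use an image codec such as JPEG, which the sketch does not attempt to model.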
• The "target attribute and/or position information" obtained by the above analysis, or the "target attribute and/or position information, and captured image", or the "target attribute and/or position information, and compressed image", is referred to as image analysis data. In different implementations, the content of the image analysis data therefore differs.
  • S403 includes: inserting image analysis data into the digital video stream data according to a preset insertion mode to obtain mixed data.
• There are various insertion modes, such as inserting the image analysis data into the blanking area of the digital video stream data; placing the digital video stream data in the first half and the image analysis data in the second half; or inserting one frame of image analysis data after every n frames of digital video stream data. The specific insertion mode is not limited.
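A minimal sketch of the first insertion mode named above, placing image analysis data into the blanking area. The frame representation, field names, and 256-byte capacity are illustrative assumptions:

```python
# Hypothetical model of one insertion mode: image analysis data is carried
# in the frame's blanking area while video data stays in the active area.

def insert_into_blanking(frame: dict, analysis: bytes, capacity: int = 256):
    """Return the mixed frame plus any analysis bytes that did not fit
    (which would spill into the next frame's blanking area)."""
    fits, rest = analysis[:capacity], analysis[capacity:]
    mixed = dict(frame)          # leave the original frame untouched
    mixed['blanking'] = fits
    return mixed, rest

frame = {'active': b'<video pixels>', 'blanking': b''}
mixed, leftover = insert_into_blanking(frame, b'plate=ABC123;pos=10,20')
```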
  • S404 Convert the mixed data or the digital video stream data in the mixed data into analog data to obtain converted mixed data.
• Digital video stream data is usually converted into analog data for transmission.
• Image analysis data can either be converted into analog data for transmission, or be transmitted as digital data without digital-to-analog conversion. Therefore, in this embodiment, either only the digital video stream data in the mixed data is converted into analog data, or all of the mixed data is converted into analog data.
  • S404 may include: converting digital video stream data in the mixed data into analog video stream data; and converting image analysis data in the mixed data into low frequency digital data.
• The image analysis data is, as above, the "target attribute and/or position information", or the "target attribute and/or position information, and captured image", or the "target attribute and/or position information, and compressed image".
• The digital video stream data is digital-to-analog converted while the image analysis data is not, so that data loss or other errors caused by digital-to-analog and subsequent analog-to-digital conversion of the image analysis data can be avoided.
  • the image analysis data is converted into low-frequency digital data for transmission, and the energy consumption is small.
  • S404 may include:
• The image analysis data may be structurally arranged: for example, information such as the data size and data type of the image analysis data may be determined and added as header information in front of the image analysis data; the image analysis data with the header information added is the arranged data. There are many ways of structural arrangement, and the specifics are not limited.
• The arranged data is then converted into data in the BT1120 format, and the digital video stream data is likewise converted into data in the BT1120 format.
  • the two types of data may be converted into data in other formats, which are not limited.
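The structural arrangement step can be sketched as below. The header layout (a 4-byte big-endian size plus a 1-byte type tag) is an assumption, since the text leaves the arrangement open:

```python
# Prepend header information (data size and data type) to the image
# analysis data, as described above. The concrete layout is illustrative.
import struct

TYPE_ATTRIBUTE = 0x01   # e.g. face features or a license plate number
TYPE_POSITION = 0x02    # target position within the video image

def arrange(analysis: bytes, data_type: int) -> bytes:
    header = struct.pack('>IB', len(analysis), data_type)
    return header + analysis

def parse(arranged: bytes):
    size, data_type = struct.unpack('>IB', arranged[:5])
    return data_type, arranged[5:5 + size]
```

A size field like this would let the receiver strip any padding added when the arranged data is fitted into fixed-width transport lines.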
  • S405 Send the converted mixed data.
  • the analog camera can transmit the converted mixed data obtained above to the connected server.
  • the embodiment of the present application further provides a plurality of data transmission methods for transmitting the converted mixed data.
  • the first data transmission method will be described in detail below through specific embodiments.
  • the first data transmission method includes the following steps: S501 to S502:
  • the second image frame is an image frame that carries video stream data and does not carry target data, and the target data includes attribute and/or location information of the target obtained above.
  • the image frame transmitted by the analog camera can be as shown in FIG. 6, and the image frame includes an effective image area and a blanking area.
• The first image frame and the second image frame are different types of image frames: the second image frame carries video stream data and does not carry target data, while the effective image area of the first image frame carries target data.
  • the effective image area can also be referred to as a data area.
  • the video stream data is also image data.
  • the analog camera can capture image data (video stream data) and send the captured image data to the server.
  • Coaxial data is notification information for interaction between the server and the analog camera.
  • This coaxial data can also be referred to as PTZ data.
• The coaxial data exchanged between the video generating end and the video receiving end may include coaxial transmitting data sent by the video generating end to the video receiving end, and coaxial receiving data sent by the video receiving end to the video generating end. For example, it may include shooting mode information sent by the analog camera to the DVR (Digital Video Recorder), information indicating readiness for upgrading, and handshake data exchanged between the analog camera and the DVR, such as data indicating device type and image resolution. The coaxial data can also include control information sent by the DVR to the analog camera, such as image parameter adjustment commands, camera aperture adjustment commands, camera rotation adjustment commands, resolution switching commands, and remote upgrade data commands.
  • the coaxial data may also include information such as the location of the image data in the image frame and the location of the target data in the image frame.
  • Coaxial data may or may not be included in the image frame.
  • the target data is used as data carried in the effective image area of the first image frame, and the target data is sent by using a data transmission manner of the effective image area; wherein the first image frame and the second image frame occupy The same transmission channel.
  • converted mixed data includes video stream data and target data, and therefore, transmission of "converted mixed data” is realized by S501 and S502.
  • the transmission channel may be a coaxial cable, a twisted pair cable or other transmission materials, which is not specifically limited in this application.
  • the same transmission channel can be understood as the same line.
  • the first image frame and the second image frame may be alternately transmitted between the analog camera and the server.
  • the effective image area of the first type of image frame is used to transmit target data
  • the effective image area of the second type of image frame is used to transmit image data, that is, video stream data. In this way, the transmission of the video stream data and the target data can be realized through the same transmission channel.
  • Coaxial cable or twisted pair cable can be used to transmit video stream data between the analog camera and the server. In order to transmit the target data while avoiding additional wiring, it can be transmitted through the same transmission channel as used when transmitting the video stream data.
• In this embodiment, the target data to be sent is obtained and, using the same transmission channel as is used for the video stream data, the target data is carried in the effective image area of the first image frame and sent by the data transmission mode of the effective image area.
  • the type of the first type of image frame is different from the type of the second type of image frame that carries the video stream data and does not carry the target data. Therefore, the solution provided by the embodiment can enable the analog camera to transmit the target data to the server through the same transmission channel used when transmitting the video stream data, without additional wiring, thereby saving equipment cost.
  • the blanking area of the first image frame can also be used to transmit target data.
  • the remaining target data can be transmitted through a partial position in the blanking area.
  • the partial position may be a position in the blanking area other than the position for transmitting the coaxial data.
  • the first type of image frame and the second type of image frame may be transmitted at intervals according to a preset regularity, or may be randomly transmitted. See the following examples for details.
• S502 may include: determining whether a preset number of second image frames have been continuously transmitted; when it is determined that they have, using the target data as the data carried in the effective image area of the first image frame and transmitting it by the data transmission mode of the effective image area, wherein the first image frame and the second image frame occupy the same transmission channel.
  • the number of the first type of image frames used for transmitting the target data may be preset or may be determined according to the amount of data of the target data.
  • the number of the first type of image frames used to transmit the target data may be one or more.
  • a preset number of second image frames may also be continuously transmitted.
• For example, after every N second image frames carrying image data are sent, M first image frames carrying target data may be sent, followed again by N second image frames carrying image data, and so on.
  • the value of N may or may not be fixed, and the value of M may or may not be fixed.
  • N is a fixed value and M is an unfixed value.
  • M may be a quantity obtained by dividing the data amount of the target data by the data amount of the effective image area of the single first image frame.
• For example, if the effective image area of a single first image frame can carry 2000 lines of data and each line can carry 100 bytes, the effective image area of a single first-type image frame can carry about 0.2 MB of data.
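With the figures above, and assuming M is obtained by rounding up so that a partially filled frame still counts, M can be computed as:

```python
# Worked example: M first-type frames are needed to carry the target data,
# where each frame's effective image area holds 2000 lines x 100 bytes.
import math

LINES_PER_FRAME = 2000
BYTES_PER_LINE = 100
FRAME_CAPACITY = LINES_PER_FRAME * BYTES_PER_LINE   # 200 000 B, about 0.2 MB

def frames_needed(target_data_len: int) -> int:
    return math.ceil(target_data_len / FRAME_CAPACITY)
```

For instance, 0.45 MB of target data would need three first-type frames under these assumed figures.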
  • FIG. 7a is a schematic diagram of an arrangement of the first image frame and the second image frame at the time of transmission.
  • the open rectangle represents the second image frame
  • the solid rectangle represents the first image frame.
  • the value of N is a fixed 5 frames
  • the value of M is not fixed, and may be 2, 1, 4, 1, 3, etc.
  • N is a fixed value and M is a fixed value.
  • the image data and the target data are transmitted at regular intervals.
  • the first image frame and the second image frame transmitted in this manner are evenly arranged, and the smoothness of the image preview can be ensured when the server receives the image data.
  • FIG. 7b is another schematic diagram of the arrangement of the first image frame and the second image frame at the time of transmission.
  • the open rectangle represents the second image frame
  • the solid rectangle represents the first image frame.
  • the value of N is a fixed 5 frames
  • the value of M is a fixed 2 frames.
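The fixed-interval arrangement of FIG. 7b (N = 5, M = 2) can be sketched as a repeating pattern; the 'V'/'T' frame labels are symbolic stand-ins for the two frame types:

```python
# Generate the frame order for fixed N (second-type, video) and fixed M
# (first-type, target): N video frames, then M target frames, repeating.

def interleave(n_video: int, m_target: int, total: int):
    pattern = ['V'] * n_video + ['T'] * m_target
    for i in range(total):
        yield pattern[i % len(pattern)]

order = ''.join(interleave(5, 2, 14))   # 'VVVVVTTVVVVVTT'
```

The even spacing this produces is what preserves image-preview smoothness on the server side.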
  • the number N of the first image frame and the number M of the second image frame may both be random, that is, N is an unfixed value, and M is also an unfixed value.
  • the time at which the target data is acquired may be not fixed, and the amount of data of the target data acquired each time may also be unfixed.
  • the arrangement of the two image frames can be disorganized.
  • FIG. 7c is another schematic diagram of the arrangement of the first image frame and the second image frame at the time of transmission.
  • the open rectangle represents the second image frame
  • the solid rectangle represents the first image frame. It can be seen that the values of N and M in Figure 7c are all random.
  • N is an unfixed value and M is a fixed value.
  • the time at which the target data is acquired may not be fixed, but the amount of data of the target data acquired each time may be fixed.
  • FIG. 7d is another schematic diagram of the arrangement of the first image frame and the second image frame at the time of transmission.
  • the open rectangle represents the second image frame
• the solid rectangle represents the first image frame. It can be seen that the value of N in FIG. 7d is not fixed, and the value of M is fixed at 2.
• The image frame may be identified as the first type of image frame by adding coaxial data to it: the coaxial data may include data indicating that the data in the effective image area of the image frame is the target data, so that the server can determine, from the received image frames, which are first-type image frames.
• Sending the converted mixed data may include:
  • the target data includes attribute and/or location information of the target
  • the target data is used as data carried in the effective image area of the first image frame, and the target data is sent by using a data transmission manner of the effective image area;
  • sending a second image frame where the second image frame is an image frame carrying video stream data and not carrying target data, wherein the first image frame and the second image frame occupy The same transmission channel.
  • the preset data amount threshold may be a preset value, for example, may be a maximum amount of data that can be stored in an effective image area of a single image frame, or may be a times the maximum amount of data, and a is a positive integer. Alternatively, the preset data amount threshold may also be any other value.
  • the target data can be transmitted with one first image frame at a time.
• If the preset data amount threshold is a times the maximum data amount, a first-type image frames may be transmitted continuously each time. If the value of a is greater than a specified value, then to ensure image fluency on the server, second image frames cannot be absent for too long; in this case, the first image frames can be transmitted interspersed with second image frames.
  • the location of the target data in the effective image area may be fixed or unfixed.
  • the position of the target data may be determined by the header identifier of the target data start position carried by the target data and the tail identifier indicating the position of the end of the target data.
  • the header identifier may be a first preset bit string
  • the tail identifier may be a second preset bit string.
• For example, the first predetermined number of bits at the start of the target data may be the first preset bit string, and the second predetermined number of bits at the end of the target data may be the second preset bit string.
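Locating target data by its header and tail identifiers might look like the sketch below; the concrete bit strings are illustrative assumptions, not values from the text:

```python
# Find target data in an effective image area using a first preset bit
# string (header identifier) and second preset bit string (tail identifier).

HEADER = b'\xaa\x55\xaa\x55'   # assumed first preset bit string
TAIL = b'\x55\xaa\x55\xaa'     # assumed second preset bit string

def extract_target_data(area: bytes):
    start = area.find(HEADER)
    if start < 0:
        return None                     # no target data in this frame
    end = area.find(TAIL, start + len(HEADER))
    if end < 0:
        return None                     # tail identifier missing: truncated
    return area[start + len(HEADER):end]

area = b'\x00' * 8 + HEADER + b'plate=ABC123' + TAIL + b'\x00' * 4
```

This is what allows the target data's position within the effective image area to be unfixed: the receiver searches for the markers instead of relying on an agreed offset.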
• Before transmitting the mixed data, the method may further include: acquiring coaxial data to be sent. The coaxial data, together with the target data, is then used as the data carried in the first image frame, with the coaxial data carried in the blanking area and transmitted by the data transmission mode of the blanking area.
  • Figure 7e is a transmission architecture diagram of an analog camera transmitting two image frames to a server.
  • the first image frame includes target data and coaxial data
  • the second image frame includes video stream data (image data) and coaxial data.
  • the coaxial data is optional, and the two image frames may not include coaxial data.
  • both the coaxial data and the target data can be sent to the server through the first image frame, which can make the data type of the transmission more abundant and the data transmission efficiency is higher.
  • the coaxial data may further include: a coaxial data identifier indicating that the data in the effective image area of the image frame is the target data.
  • the server can determine the data of the effective image area of the image frame as the target data according to the coaxial data identifier in the coaxial data.
  • the coaxial data may also include an identification that the data representing the effective image area of the image frame is not the target data. When the data of the effective image area is not the target data, the data of the effective image area may be image data, or other data or no data.
• That is, the coaxial data may carry either an identifier indicating that the data in the effective image area of the image frame is the target data, or an identifier indicating that it is not.
• When the server receives the image frame, the coaxial data can be acquired from it, and whether the data of the effective image area of the image frame is the target data is determined based on the data at the specified position of the coaxial data.
• When the first image frames and second image frames are irregularly arranged, that is, when the values of N and/or M are not fixed, the server can determine which image frames carry target data by checking whether the coaxial data contains the identifier indicating that the data in the effective image area is the target data: if it does, the image frame is considered a first-type image frame; otherwise, it is considered a second-type image frame.
  • the coaxial data identifier indicating that the data in the effective image area of the image frame is the target data may be implemented by using a specified identifier, or may be implemented by other methods, which is not specifically limited in this application.
  • the coaxial data may further include data indicating the position of the target data in the effective image area.
  • the target data may occupy the effective image area or may not occupy the effective image area.
  • the data indicating the position of the target data in the effective image area in the coaxial data may be the position of the effective image area.
  • the data indicating the position of the target data in the effective image area in the coaxial data may be the actual position of the target data.
  • the server can determine the location of the target data in the effective image area based on the data in the coaxial data, thereby obtaining the target data more accurately.
• The coaxial data in this embodiment may include a coaxial data identifier indicating that the data in the effective image area of the image frame is the target data; determining from this identifier whether an image frame belongs to the first type or the second type enables the server to obtain the target data more accurately.
• To ensure transmission of both image data and target data, the transmission parameters can be increased. For example, if the original transmission parameter is 2MP25 (which can be understood as transmitting 25 frames of 2-megapixel images per second) and suffices for the image data alone, it can be increased to 2MP30 to accommodate the target data as well.
  • a transmission method of image frames carrying video stream data and target data, respectively, is shown in Fig. 7f.
  • the embodiment of the present application further provides a second data transmission method for transmitting the converted mixed data.
  • the second data transmission method will be described in detail below through specific embodiments. As shown in FIG. 8, the method includes the following steps: S801 to S802:
• S801 Determine a first position of the video stream data in the effective image area of the image frame, and determine a second position of the target data in the effective image area of the image frame, the target data including attribute and/or location information of the target.
  • the target data is data different from the video stream data and the coaxial data.
  • the coaxial data is notification information for interaction between the server and the analog camera, and the coaxial data may also be referred to as PTZ data.
• The coaxial data exchanged between the video generating end and the video receiving end may include coaxial transmitting data sent by the video generating end to the video receiving end, and coaxial receiving data sent by the video receiving end to the video generating end. For example, it may include shooting mode information sent by the analog camera to the DVR (Digital Video Recorder), information indicating readiness for upgrading, and handshake data exchanged between the analog camera and the DVR, such as data indicating device type and image resolution. The coaxial data can also include control information sent by the DVR to the analog camera, such as image parameter adjustment commands, camera aperture adjustment commands, camera rotation adjustment commands, resolution switching commands, and remote upgrade data commands.
  • the coaxial data may also include information such as the location of the video stream data in the image frame and the location of the target data in the image frame.
  • Coaxial data may or may not be included in the image frame.
• S802 Use the video stream data and the target data as the data of the same image frame and, using the data transmission mode of the effective image area, send the video stream data according to the first position and the target data according to the second position.
  • converted mixed data includes video stream data and target data, and therefore, transmission of "converted mixed data” is realized by S801 and S802.
• When determining the first position of the video stream data in the effective image area of the image frame, a first preset position may be determined as the first position; the preset position includes a start position and an end position.
  • the first position of the video stream data in the effective image area of the image frame may also be determined based on the amount of data of the video stream data.
• In this case, the first position of the video stream data in the effective image area of the image frame may be determined as: from a first preset initial position to a first end position, where the first end position is the position obtained by adding the data amount of the video stream data to the first preset initial position. The first preset initial position and the first end position are both located in the effective image area.
  • the second preset position may be determined as the second position of the target data in the effective image area. It is also possible to determine the second position of the target data in the effective image area based on the amount of data of the target data.
  • the second position of the target data in the effective image area may be determined as: starting from the second preset initial position, to a second end position; wherein the second end position is: a position obtained by adding a data amount of the target data to the second preset initial position.
  • the second preset initial position and the second end position are both located in the effective image area.
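The position rule just described (end position = preset initial position + data amount) can be made concrete as below; the byte offsets and the effective-area size are assumptions:

```python
# Each region of the effective image area runs from a preset initial
# position to that position plus the data amount carried there.

def region(start: int, data_len: int, area_size: int):
    end = start + data_len
    if end > area_size:
        raise ValueError('region would exceed the effective image area')
    return start, end

AREA_SIZE = 1_000_000                                # assumed area size
video_region = region(0, 800_000, AREA_SIZE)         # first position
target_region = region(800_000, 50_000, AREA_SIZE)   # second position
```

Here the second region starts where the video region ends, matching the case where the target data occupies part of the effective image area outside the video stream data.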
  • the second position may be a fixed position of the effective image area of the image frame, or may be an unfixed position.
  • the second position of the target data in the effective image area may be a position other than the position of the video stream data in the effective image area.
  • the second location may be a location of a partial area other than the location of the video stream data in the effective image area, or may be a location of all areas except the location of the video stream data in the effective image area. No specific restrictions.
  • the video stream data is also image data
  • the analog camera can collect image data (video stream data) and send the collected image data to the server.
• The resolution specification of the image frames transmitted between the analog camera and the server can be raised, with the higher-specification transmission parameters used to carry the lower-resolution content.
  • the transmission parameters of 4MP30 can be used to transmit data originally transmitted using the transmission parameters of 2MP30.
  • the transmission parameter xMPy can be understood as transmitting y frames of x megapixels of data per second.
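The xMPy notation can be turned into a quick pixel-rate comparison, showing why 4MP30 has room left over when carrying 2MP30 content (raw pixel rates only; encoding overhead is ignored):

```python
# xMPy: transmit y frames of x megapixels of data per second.

def pixels_per_second(x_megapixels: float, y_fps: int) -> int:
    return int(x_megapixels * 1_000_000 * y_fps)

rate_2mp30 = pixels_per_second(2, 30)   # 60 000 000 pixels/s
rate_4mp30 = pixels_per_second(4, 30)   # 120 000 000 pixels/s
```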
  • the server and the analog camera may pre-arrange the second location, so that the server can acquire the target data according to the agreed second location, and improve the accuracy when acquiring the target data.
  • the target data may include a header identifier indicating a start position of the target data, and a tail identifier indicating a trailing position of the target data.
• The start position of the target data may be determined according to the foregoing header identifier, and the end position according to the tail identifier; the target data is then acquired from the effective image area according to the determined start and end positions, so that it can be accurately acquired even when the second position is not fixed. In this embodiment, the flexibility of transmitting target data is relatively high.
  • the header identifier may be a first preset bit string, and the tail identifier may be a second preset bit string.
  • the first first predetermined number of bits of the target data may be a first preset bit string, and the second second predetermined number of bits of the target data may be a second preset bit string.
• The amount of data in the effective image area is much larger than that in the blanking area, so if a part of the effective image area is assigned to the target data, target data of a large volume can be carried.
  • the video stream data and the target data are transmitted to the electronic device at the back end through the same image frame, and the above-mentioned "converted mixed data" can be transmitted through the same cable.
  • the data in the image frame may be transmitted in the form of a data stream row by row, or the data in the image frame may be transmitted in the form of a data stream column by column.
  • the embodiment determines the first position of the video stream data in the effective image area of the image frame, and determines the second position of the target data in the effective image area of the image frame, and uses the video stream data and the target data as The data of the same image frame is sent according to the first position, and the target data is transmitted according to the second position. Therefore, in this embodiment, the above-mentioned "converted mixed data" can be transmitted to the electronic device in the same image frame, and no additional wiring is required, thereby saving equipment cost.
  • the target data may be carried in all image frames, or the target data may be carried in part of the image frames.
  • the image frame carrying the video stream data and the target data in the effective image area may be transmitted to the server.
  • the image frame in which the effective image area carries the video stream data and the effective image area does not carry the target data may be transmitted to the server.
• Before the video stream data and the target data are sent, the method may further include the following steps 1 to 2:
  • Step 1 Obtain the coaxial data to be transmitted, and determine the third position of the coaxial data in the blanking area of the image frame.
  • the image frame includes video stream data, target data, and coaxial data.
  • Determining the coaxial data in the third position of the blanking area of the image frame may include: determining a third preset position in the blanking area of the image frame as the third position of the coaxial data. For example, the positions of the second to fourth lines preset in the blanking area may be determined as the third position.
  • the blanking area may further include a location of the coaxial data sent by the server to the analog camera, where the location may be a location different from the third location in the blanking zone.
  • the blanking area includes a field blanking area and a line blanking area.
• The third position may be located in the field blanking area, or in the line blanking area, or partly in the field blanking area and partly in the line blanking area.
• The amount of storable data in the field blanking area is greater than that in the line blanking area, so the third position can be determined from the field blanking area to increase the amount of coaxial data that can be stored.
  • the step of acquiring the coaxial data to be sent may include: acquiring coaxial data including the data indicating the second location to be transmitted.
  • the server and the analog camera can pre-agreed the third position, so that when the server acquires the coaxial data, it can be acquired according to the agreed third position.
  • the second location may be determined according to the data indicating the second location included in the coaxial data, and the target data is acquired from the second location, thereby improving accuracy when acquiring the target data.
  • the coaxial data may further include other data than the data indicating the second location, which is not specifically limited in the present application.
  • Step 2 The coaxial data is used as the data of the image frame corresponding to the video stream data and the target data, and the data transmission mode of the blanking area is adopted, and the coaxial data is transmitted according to the third position.
  • Fig. 9a is a schematic diagram of data corresponding to an effective image area and a blanking area of an image frame.
• the solid-line rectangular frame area in the middle is the effective image area
  • the effective image area is divided into two parts by a broken line, wherein one part is the area where the video stream data is located, and the other part is the area where the target data is located.
  • the portion outside the effective image area is the blanking area, and the blanking area is the area where the coaxial data is located.
  • Figure 9b is a transmission architecture diagram of an analog camera sending an image frame to a server.
  • one image frame includes video stream data, coaxial data and target data.
• In this embodiment, the coaxial data can be obtained and sent, together with the video stream data and the target data, as data of the same image frame, so that the server receives the video stream data, target data, and coaxial data through the same image frame, improving data transmission efficiency.
• the position of the data representing the second location in the blanking area of the image frame may be located before the second location.
• that is, the third position may be located before the second position, which ensures that the data indicating the second location precedes the second position in the image frame.
• the coaxial data may be located in the blanking area above the effective image area, so that the data indicating the second location, carried in the blanking area, precedes the second position in the effective image area.
• because the server acquires the data carried by an image frame from front to back, this lets the server first obtain the data representing the second location from the coaxial data, and then acquire the target data from the effective image area according to that data, thereby improving the efficiency of acquiring the target data.
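This front-to-back retrieval can be sketched as follows. The byte layout below (a pointer and length at the start of the coaxial data, 4 bytes each, big-endian) is an invented illustration, not the patent's actual encoding:

```python
def parse_frame(frame: bytes, coax_offset: int, coax_len: int) -> bytes:
    """Extract the target data using a position pointer carried in the coaxial data.

    Assumed layout: the first 4 bytes of the coaxial data give the target
    data's start offset, and the next 4 bytes give its length (big-endian).
    """
    coax = frame[coax_offset:coax_offset + coax_len]
    target_start = int.from_bytes(coax[0:4], "big")
    target_len = int.from_bytes(coax[4:8], "big")
    return frame[target_start:target_start + target_len]

# Toy frame: the "blanking area" pointer sits at offset 0, ahead of the
# "effective image area" payload at offset 100, so a front-to-back reader
# already knows where the target data is before reaching it.
target = b"face:x=10,y=20"
frame = bytearray(200)
frame[0:4] = (100).to_bytes(4, "big")
frame[4:8] = len(target).to_bytes(4, "big")
frame[100:100 + len(target)] = target

extracted = parse_frame(bytes(frame), coax_offset=0, coax_len=8)
```

Because the pointer precedes the payload, a streaming receiver never has to buffer the whole frame before locating the target data.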
  • video stream data may be superimposed on an effective image area in an analog signal manner, and coaxial data may be superimposed on a blanking area in a digital signal manner.
• images of each resolution specification have corresponding transmission parameters, and higher-resolution images use higher transmission parameters.
  • the transmission parameter can be 2MP30, which can be understood as an image of 2 million pixels transmitted 30 frames per second.
  • higher resolution transmission parameters can be used to transmit lower resolution images.
  • 4MP30 transmission parameters can be used to transmit image frames that could originally be transmitted with 2MP30 transmission parameters.
• in that case, a large free portion of the effective image area of the image frame can be used to transmit the target data.
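As a rough illustration, the free portion is simply the pixel-count difference between the transmission format and the carried image. The dimensions below (2688x1520 for a ~4 MP format, 1920x1080 for a ~2 MP image) are assumed example values:

```python
def free_pixels(tx_width: int, tx_height: int, img_width: int, img_height: int) -> int:
    """Pixels of the effective image area left unused when a smaller image
    is carried inside a larger transmission format."""
    return tx_width * tx_height - img_width * img_height

# A ~4 MP transmission format carrying a ~2 MP image (assumed dimensions)
free = free_pixels(2688, 1520, 1920, 1080)
```

Roughly two million pixel positions are freed, which is why the remainder of the effective image area can hold the target data.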
  • the coaxial data remains unchanged in the original transmission mode, and the video stream data occupies a part of the effective image area, and the target data occupies another part of the effective image area.
  • the analog camera can agree with the server where the target data is stored in the image frame.
  • the storage location of the target data in the image frame may also be placed in the coaxial data.
  • the server can more easily parse out the storage location from the coaxial data.
• the data transmission mode provided in this embodiment ensures that corresponding target data exists in each image frame, so the real-time performance and synchronization of the data are good, and the amount of data that can be carried is also relatively large.
  • the embodiment does not limit the transmission material.
  • the transmission material may be a coaxial cable, a twisted pair or other materials, and no additional wiring is required, thereby reducing equipment costs.
  • the embodiment of the present application further provides a third data transmission method for transmitting the converted mixed data.
  • the third data transmission method will be described in detail below through specific embodiments. As shown in FIG. 10, the method includes the following steps: S1001 to S1002:
• S1001: determining a first position of the video stream data in an effective image area of the image frame, and determining a second position of the target data in a blanking area of the image frame, the target data including attribute and/or location information of the target.
  • the target data is data different from the video stream data and the coaxial data.
  • the coaxial data is notification information for interaction between the server and the analog camera, and the coaxial data may also be referred to as PTZ data.
• the coaxial data exchanged between the video generating end and the video receiving end may include coaxial data sent by the video generating end to the video receiving end, and may also include coaxial data sent by the video receiving end to the video generating end.
• for example, the coaxial data may include shooting mode information sent by the analog camera to the DVR (Digital Video Recorder), information indicating readiness for an upgrade, and the like, and may also include handshake data exchanged between the analog camera and the DVR, which can be used to send data representing the device type, image resolution, and the like.
  • the coaxial data may further include control information sent by the DVR to the analog camera, etc., and the control information may include control instructions for the camera, such as an image parameter adjustment instruction, a camera aperture adjustment instruction, a camera rotation adjustment instruction, a resolution switching instruction, and a remote upgrade. Data instructions, etc.
  • the coaxial data may also include information such as the location of the video stream data in the image frame and the location of the target data in the image frame.
  • Coaxial data may or may not be included in the image frame.
• S1002: using the video stream data and the target data as data of the same image frame, transmit the video stream data according to the first position using the data transmission mode of the effective image area, and transmit the target data according to the second position using the data transmission mode of the blanking area.
  • converted mixed data includes video stream data and target data, and therefore, transmission of "converted mixed data” is realized by S1001 and S1002.
• when determining the first position of the video stream data in the effective image area of the image frame, a first preset position may be determined as the first position; the preset position includes a start position and an end position.
  • the first position of the video stream data in the effective image area of the image frame may also be determined based on the amount of data of the video stream data.
• the first position of the video stream data in the effective image area of the image frame may be determined as: from the first preset initial position to a first end position, where the first end position is the position obtained by adding the data amount of the video stream data to the first preset initial position.
  • the first preset initial position and the first end position are both located in the effective image area.
  • the second preset position may be determined as the second position of the target data in the blanking area of the image frame. It is also possible to determine the second position of the target data in the blanking area of the image frame based on the amount of data of the target data.
• the second position of the target data in the blanking area of the image frame may be determined as: from the second preset initial position to a second end position, where the second end position is the second preset initial position plus the data amount of the target data.
  • the second preset initial position and the second end position are both located in the blanking zone.
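The position rule in the bullets above (an end position equals a preset initial position plus the data amount) can be sketched as follows; the byte offsets are illustrative only:

```python
def region_span(preset_start: int, data_len: int) -> tuple:
    """Return (start, end) with end = preset initial position + data amount."""
    return (preset_start, preset_start + data_len)

# First position: video stream data in the effective image area.
video_span = region_span(preset_start=0, data_len=2_073_600)
# Second position: target data in the blanking area.
target_span = region_span(preset_start=2_100_000, data_len=512)
```

Both spans are fully determined by a pre-agreed start point plus the payload length, so the receiver needs no extra signaling to know where each region ends.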
  • the second position may be a fixed position of the image frame blanking area, or may be an unfixed position.
  • the position of the coaxial data in the blanking area is a fixed position, so the second position of the target data in the blanking area may be a position other than the position of the coaxial data in the blanking area.
  • the second position may be a position of a partial area other than the position of the coaxial data in the blanking area, or may be a position of all areas except the position of the coaxial data in the blanking area, and the present application No specific restrictions.
  • the blanking area includes a field blanking area and a line blanking area.
• the second position may be located in the field blanking area, or in the line blanking area, or partly in the field blanking area and partly in the line blanking area.
• the storable data amount of the field blanking area is greater than that of the line blanking area, so the second position can be determined from the field blanking area to increase the amount of target data that can be stored.
  • both the coaxial data and the location of the target data can be in the field blanking zone.
  • a field blanking zone of a 1920*1080 image frame is known to contain 36 lines.
• the amount of data of the coaxial data is generally small, at the byte level. Two to four lines of the field blanking area can be allocated to the coaxial data, and the remaining 32 lines can be allocated to the target data, so the amount of target data that the blanking area can store is on the order of a few hundred bytes.
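A back-of-the-envelope budget for the allocation above. The 36-line figure comes from the text; the per-line usable-byte count is an assumption made only to show the arithmetic:

```python
BLANKING_LINES = 36   # field blanking lines of a 1920*1080 frame (per the text)
COAX_LINES = 4        # 2 to 4 lines suffice for the byte-level coaxial data
TARGET_LINES = BLANKING_LINES - COAX_LINES  # lines left over for the target data

def target_capacity(bytes_per_line: int) -> int:
    """Target-data budget of the field blanking area, in bytes."""
    return TARGET_LINES * bytes_per_line

# With an assumed ~10 usable bytes per blanking line, the budget lands in the
# "few hundred bytes" range the text describes.
budget = target_capacity(10)
```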
  • the specification of the transmitted image frame can be improved. For example, an image frame of 2 megapixels is increased to an image frame of 3 megapixels.
  • the server and the analog camera may pre-arrange the second location, so that the server can acquire the target data according to the agreed second location, and improve the accuracy when acquiring the target data.
  • the target data may include a header identifier indicating a start position of the target data, and a tail identifier indicating a trailing position of the target data.
• the start position of the target data may be determined according to the foregoing header identifier, and the end position according to the tail identifier; the target data is then acquired from the blanking area according to the determined start and end positions, so that it can be accurately acquired even when the second position is not fixed. In this embodiment, the flexibility of transmitting target data is relatively high.
  • the header identifier may be a first preset bit string, and the tail identifier may be a second preset bit string.
  • the first first predetermined number of bits of the target data may be a first preset bit string, and the second second predetermined number of bits of the target data may be a second preset bit string.
  • the data in the image frame may be transmitted in the form of a data stream row by row, or the data in the image frame may be transmitted in the form of a data stream column by column.
  • the server may parse the image frame according to a preset data storage rule to obtain video stream data and target data in the image frame.
• this embodiment determines the first position of the video stream data in the effective image area of the image frame and the second position of the target data in the blanking area of the image frame, treats the video stream data and the target data as data of the same image frame, transmits the video stream data according to the first position using the data transmission mode of the effective image area, and transmits the target data according to the second position using the data transmission mode of the blanking area. Therefore, the embodiment of the present application can send the above-mentioned "converted mixed data" to the electronic device in the same image frame without additional wiring, thereby saving equipment cost.
  • the target data may be carried in all image frames, or the target data may be carried in part of the image frames.
  • the image frame in which the effective image area carries the video stream data and the blanking area carries the target data may be sent to the server.
  • the image frame in which the effective image area carries the video stream data and the blanking area does not carry the target data may be sent to the server.
• before the video stream data and the target data are sent, the method may further include the following steps 1 to 2:
  • Step 1 Obtain the coaxial data to be transmitted, and determine the third position of the coaxial data in the blanking area of the image frame.
  • the image frame includes video stream data, target data, and coaxial data.
• determining the third position of the coaxial data in the blanking area of the image frame may include: determining a third preset position as the third position. For example, the preset second to fourth lines of the blanking area may be determined as the third position.
  • the blanking area may further include a location of the coaxial data sent by the server to the analog camera, where the location may be a location different from the third location in the blanking zone.
  • the third position may be before the second position or after the second position. Since the amount of data in the field blanking area is much larger than the amount of data in the line blanking area, the third position can be determined from the field blanking area.
• the step of acquiring the coaxial data to be sent may include: acquiring coaxial data that includes the data indicating the second location.
• the server and the analog camera can agree on the third position in advance, so that when the server acquires the coaxial data, it can do so according to the agreed third position.
  • the second location may be determined according to the data indicating the second location included in the coaxial data, and the target data is acquired from the second location, thereby improving accuracy when acquiring the target data.
  • the coaxial data may further include other data than the data indicating the second location, which is not specifically limited in the present application.
• Step 2: Transmit the coaxial data, as data of the same image frame as the video stream data and the target data, according to the third position, using the data transmission mode of the blanking area.
  • FIG. 11a is a schematic diagram of data corresponding to an effective image area and a blanking area of an image frame.
  • the vertical line shadow area is an effective image area
  • the blank area outside the vertical line shadow area is a blanking area.
  • the blanking area above and below the effective image area is the field blanking area
• the blanking area on the left and right sides of the effective image area is the line blanking area.
  • the storage location of the coaxial data is located in the upper field blanking zone
  • the storage location of the target data is located in the lower field blanking zone.
• FIG. 11b is a transmission architecture diagram of an analog camera sending an image frame to a server.
  • one image frame includes video stream data, coaxial data and target data.
• this embodiment can obtain the coaxial data and send the video stream data, the target data, and the coaxial data to the server as data of the same image frame, so that the server receives the video stream data, the target data, and the coaxial data through the same image frame, improving data transmission efficiency.
• the position of the data representing the second location in the blanking area of the image frame may be located before the second location.
• that is, the third position may be located before the second position, which ensures that the data indicating the second location precedes the second position in the image frame.
• because the server acquires the data carried by an image frame from front to back, this lets the server first obtain the data representing the second location from the coaxial data, and then acquire the target data from the blanking area according to that data, thereby improving the efficiency of acquiring the target data.
  • video stream data may be superimposed on an effective image area in an analog signal manner, and coaxial data may be superimposed on a blanking area in a digital signal manner.
• the coaxial data amount is small, roughly at the byte level (for example, 6 to 24 bytes), so a large free portion of the blanking area can be used to store the target data.
  • the target data can be stored in an idle blanking area.
  • the video stream data and the coaxial data remain unchanged in the original transmission mode, and some blanking areas that have not been used are used to fill the target data.
  • the analog camera can agree with the server where the target data is stored in the image frame.
  • the storage location of the target data in the image frame may also be placed in the coaxial data.
  • the server can more easily parse out the storage location from the coaxial data.
  • the data transmission mode provided in this embodiment can make corresponding target data exist in each image frame, so the real-time and synchronization of the data is good.
  • the embodiment does not limit the transmission material.
  • the transmission material may be a coaxial cable, a twisted pair or other materials, and no additional wiring is required, thereby reducing equipment costs.
  • the embodiment of the present application further provides a data processing method applied to a server side, where the server is connected to an analog camera.
  • the data processing method will be described in detail below through specific embodiments.
  • FIG. 12 is a first schematic flowchart of a data processing method applied to a server side according to an embodiment of the present disclosure, including:
  • S1201 Receive data sent by the analog camera to be processed.
  • an analog camera that transmits data to a server is referred to as an analog camera to be processed.
  • S1202 Separate the video stream data and the image analysis data from the received data.
• the server and the analog camera can agree in advance on how the video stream data and the image analysis data are superimposed and separated. For example, if the analog camera superimposes the image analysis data in the blanking area of the video stream data, the server reads the image analysis data from the blanking area.
• the server may determine the blanking area and the image area in the received data according to the blanking area identifier; read the data in the blanking area to extract the image analysis data; and read the data in the image area to extract the video stream data.
• the analog camera and the server may also agree, for example, that the first several lines are video stream data and the following lines are image analysis data, so that the server can likewise separate the video stream data and the image analysis data from the received data; the specific separation method is not limited.
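Either agreement (a blanking-area identifier or a fixed row split) gives the receiver a deterministic way to split the two kinds of data. A minimal sketch using a pre-agreed set of blanking row indices (the indices themselves are invented):

```python
def separate(frame_rows, blanking_rows):
    """Split a frame, given as a list of rows, into (video rows, analysis rows)
    according to a pre-agreed set of blanking row indices."""
    video, analysis = [], []
    for idx, row in enumerate(frame_rows):
        (analysis if idx in blanking_rows else video).append(row)
    return video, analysis

# Toy 8-row frame: the first and last two rows stand in for the blanking area.
rows = [f"row{i}" for i in range(8)]
video, analysis = separate(rows, blanking_rows={0, 1, 6, 7})
```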
  • the server separates the video stream data and the image analysis data
  • the two types of data are processed separately.
  • the video stream data can be encoded, stored, and the like, and is not limited.
  • the image analysis data is data obtained by analyzing the acquired image by an analog camera.
  • the image analysis data may include attribute and/or location information of the target, where the target is a target existing in the video stream data.
  • the attribute and/or location information of the target may be read in the image analysis data, and the target is a target existing in the video stream data.
  • the image analysis data may also include other content, such as an image carrying the target, and is not limited.
• the analog camera can perform face recognition on the image and send the recognized face feature as image analysis data to the server. After the server separates out the face feature, it can compare it with the face features stored in the database to determine the identity information corresponding to the face.
  • the position information of the face in the image is determined, and the position information is sent to the server as image analysis data.
• after the server separates out the location information, it can use the location information to directly obtain the face region in the image.
  • an analog camera can perform license plate recognition on an image and transmit the identified license plate number as image analysis data to the server. After the server separates the license plate number, the owner information corresponding to the license plate number can be determined, or the travel track corresponding to the license plate number can be found in the database.
  • the position information of the license plate in the image is determined, and the position information is sent to the server as image analysis data.
• after the server separates out the location information, the license plate region can be obtained directly from the image through the location information.
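The payoff of sending location information is that the server can cut the target region straight out of the decoded image instead of re-running detection. A sketch, where the (x, y, w, h) coordinate convention and the nested-list image are assumptions for illustration:

```python
def crop_region(image, x, y, w, h):
    """Return the w-by-h sub-image whose top-left corner is (x, y).

    `image` is a list of rows; each row is a list of pixels."""
    return [row[x:x + w] for row in image[y:y + h]]

# 6x6 toy "image" whose pixel value encodes its (row, col) position
image = [[(r, c) for c in range(6)] for r in range(6)]
# Location info received as image analysis data, e.g. for a license plate
plate = crop_region(image, x=2, y=1, w=3, h=2)
```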
  • the server processes the video stream data and the image analysis data in various ways, and can be set according to the actual situation, and is not limited.
• an analog camera with an image analysis function can use several different transmission methods when transmitting video stream data and image analysis data: first, converting both the video stream data and the image analysis data into analog data and sending them; second, converting only the video stream data into analog data, and transmitting the analog video stream data together with digital image analysis data; third, converting the video stream data into analog data and the image analysis data into low-frequency digital data, and transmitting the analog video stream data together with the low-frequency digital data. The specific transmission method is not limited.
• if the data received by the server is analog data, the server may perform analog-to-digital conversion on the received data to obtain converted digital data, and then separate the digital video stream data and the digital image analysis data from the converted digital data.
• alternatively, the server may separate the analog video stream data and the analog image analysis data from the received data, and then perform analog-to-digital conversion on each to obtain digital video stream data and digital image analysis data.
  • the server separates the analog video stream data and the digital image analysis data from the received data; and performs analog-to-digital conversion on the analog video stream data to obtain digital video stream data.
• the server separates the analog video stream data and the digital image analysis data from the received data; performs analog-to-digital conversion on the analog video stream data to obtain digital video stream data; and performs low-frequency sampling on the digital image analysis data to obtain data in a preset format, which is then marked.
  • the preset format may be the BT1120 format, or may be other formats, which are not limited. Data in a preset format can be tagged, or digital video stream data can be tagged to distinguish between the two types of digital data during subsequent processing.
  • FIG. 13 is a second schematic flowchart of a data processing method applied to a server side according to an embodiment of the present disclosure, including:
  • S1301 Receive data sent by the analog camera to be processed.
  • an analog camera that transmits data to a server is referred to as an analog camera to be processed.
  • the to-be-processed analog camera may be a device having an image analysis function or a device having no image analysis function.
  • S1302 Determine whether the to-be-processed analog camera has an image analysis function; if yes, execute S1303, if no, execute S1304.
  • a first type processing mode is used to process data sent by an analog camera having an image analysis function
  • the second type of processing mode is used to process data transmitted by an analog camera that does not have an image analysis function.
• the data sent by an analog camera without the image analysis function contains only the collected video stream data and no image analysis data, while the data sent by an analog camera with the image analysis function includes both the collected video stream data and the image analysis data. Therefore, the first type of processing mode can be understood as a mode for processing the case where the data received in S1301 includes both video stream data and image analysis data, and the second type of processing mode as a mode for processing the case where the data received in S1301 does not include image analysis data.
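The two modes amount to a single dispatch on the camera's capability. A hedged sketch in which the field names and the `analyze` callback (standing in for the server-side recognition of S1304) are illustrative:

```python
def process(received, has_analysis_function, analyze):
    """Return (video stream data, image analysis data) under either mode."""
    if has_analysis_function:
        # First type: the received data already carries both kinds of data.
        return received["video"], received["analysis"]
    # Second type: the server derives the analysis data from the video stream.
    return received["video"], analyze(received["video"])

smart = {"video": "frames", "analysis": "faces"}  # camera with analysis function
plain = {"video": "frames"}                       # camera without it
smart_result = process(smart, True, analyze=None)
plain_result = process(plain, False, analyze=lambda v: "derived-from-" + v)
```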
  • S1303 triggering the first type of processing mode: separating video stream data and image analysis data from the received data; and separately processing the separated video stream data and the image analysis data.
• the server and the analog camera can agree in advance on how the video stream data and the image analysis data are superimposed and separated. For example, if the analog camera superimposes the image analysis data in the blanking area of the video stream data, the server reads the image analysis data from the blanking area.
• the server may determine the blanking area and the image area in the received data according to the blanking area identifier; read the data in the blanking area to extract the image analysis data; and read the data in the image area to extract the video stream data.
• the analog camera and the server may also agree, for example, that the first several lines are video stream data and the following lines are image analysis data, so that the server can likewise separate the video stream data and the image analysis data from the received data; the specific separation method is not limited.
  • the server separates the video stream data and the image analysis data
  • the two types of data are processed separately.
  • the video stream data can be encoded, stored, and the like, and is not limited.
  • the image analysis data is data obtained by analyzing the acquired image by an analog camera.
  • the image analysis data may include attribute and/or location information of the target, where the target is a target existing in the video stream data.
  • the attribute and/or location information of the target may be read in the image analysis data, and the target is a target existing in the video stream data.
  • the image analysis data may also include other content, such as an image carrying the target, and is not limited.
• the analog camera can perform face recognition on the image and send the recognized face feature as image analysis data to the server. After the server separates out the face feature, it can compare it with the face features stored in the database to determine the identity information corresponding to the face.
  • the position information of the face in the image is determined, and the position information is sent to the server as image analysis data.
• after the server separates out the location information, it can use the location information to directly obtain the face region in the image.
  • an analog camera can perform license plate recognition on an image and transmit the identified license plate number as image analysis data to the server. After the server separates the license plate number, the owner information corresponding to the license plate number can be determined, or the travel track corresponding to the license plate number can be found in the database.
  • the position information of the license plate in the image is determined, and the position information is sent to the server as image analysis data.
• after the server separates out the location information, the license plate region can be obtained directly from the image through the location information.
  • the server processes the video stream data and the image analysis data in various ways, and can be set according to actual conditions, and is not limited.
• an analog camera with an image analysis function can use several different transmission methods when transmitting video stream data and image analysis data: first, converting both the video stream data and the image analysis data into analog data and sending them; second, converting only the video stream data into analog data, and transmitting the analog video stream data together with digital image analysis data; third, converting the video stream data into analog data and the image analysis data into low-frequency digital data, and transmitting the analog video stream data together with the low-frequency digital data. The specific transmission method is not limited.
• if the data received by the server is analog data, the server may perform analog-to-digital conversion on the received data to obtain converted digital data, and then separate the digital video stream data and the digital image analysis data from the converted digital data.
• alternatively, the server may separate the analog video stream data and the analog image analysis data from the received data, and then perform analog-to-digital conversion on each to obtain digital video stream data and digital image analysis data.
  • the server separates the analog video stream data and the digital image analysis data from the received data; and performs analog-to-digital conversion on the analog video stream data to obtain digital video stream data.
• the server separates the analog video stream data and the digital image analysis data from the received data; performs analog-to-digital conversion on the analog video stream data to obtain digital video stream data; and performs low-frequency sampling on the digital image analysis data to obtain data in a preset format, which is then marked.
  • the preset format may be the BT1120 format, or may be other formats, which are not limited. Data in a preset format can be tagged, or digital video stream data can be tagged to distinguish between the two types of digital data during subsequent processing.
  • S1304 triggering the second type of processing mode: analyzing the received video stream data to obtain image analysis data; and processing the received video stream data and the analyzed image analysis data respectively.
  • the data sent by the analog camera that does not have the image analysis function does not include image analysis data, and the server analyzes the video stream data to obtain image analysis data.
  • the analysis may be face recognition, license plate recognition, feature extraction, and the like, and is not limited in specific terms.
  • the server processes the received video stream data and the analyzed image analysis data separately.
  • the video stream data can be encoded, stored, and the like, and is not limited.
  • the server performs face recognition on the video stream data and obtains the face feature
  • the face feature can be compared with the face feature stored in the database to determine the identity information corresponding to the face.
  • the server performs license plate recognition on the video stream data to obtain the license plate number
  • the vehicle owner information corresponding to the license plate number may be determined, or the travel track corresponding to the license plate number may be found in the database.
  • the server processes the video stream data and the image analysis data in various ways, and can be set according to actual conditions, and is not limited.
  • FIG. 14 is a third schematic flowchart of a data processing method applied to a server side according to an embodiment of the present disclosure, including:
  • S1401 Send an attribute request instruction to an analog camera in the system, and receive device attributes fed back by the analog camera in the system.
  • S1402 Record the device attribute of the received analog camera feedback.
  • S1403 Receive data sent by the analog camera to be processed.
  • S1404 Look up, in the recorded device attributes, the device attribute fed back by the analog camera to be processed.
  • S1405 Determine, according to the found device attribute, whether the analog camera to be processed has an image analysis function; if yes, execute S1406; if no, execute S1407.
  • S1406 triggering the first type of processing mode: separating video stream data and image analysis data from the received data; and separately processing the separated video stream data and the image analysis data.
  • S1407 triggering the second type of processing mode: analyzing the received video stream data to obtain image analysis data; and processing the received video stream data and the analyzed image analysis data separately.
  • the server may send an attribute request instruction to every analog camera in the system, or may send an attribute request instruction to a new analog camera only after detecting that the new analog camera has been connected to the system.
  • after receiving the attribute request command, the analog camera sends its own device attribute to the server.
  • the device attribute may include: a device model, or a device hardware performance.
  • the server may determine whether the analog camera has an image analysis function according to the device model or the device hardware performance; if it does, the server triggers the first type of processing mode, and if not, the second type of processing mode.
  • the device attribute may also include information that directly reflects whether the analog camera has an image analysis function, and the server may determine this directly according to that information; if it does, the server triggers the first type of processing mode, and if not, the second type of processing mode.
  • the device attribute may also include the name of the analog camera, an ID, or the like;
  • the server stores device function information corresponding to the name or ID of the analog camera, so the server may determine, according to the name or ID, whether the analog camera has an image analysis function; if it does, the server triggers the first type of processing mode, and if not, the second type of processing mode.
  • the server pre-acquires the device attributes of the analog cameras in the system, determines whether each analog camera has an image analysis function according to its device attribute, and adopts different processing modes for analog cameras with different functions. By applying this solution, the server can process data sent by analog cameras with an image analysis function as well as data sent by analog cameras without one, thus solving system incompatibility problems.
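The attribute-driven dispatch above (S1401–S1407) can be sketched as follows. This is a minimal illustration, not the patented implementation; the attribute schema (a `has_analysis` flag) and the camera identifiers are assumptions introduced for the example.

```python
# Hypothetical sketch of the server-side dispatch described in FIG. 14.
RECORDED_ATTRIBUTES = {}  # device attributes recorded at S1401/S1402


def record_attribute(camera_id, attributes):
    """S1402: record the device attribute fed back by an analog camera."""
    RECORDED_ATTRIBUTES[camera_id] = attributes


def choose_processing_mode(camera_id):
    """S1404/S1405: look up the recorded attribute and pick a mode.

    Returns "separate" (first mode: split the mixed data into video
    stream data and image analysis data) when the camera has an image
    analysis function, otherwise "analyze" (second mode: the server
    analyzes the video stream itself).
    """
    attrs = RECORDED_ATTRIBUTES.get(camera_id, {})
    return "separate" if attrs.get("has_analysis", False) else "analyze"


record_attribute("cam-1", {"model": "AC-200", "has_analysis": True})
record_attribute("cam-2", {"model": "AC-100", "has_analysis": False})
```

A camera missing from the registry falls back to the second mode, which matches the safe default of analyzing the stream on the server.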
  • the embodiment of the present application further provides a monitoring system, as shown in FIG. 15, including an analog camera 10 and a server 20, where
  • the analog camera 10 includes an image capturing chip 100, an image analyzing chip 200, an integrated chip 300, and a transmitting chip 400.
  • the image capturing chip 100 is connected to the image analyzing chip 200 and the integrated chip 300, respectively, and the image analyzing chip 200 is connected to the integrated chip 300.
  • the transmitting chip 400 is connected to the integrated chip 300;
  • the image acquisition chip 100 is configured to collect digital video stream data
  • An image analysis chip 200 configured to analyze the digital video stream data, identify a target existing in the digital video stream data, extract attribute and/or position information of the target as image analysis data; Image analysis data is sent to the integrated chip 300;
  • the integrated chip 300 is configured to insert the image analysis data into the digital video stream data according to a preset insertion mode, to obtain mixed data, and send the mixed data to the transmitting chip 400;
  • the transmitting chip 400 is configured to convert the mixed data or the digital video stream data in the mixed data into analog data to obtain converted mixed data; and send the converted mixed data to the server 20;
  • the server 20 is configured to receive the mixed data, and separate the mixed data into video stream data and image analysis data according to a separation mode corresponding to the preset insertion mode.
  • the analog camera in the monitoring system provided by the embodiment of the present application may be any analog camera provided in the embodiment of the present application, and details are not described herein.
  • the data inserted by the integrated chip 300 into the digital video stream data may differ: in some embodiments, only the attribute and/or location information of the target is inserted; in other embodiments, the attribute and/or position information of the target and the captured image are inserted; in still other embodiments, the attribute and/or position information of the target and the compressed image are inserted; for convenience of description, all of the inserted data is referred to as image analysis data.
  • the server 20 and the analog camera 10 may agree in advance on the superimposing manner (insertion mode) and separation mode of the video stream data and the image analysis data; for example, if the analog camera 10 superimposes the image analysis data in the blanking area of the digital video stream data, the server 20 reads the image analysis data from the blanking area.
  • the server 20 may determine a blanking area and an image area in the received data according to the blanking area identifier; read data in the blanking area as image analysis data; and read the image area The data in the data as the video stream.
  • the analog camera 10 and the server 20 may also agree that the first several rows of a frame are video stream data and the following several rows are image analysis data, so that the server 20 can likewise separate the video stream data and the image analysis data from the received data; the separation method is not limited.
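The agreed blanking-area insertion and separation can be illustrated with a toy frame layout. The marker byte `0xAB` and the length-prefixed blanking payload are invented for this sketch; the patent leaves the concrete encoding open.

```python
# Toy sketch of the agreed insertion/separation mode: the camera appends
# image analysis data in a marked blanking area after the image rows, and
# the server reads the blanking area back out.
BLANKING_MARK = 0xAB  # hypothetical blanking area identifier


def build_frame(video_rows, analysis_bytes):
    """Camera side: image rows followed by a marked blanking payload."""
    frame = bytearray()
    for row in video_rows:
        frame += bytes(row)
    frame += bytes([BLANKING_MARK, len(analysis_bytes)])
    frame += bytes(analysis_bytes)
    return bytes(frame)


def separate_frame(frame, row_len, n_rows):
    """Server side: split a frame into video data and analysis data."""
    video = frame[: row_len * n_rows]
    rest = frame[row_len * n_rows:]
    if rest[0] != BLANKING_MARK:
        raise ValueError("blanking area identifier not found")
    length = rest[1]
    analysis = rest[2 : 2 + length]
    return video, analysis
```

A round trip through `build_frame` and `separate_frame` recovers both payloads, mirroring how the server determines the blanking area by its identifier and reads the two data types out of one frame.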
  • after the server 20 separates the video stream data and the image analysis data, the two types of data are processed separately.
  • the video stream data can be encoded, stored, and the like, and is not limited.
  • the server 20 may compare the face feature with the face feature stored in the database to determine the identity information corresponding to the face.
  • the server 20 may determine the owner information corresponding to the license plate number, or search the database for the travel track corresponding to the license plate number.
  • the server may determine a face region in the image according to the location information, and may send the face region to the display device for display.
  • the server may extract facial features and the like after determining the face area.
  • the server 20 can process the video stream data and the image analysis data in various manners, and can be set according to actual conditions, and is not limited.
  • the analog camera sends the image analysis data to the server, and the server directly obtains the related information of the target, such as attributes, location information, and images, from the image analysis data; the server does not need to perform target recognition in the image, which reduces the amount of computation of the server.
  • the transmitting chip 400 may convert the digital video stream data in the mixed data into analog video stream data; convert the image analysis data in the mixed data into low frequency digital data; and the analog video stream data and the low frequency The digital data is sent to the server 20.
  • the server 20 may perform analog-to-digital conversion on the separated video stream data to obtain first digital data, and perform low-frequency sampling on the separated image analysis data to obtain second digital data. The server 20 then processes the first digital data and the second digital data separately.
  • the separation process and the conversion process can be performed simultaneously.
  • the server 20 determines that the read data is video stream data according to its agreement with the analog camera 10, and then performs analog-to-digital conversion on the read data; likewise, the server 20 determines according to the agreement that the read data is image analysis data, and can sample the read data at a low frequency.
  • the separation process may be performed first, and then the conversion process may be performed, which is not limited.
  • the transmitting chip 400 converts the mixed data into analog mixed data and transmits the analog mixed data to the server 20.
  • the server 20 may first separate the received mixed data into video stream data and image analysis data, and then perform analog-to-digital conversion on the separated video stream data and image analysis data to obtain digital data.
  • the server may first perform analog-to-digital conversion to obtain digital mixed data, and then separate the digital mixed data to obtain digital video stream data and image analysis data.
  • the separation process and the analog-to-digital conversion process may be performed at the same time, and the specifics are not limited.
  • the server 20 may convert the digital video stream data and the image analysis data into data in a preset format, such as data in the BT656 format, and the like.
  • the digital data converted by the video stream data is referred to as first digital data
  • the digital data converted by the image analysis data is referred to as second digital data.
  • the server 20 may mark the first digital data and/or the second digital data to distinguish the two types of digital data during subsequent processing.
  • the specific marking manner is not limited; for example, a special marker bit or the like may be added to the data header. If both the first digital data and the second digital data are marked, different marking manners are used for the two.
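The marking step might look like the following sketch, where a one-byte tag is prepended to the data header so later stages can tell the first digital data (converted video) from the second digital data (converted analysis data). The tag values are hypothetical.

```python
# Hypothetical one-byte header tags distinguishing the two data types.
TAG_VIDEO = b"\x01"     # first digital data (from the video stream)
TAG_ANALYSIS = b"\x02"  # second digital data (from image analysis data)


def mark(payload, is_video):
    """Prepend the distinguishing tag to the data header."""
    return (TAG_VIDEO if is_video else TAG_ANALYSIS) + payload


def kind_of(packet):
    """Later processing stage: recover the data type from the tag."""
    return "video" if packet[:1] == TAG_VIDEO else "analysis"
```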
  • the analog camera 10 and the server 20 may be connected by a coaxial cable, or may be connected by other means, for example, by a twisted pair connection, a wireless connection, or the like, which is not limited.
  • the analog camera includes an image analysis chip, and the image analysis chip analyzes the collected digital video stream data, identifies the target present in the digital video stream data, and extracts the attribute and/or position information of the target; the attribute and/or position information of the target is sent to the integrated chip, and the integrated chip inserts it into the digital video stream data according to the preset insertion mode to obtain mixed data;
  • the analog camera in this scheme thus realizes the analysis and processing of the acquired images.
  • the analog camera sends the image analysis data to the server, and the server directly obtains the related information of the target, such as attributes, location information, and images, from the image analysis data; the server does not need to perform target recognition in the image, which reduces the amount of computation of the server.
  • FIG. 16 is a schematic structural diagram of a server according to an embodiment of the present disclosure, including:
  • the sensing signal receiving chip 110, the image signal receiving chip 120 and the signal processing chip 130, the sensing signal receiving chip 110 and the image signal receiving chip 120 are respectively connected to the signal processing chip 130;
  • the sensing signal receiving chip 110 is configured to receive the sensing signal sent by the wireless sensor, and send the sensing signal to the signal processing chip 130;
  • the image signal receiving chip 120 is configured to receive an image signal sent by the analog camera, and send the image signal to the signal processing chip 130;
  • the signal processing chip 130 is configured to perform correlation processing on the sensing signal and the image signal.
  • the sensing signal receiving chip in the server receives the sensing signal sent by the wireless sensor, the image signal receiving chip receives the image signal sent by the analog camera, and the signal processing chip performs correlation processing on the sensing signal and the image signal; it can be seen that the server in this solution realizes correlation processing of the sensing signal and the image signal in the same scene.
  • the sensing signal receiving chip is pre-connected with one or more wireless sensors, so that the sensing signal receiving chip can receive and process the sensing signals transmitted by the wireless sensor.
  • the server may include multiple sensing signal receiving chips, and each sensing signal receiving chip is respectively paired with a wireless sensor; or, as another implementation manner, the server may include a sensing A signal receiving chip, the sensing signal receiving chip being paired with one or more wireless sensors.
  • the sensing signal receiving chip 110 includes an electromagnetic receiving antenna 1101 and a processor 1102; the electromagnetic receiving antenna 1101 is configured to receive the sensing signal in the form of electromagnetic waves emitted by the wireless sensor and send it to the processor 1102; the processor 1102 is configured to convert the sensing signal in the form of electromagnetic waves into a sensing signal in the form of an electrical signal.
  • the wireless sensor generally transmits the sensing signal in the form of electromagnetic waves, and after receiving the electromagnetic wave, the sensing signal receiving chip 110 needs to convert the electromagnetic wave into an electrical signal that can be processed by itself.
  • the sensing signal receiving chip 110 includes an electromagnetic receiving antenna 1101, a processor 1102, and a classifier 1103; the electromagnetic receiving antenna 1101 is configured to receive the sensing signal in the form of electromagnetic waves emitted by the wireless sensor and send it to the processor 1102; the processor 1102 is configured to convert the sensing signal in the form of electromagnetic waves into a sensing signal in the form of an electrical signal; and the classifier 1103 is configured to receive the sensing signal in the form of an electrical signal sent by the processor 1102, determine the type information of that sensing signal, and send the sensing signal and the type information to the signal processing chip 130.
  • the wireless sensor in the embodiment of the present application may include any one or more of the following: a magnetic sensor, an infrared detector, a smoke sensor, a temperature sensor, a humidity sensor, and the like; other types are not excluded.
  • Different types of wireless sensors collect different types of environmental information.
  • the magnetic sensor can collect information on whether displacement occurs between the magnet and the sensor;
  • the infrared detector can collect information on whether there is a person in a preset area, the smoke sensor can collect smoke information in the environment, the temperature sensor can collect temperature information in the environment, the humidity sensor can collect humidity information in the environment, and so on; these are not enumerated one by one.
  • these different types of wireless sensors convert the collected environmental information into different types of sensing signals.
  • the magnetic sensor converts the information on whether displacement occurs between the magnet and the sensor into a magnetic sensing signal, the infrared detector converts the information on whether there is a person in the preset area into an infrared sensing signal, the smoke sensor converts the smoke information in the environment into a smoke sensing signal, the temperature sensor converts the temperature information in the environment into a temperature sensing signal, and the humidity sensor converts the humidity information in the environment into a humidity sensing signal, and so on; these are not listed one by one.
  • the classifier 1103 in FIG. 18 can classify the sensing signals sent by the processor 1102, that is, determine the type information of the sensing signals, and send the type information together with the sensing signals to the signal processing chip 130.
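A toy sketch of the classifier's role: attach type information to each electrical-form sensing signal before forwarding it to the signal processing chip. The signal representation, the sensor-name keys, and the type labels are all assumptions introduced for illustration.

```python
# Hypothetical mapping from sensor kind to the type information the
# classifier 1103 attaches before forwarding a signal.
TYPE_INFO = {
    "magnetic": "displacement",
    "infrared": "presence",
    "smoke": "smoke-level",
    "temperature": "temperature",
    "humidity": "humidity",
}


def classify(signal):
    """Return (signal, type_info) as the classifier would forward it."""
    kind = signal.get("sensor", "unknown")
    return signal, TYPE_INFO.get(kind, "unknown")
```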
  • the signal processing chip 130 first performs analog-to-digital conversion on the image signal to obtain a digital image signal, and then performs format conversion on the digital image signal to obtain a digital image signal in a preset format; for example, the digital image signal can be converted into a standard parallel data format and then correlated with the sensing signal.
  • the signal processing chip 130 performs correlation processing on the sensing signal and the image signal. For example, suppose the signal processing chip 130 analyzes the image signal and the analysis result indicates that there is a blurred target in the current scene, but it cannot be determined from the image signal alone whether the target is a human body or an object; in addition, the signal processing chip 130 receives the infrared sensing signal sent by the infrared detector and analyzes it, and the analysis result indicates that there is a human target in the current scene; the signal processing chip 130 can then match the blurred target in the image signal analysis result with the human target in the sensing signal analysis result by position, and if the matching is successful, the blurred target in the image signal analysis result is a human body. Thus, the signal processing chip 130 implements the correlation processing of the sensing signal and the image signal.
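The position-matching step described above can be sketched as a simple distance test between the blurred image target and the human target reported by the infrared detector. The coordinate representation and the distance threshold are illustrative assumptions.

```python
def associate(image_target, ir_target, max_dist=1.0):
    """Hypothetical position matching for the correlation processing.

    Returns True when the blurred target from the image analysis and the
    human target from the infrared sensing signal are close enough to be
    treated as the same (human) target.
    """
    dx = image_target["x"] - ir_target["x"]
    dy = image_target["y"] - ir_target["y"]
    return (dx * dx + dy * dy) ** 0.5 <= max_dist
```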
  • as another example, suppose the signal processing chip 130 analyzes the image signal and the analysis result indicates that there is a fire in the scene; according to the image signal analysis result alone, it can only determine that a fire exists and cannot judge the current fire magnitude; in addition, the signal processing chip 130 receives the temperature sensing signal sent by the temperature sensor, analyzes it, and obtains the ambient temperature in the current scene; the signal processing chip 130 combines the image signal analysis result and the temperature sensing signal analysis result to determine the fire magnitude more accurately.
  • as another example, suppose the signal processing chip 130 receives the smoke sensing signal sent by the smoke sensor and analyzes it, and the analysis result indicates that there is smoke in the current scene, but whether a fire has occurred cannot be determined based on the smoke sensing signal alone; the signal processing chip 130 also receives and analyzes the image signal, and the analysis result indicates that there is no fire in the current scene and there is a human target who is smoking; combining the image signal analysis result and the smoke sensing signal analysis result, the signal processing chip 130 can determine that no fire has occurred and that only one person is smoking.
  • the server may further include a display 140 connected to the signal processing chip 130.
  • the display 140 may be added on the basis of FIG. 17 or FIG. 18, which is not limited.
  • the correlation processing result is transmitted to the display 140, and the display 140 displays the received correlation processing result.
  • the signal processing chip 130 performs position matching on the blurred target in the image signal analysis result and the human target in the sensing signal analysis result; if the matching is successful, the correlation processing result may be information such as that a certain person has entered a certain area; the signal processing chip 130 sends the correlation processing result to the display 140 for display.
  • the signal processing chip 130 combines the image signal analysis result and the temperature sensing signal analysis result to determine the current fire magnitude, and the correlation processing result may be the current fire level; the signal processing chip 130 sends the correlation processing result to the The display 140 performs display.
  • the signal processing chip 130 combines the image signal analysis result and the smoke sensing signal analysis result to determine that a person is smoking, and the correlation processing result may be information indicating that smoking is prohibited, or the like; the signal processing chip sends the correlation processing result to the display 140 for display.
  • the server may further include an alarm 150, and the alarm 150 is connected to the signal processing chip 130.
  • the signal processing chip 130 can also be used to control the alarm 150 to issue an alarm according to the correlation processing result.
  • the alarm 150 may be added on the basis of FIG. 17, FIG. 18, or FIG. 19, which is not limited.
  • the alarm device can be a flashing light, a buzzer, etc., and the alarm mode can be a flashing light or a buzzing sound, which is not limited.
  • the server may pre-store alarm conditions, such as an alarm when a person enters the designated area, or an alarm when there is a fire in the scene, or an alarm when someone in the scene smokes, and so on.
  • the signal processing chip 130 determines whether the pre-stored alarm condition is met based on the correlation processing result described above, and if so, controls the alarm 150 to issue an alarm.
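The alarm check might be sketched as follows, with a hypothetical encoding of the correlation result and of the pre-stored conditions (intrusion into a designated area, fire in the scene, someone smoking).

```python
# Hypothetical pre-stored alarm conditions; the event names are invented
# for illustration only.
ALARM_CONDITIONS = {"intrusion", "fire", "smoking"}


def should_alarm(correlation_result):
    """Return True when the correlation result meets a stored condition,
    i.e. when the chip should control the alarm to issue an alarm."""
    return correlation_result.get("event") in ALARM_CONDITIONS
```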
  • the alarm device may also be an independent device communicatively connected with the server;
  • the signal processing chip 130 determines whether the alarm condition is met according to the correlation processing result, and if so, sends an alarm message to the alarm device, which issues an alarm after receiving the message.
  • the alarm device may be integrated with the analog camera; the signal processing chip 130 determines whether the alarm condition is met according to the correlation processing result, and if so, sends an alarm message to the analog camera, which controls the alarm device to issue an alarm after receiving the message.
  • the alarm device may be integrated with the wireless sensor; the signal processing chip 130 determines whether the alarm condition is met according to the correlation processing result, and if so, sends an alarm message to the wireless sensor, which controls the alarm device to issue an alarm after receiving the message.
  • the sensing signal receiving chip in the server receives the sensing signal sent by the wireless sensor, the image signal receiving chip receives the image signal sent by the analog camera, and the signal processing chip performs correlation processing on the sensing signal and the image signal; it can be seen that the server in this solution realizes correlation processing of the sensing signal and the image signal in the same scene.
  • the embodiment of the present application further provides an analog camera, as shown in FIG. 21, including a processor 2101 and a memory 2102;
  • a memory 2102 configured to store a computer program
  • the processor 2101 is configured to implement any of the above-described data transmission methods applied to the analog camera side when executing the program stored on the memory 2102.
  • the embodiment of the present application further provides a server, as shown in FIG. 22, including a processor 2201 and a memory 2202;
  • a memory 2202 configured to store a computer program
  • the processor 2201 is configured to implement any of the above-described data processing methods applied to the server side when executing the program stored on the memory 2202.


Abstract

Embodiments of the present application provide an analog camera, a server, a monitoring system, and data transmission and processing methods. The analog camera includes an image analysis chip, which analyzes the collected digital video stream data, identifies a target present in the digital video stream data, and extracts attribute and/or position information of the target; the attribute and/or position information of the target is sent to an integrated chip, and the integrated chip inserts it into the digital video stream data according to a preset insertion mode to obtain mixed data. It can be seen that the analog camera in this solution realizes analysis and processing of the captured images.

Description

Analog camera, server, monitoring system, and data transmission and processing methods
This application claims priority to the following six Chinese patent applications filed with the Chinese Patent Office on October 20, 2017: (1) Chinese patent application No. 201710985836.5, entitled "Data transmission method, camera and electronic device"; (2) Chinese patent application No. 201710984275.7, entitled "Data transmission method, camera and electronic device"; (3) Chinese patent application No. 201710985838.4, entitled "Data transmission method, camera and electronic device"; (4) Chinese patent application No. 201710985130.9, entitled "Data processing method, apparatus and monitoring system"; (5) Chinese patent application No. 201710985839.9, entitled "Analog camera, monitoring system and data sending method"; (6) Chinese patent application No. 201721357044.5, entitled "Server and monitoring system"; all of the above contents are incorporated herein by reference.
Technical Field
The present application relates to the field of video surveillance technologies, and in particular, to an analog camera, a server, a monitoring system, and data transmission and processing methods.
Background
In some scenes, such as corridors and road intersections, an analog camera is usually provided to capture images of the scene, so that when an abnormal event occurs in the scene, such as a robbery or a traffic accident, the event can be handled in time.
An existing analog camera only has an image capturing function and no image analysis function; it can only send the captured images to a server, which performs the analysis and processing.
Summary
The present application provides an analog camera, a server, a monitoring system, and data transmission and processing methods, so as to realize analysis and processing of captured images.
To achieve the above object, an embodiment of the present application provides an analog camera, including: an image capturing chip, an image analysis chip, an integrated chip, and a transmitting chip, where the image capturing chip is connected to the image analysis chip and the integrated chip respectively, the image analysis chip is connected to the integrated chip, and the integrated chip is connected to the transmitting chip;
the image capturing chip is configured to capture digital video stream data;
the image analysis chip is configured to analyze the digital video stream data, identify a target present in the digital video stream data, and extract attribute and/or position information of the target; and send the attribute and/or position information of the target to the integrated chip;
the integrated chip is configured to insert the attribute and/or position information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data, and send the mixed data to the transmitting chip;
the transmitting chip is configured to convert the mixed data, or the digital video stream data in the mixed data, into analog data to obtain converted mixed data, and send the converted mixed data.
Optionally, the image capturing chip is further configured to duplicate the digital video stream data to obtain two copies of the digital video stream data, send one copy to the image analysis chip, and send the other copy to the integrated chip;
the integrated chip may be specifically configured to:
capture, from the digital video stream data, an image carrying the target;
insert the captured image and the attribute and/or position information of the target into the digital video stream data according to the preset insertion mode to obtain mixed data, and send the mixed data to the transmitting chip.
Optionally, the image capturing chip is further configured to duplicate the digital video stream data to obtain two copies of the digital video stream data, send one copy to the image analysis chip, and send the other copy to the integrated chip;
the integrated chip may be specifically configured to:
capture, from the digital video stream data, an image carrying the target;
compress the captured image to obtain a compressed image;
insert the compressed image and the attribute and/or position information of the target into the digital video stream data according to the preset insertion mode to obtain mixed data, and send the mixed data to the transmitting chip.
Optionally, the image capturing chip is further configured to send the digital video stream data to the integrated chip;
the integrated chip is further configured to capture an image from the digital video stream data, and send the captured image to the image analysis chip;
the image analysis chip is specifically configured to:
receive the image sent by the integrated chip, analyze the image, identify a target present in the image, extract attribute and/or position information of the target, and send the attribute and/or position information of the target to the integrated chip.
Optionally, the image capturing chip is further configured to send the digital video stream data to the integrated chip;
the integrated chip is further configured to capture an image from the digital video stream data, and send the captured image to the image analysis chip;
the image analysis chip is specifically configured to:
receive the image sent by the integrated chip, analyze the image, identify a target present in the image, extract attribute and/or position information of the target, and send the attribute and/or position information of the target to the integrated chip;
the integrated chip is further configured to compress the captured image to obtain a compressed image, and insert the compressed image and the attribute and/or position information of the target into the digital video stream data according to the preset insertion mode to obtain mixed data.
Optionally, the integrated chip may be further configured to:
insert the compressed image and the attribute and/or position information of the target into a blanking area of the digital video stream data;
arrange the compressed image and the attribute and/or position information of the target inserted into the blanking area according to a preset arrangement to obtain arranged data;
convert the arranged data into data in a preset format;
convert the digital video stream data into data in the preset format.
Optionally, the transmitting chip may be specifically configured to:
convert the digital video stream data in the mixed data into analog video stream data, and convert the image analysis data in the mixed data into low-frequency digital data, where the image analysis data includes the attribute and/or position information of the target; and send the analog video stream data and the low-frequency digital data.
Optionally, the integrated chip includes a plurality of chips provided integrally, and the integrated chip includes the transmitting chip.
Optionally, the integrated chip includes a plurality of chips provided integrally, and the integrated chip includes an image processing chip and an insertion chip;
the image processing chip is configured to perform color and/or luminance processing on the digital video stream data;
the insertion chip is configured to insert the attribute and/or position information of the target into the processed digital video stream data according to the preset insertion mode to obtain mixed data.
To achieve the above object, an embodiment of the present application further provides a data transmission method applied to an analog camera, the method including:
capturing digital video stream data;
analyzing the digital video stream data, identifying a target present in the digital video stream data, and extracting attribute and/or position information of the target;
inserting the attribute and/or position information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data;
converting the mixed data, or the digital video stream data in the mixed data, into analog data to obtain converted mixed data; and sending the converted mixed data.
Optionally, after the extracting attribute and/or position information of the target, the method may further include:
capturing, from the digital video stream data, an image carrying the target;
the inserting the attribute and/or position information of the target into the digital video stream data according to the preset insertion mode to obtain mixed data may include:
inserting the captured image and the attribute and/or position information of the target into the digital video stream data according to the preset insertion mode to obtain mixed data.
Optionally, after the capturing, from the digital video stream data, an image carrying the target, the method may further include:
compressing the captured image to obtain a compressed image;
the inserting the captured image and the attribute and/or position information of the target into the digital video stream data according to the preset insertion mode to obtain mixed data includes:
inserting the compressed image and the attribute and/or position information of the target into the digital video stream data according to the preset insertion mode to obtain mixed data.
Optionally, after the capturing digital video stream data, the method further includes:
duplicating the digital video stream data to obtain two copies of the digital video stream data;
the analyzing the digital video stream data includes:
analyzing one of the two copies of the digital video stream data;
the capturing, from the digital video stream data, an image carrying the target includes:
capturing the image carrying the target from the other of the two copies of the digital video stream data.
Optionally, the inserting the attribute and/or position information of the target into the digital video stream data according to the preset insertion mode to obtain mixed data includes:
inserting the attribute and/or position information of the target into a blanking area of the digital video stream data to obtain mixed data.
Optionally, the converting the mixed data, or the digital video stream data in the mixed data, into analog data to obtain converted mixed data includes:
arranging the attribute and/or position information of the target inserted into the blanking area according to a preset arrangement to obtain arranged data;
converting the arranged data and the digital video stream data into data in a preset format;
performing digital-to-analog conversion on the data in the preset format to obtain converted mixed data.
Optionally, the converting the mixed data, or the digital video stream data in the mixed data, into analog data to obtain converted mixed data includes:
converting the digital video stream data in the mixed data into analog video stream data;
converting the image analysis data in the mixed data into low-frequency digital data, where the image analysis data includes the attribute and/or position information of the target.
Optionally, the sending the converted mixed data includes:
continuously sending a preset number of image frames of a second type, where an image frame of the second type carries video stream data and does not carry target data, and the target data includes the attribute and/or position information of the target;
taking the target data as data carried in the valid image area of an image frame of a first type, and sending the target data in the data sending manner of the valid image area, where the image frames of the first type and the image frames of the second type occupy the same transmission channel.
Optionally, the sending the converted mixed data includes:
determining whether the data amount of target data to be sent reaches a preset data amount threshold, where the target data includes the attribute and/or position information of the target;
if it does, taking the target data as data carried in the valid image area of an image frame of the first type, and sending the target data in the data sending manner of the valid image area;
if it does not, sending an image frame of the second type, where an image frame of the second type carries video stream data and does not carry target data, and the image frames of the first type and the image frames of the second type occupy the same transmission channel.
Optionally, before the sending the converted mixed data, the method further includes:
acquiring coaxial data to be sent;
taking the coaxial data as data carried in the blanking area of the image frame of the first type in which the target data is located, and sending the coaxial data in the data sending manner of the blanking area.
Optionally, the coaxial data further includes:
a coaxial data identifier indicating that the data in the valid image area of the image frame is target data.
Optionally, the sending the converted mixed data includes:
determining a first position of video stream data in the valid image area of an image frame, and determining a second position of target data in the valid image area of the image frame, where the target data includes the attribute and/or position information of the target;
taking the video stream data and the target data as data of the same image frame, and, in the data sending manner of the valid image area, sending the video stream data according to the first position and sending the target data according to the second position.
Optionally, before the sending the converted mixed data, the method further includes:
acquiring coaxial data to be sent, and determining a third position of the coaxial data in the blanking area of the image frame;
taking the coaxial data as data of the image frame in which the video stream data and the target data are located, and sending the coaxial data according to the third position in the data sending manner of the blanking area.
Optionally, the sending the converted mixed data includes:
determining a first position of video stream data in the valid image area of an image frame, and determining a second position of target data in the blanking area of the image frame, where the target data includes the attribute and/or position information of the target;
taking the video stream data and the target data as data of the same image frame, sending the video stream data according to the first position in the data sending manner of the valid image area, and sending the target data according to the second position in the data sending manner of the blanking area.
To achieve the above object, an embodiment of the present application further provides a data processing method applied to a server in a monitoring system, where the monitoring system further includes an analog camera coaxially connected to the server; the method includes:
receiving data sent by an analog camera to be processed;
separating the received data into video stream data and image analysis data;
processing the separated video stream data and the image analysis data respectively.
Optionally, after the receiving data sent by the analog camera to be processed, the method may further include:
determining whether the analog camera to be processed has an image analysis function; if it does, performing the step of separating the received data into video stream data and image analysis data;
if it does not, analyzing the received video stream data to obtain image analysis data, and processing the received video stream data and the obtained image analysis data respectively; where the image analysis data includes attribute and/or position information of a target present in the video stream data.
Optionally, before the receiving data sent by the analog camera to be processed, the method further includes:
sending an attribute request instruction to the analog cameras in the system, and receiving device attributes fed back by the analog cameras in the system;
recording the device attributes fed back by the analog cameras;
the determining whether the analog camera to be processed has an image analysis function includes:
searching the recorded device attributes for the device attribute fed back by the analog camera to be processed;
determining, according to the found device attribute, whether the analog camera to be processed has an image analysis function.
Optionally, the separating the received data into video stream data and image analysis data includes:
determining a blanking area and an image area in the received data according to a blanking area identifier;
reading the data in the blanking area to extract the image analysis data;
reading the data in the image area to extract the video stream data.
To achieve the above object, an embodiment of the present application further provides an analog camera, including a processor and a memory;
the memory is configured to store a computer program;
the processor is configured to implement, when executing the program stored in the memory, any one of the above data transmission methods applied to the analog camera side.
To achieve the above object, an embodiment of the present application further provides a server, including a processor and a memory;
the memory is configured to store a computer program;
the processor is configured to implement, when executing the program stored in the memory, any one of the above data processing methods applied to the server side.
To achieve the above object, an embodiment of the present application further provides a monitoring system, including any one of the above analog cameras and a server, where
the analog camera sends the converted mixed data to the server.
Optionally, the server includes: a sensing signal receiving chip, an image signal receiving chip, and a signal processing chip, where the sensing signal receiving chip and the image signal receiving chip are respectively connected to the signal processing chip;
the sensing signal receiving chip is configured to receive a sensing signal sent by a wireless sensor, and send the sensing signal to the signal processing chip;
the image signal receiving chip is configured to receive an image signal sent by an analog camera, and send the image signal to the signal processing chip;
the signal processing chip is configured to perform correlation processing on the sensing signal and the image signal.
By applying the embodiments of the present application, the analog camera includes an image analysis chip, which analyzes the collected digital video stream data, identifies a target present in the digital video stream data, and extracts attribute and/or position information of the target; the attribute and/or position information of the target is sent to the integrated chip, and the integrated chip inserts it into the digital video stream data according to a preset insertion mode to obtain mixed data. It can be seen that the analog camera in this solution realizes analysis and processing of the captured images.
Brief Description of the Drawings
In order to describe the technical solutions of the embodiments of the present application and the prior art more clearly, the drawings needed in the embodiments and the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; those of ordinary skill in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a first schematic structural diagram of an analog camera according to an embodiment of the present application;
FIG. 2 is a second schematic structural diagram of an analog camera according to an embodiment of the present application;
FIG. 3 is a third schematic structural diagram of an analog camera according to an embodiment of the present application;
FIG. 4 is a schematic flowchart of a data processing method applied to the analog camera side according to an embodiment of the present application;
FIG. 5 is a first schematic flowchart of a data transmission method applied to the analog camera side according to an embodiment of the present application;
FIG. 6 is a first schematic structural diagram of an image frame;
FIG. 7a to FIG. 7d are schematic diagrams of several arrangements of the two types of image frames according to an embodiment of the present application;
FIG. 7e is a schematic diagram of a transmission framework for the two types of image frames according to an embodiment of the present application;
FIG. 7f is a schematic diagram of a manner of sending video stream data and target data;
FIG. 8 is a second schematic flowchart of a data transmission method applied to the analog camera side according to an embodiment of the present application;
FIG. 9a is a second schematic structural diagram of an image frame according to an embodiment of the present application;
FIG. 9b is a schematic diagram of a transmission framework for image frames according to an embodiment of the present application;
FIG. 10 is a third schematic flowchart of a data transmission method applied to the analog camera side according to an embodiment of the present application;
FIG. 11a is a third schematic structural diagram of an image frame according to an embodiment of the present application;
FIG. 12 is a first schematic flowchart of a data processing method applied to the server side according to an embodiment of the present application;
FIG. 13 is a second schematic flowchart of a data processing method applied to the server side according to an embodiment of the present application;
FIG. 14 is a third schematic flowchart of a data processing method applied to the server side according to an embodiment of the present application;
FIG. 15 is a schematic structural diagram of a monitoring system according to an embodiment of the present application;
FIG. 16 is a first schematic structural diagram of a server according to an embodiment of the present application;
FIG. 17 is a second schematic structural diagram of a server according to an embodiment of the present application;
FIG. 18 is a third schematic structural diagram of a server according to an embodiment of the present application;
FIG. 19 is a fourth schematic structural diagram of a server according to an embodiment of the present application;
FIG. 20 is a fifth schematic structural diagram of a server according to an embodiment of the present application;
FIG. 21 is a fourth schematic structural diagram of an analog camera according to an embodiment of the present application;
FIG. 22 is a sixth schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some of the embodiments of the present application, rather than all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
To solve the above technical problem, embodiments of the present application provide an analog camera, a server, a monitoring system, and data transmission and processing methods. An analog camera provided by an embodiment of the present application is first described in detail below.
FIG. 1 is a first schematic structural diagram of an analog camera according to an embodiment of the present application, including:
an image capturing chip 100, an image analysis chip 200, an integrated chip 300, and a transmitting chip 400, where the image capturing chip 100 is connected to the image analysis chip 200 and the integrated chip 300 respectively, the image analysis chip 200 is connected to the integrated chip 300, and the integrated chip 300 is connected to the transmitting chip 400;
the image capturing chip 100 is configured to capture digital video stream data;
the image analysis chip 200 is configured to analyze the digital video stream data, identify a target present in the digital video stream data, and extract attribute and/or position information of the target; and send the attribute and/or position information of the target to the integrated chip 300;
the integrated chip 300 is configured to insert the attribute and/or position information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data, and send the mixed data to the transmitting chip 400;
the transmitting chip 400 is configured to convert the mixed data, or the digital video stream data in the mixed data, into analog data to obtain converted mixed data, and send the converted mixed data.
By applying the embodiment shown in FIG. 1 of the present application, the analog camera includes an image analysis chip, which analyzes the collected digital video stream data, identifies a target present in the digital video stream data, and extracts attribute and/or position information of the target; the attribute and/or position information of the target is sent to the integrated chip, and the integrated chip inserts it into the digital video stream data according to a preset insertion mode to obtain mixed data. It can be seen that the analog camera in this solution realizes analysis and processing of the captured images.
下面对图1所示实施例进行详细说明:
图像采集芯片100将光信号转化为图像数字信号,得到数字视频流数据。图像采集芯片100可以为图像sensor(传感器),或者,也可以为其他,具体不做限定。
作为一种实施方式,图像采集芯片100可以将数字视频流数据进行复制,得到两份数字视频流数据,一份发送到图像分析芯片200,另一份发送到集成芯片300。
图像分析芯片200对接收到的数字视频流数据进行分析,通过分析识别数字视频流数据中存在的目标。举例来说,可以利用人脸识别,识别出数字视频流数据中的人脸区域,或者,可以利用车牌识别,识别出数字视频流数据中的车牌,等等,具体分析方式不做限定。识别出目标后,可以提取目标的属性,比如,人脸特征、车牌号等,或者,也可以确定目标在视频图像中的位置信息。
为了方便描述,将图像分析芯片200分析得到的目标的属性和/或位置信息称为图像分析数据。图像分析芯片200将该图像分析数据发送给集成芯片 300。这样,集成芯片300既接收到了图像采集芯片100发送的数字视频流数据,又接收到了图像分析芯片200发送的图像分析数据。
集成芯片300根据预设插入模式,将该图像分析数据插入至数字视频流数据中,得到混合数据。插入模式有多种,比如,将图像分析数据插入数字视频流数据中的消隐区,或者,采用前半帧数字视频流数据、后半帧图像分析数据的插入模式,或者,采用n帧数字视频流数据、一帧图像分析数据的插入模式,等等,具体不做限定。
作为一种实施方式,集成芯片300还可以从数字视频流数据中抓取携带有该目标的图像;该目标与图像分析数据中的属性和/或位置信息指向的目标为同一目标。
集成芯片300与图像分析芯片200可以同时对数字视频流数据进行处理,这样,集成芯片300抓取的图像与图像分析芯片200分析的图像为同一张图像,因此,图像分析芯片200识别出的目标存在于集成芯片300抓取的图像中。
为了方便描述,本实施方式中,将图像分析芯片200分析得到的目标的属性和/或位置信息、以及集成芯片300抓取的图像都称为图像分析数据。然后,集成芯片300根据预设插入模式,将图像分析数据插入至数字视频流数据中,得到混合数据,并将混合数据发送到所述发送芯片400。
本实施方式中,混合数据中不仅包含目标的属性和/或位置信息,还包含携带有目标的图像,混合数据中携带的与目标相关联的信息更丰富。
作为一种实施方式,集成芯片抓取到图像、并将该图像发送至图像分析芯片后,可以对该图像进行压缩,得到压缩后的图像,根据预设插入模式,将压缩后的图像、目标的属性和/或位置信息插入至数字视频流数据中,得到混合数据,并将混合数据发送到所述发送芯片400。
应用这种实施方式,将图像进行压缩,混合数据占用的空间更小,模拟摄像机对混合数据进行发送时,占用的数据带宽更小。
上面内容中“图像采集芯片100可以将数字视频流数据进行复制,得到两 份数字视频流数据,一份发送到图像分析芯片200,另一份发送到集成芯片300”,作为另一种实施方式,图像采集芯片也可以不对数字视频流数据进行复制,仅将一份数字视频流数据发送到集成芯片300。
这种实施方式中,集成芯片300可以在数字视频流数据中抓取图像,将所抓取的图像发送给图像分析芯片200;图像分析芯片200接收该图像,对该图像进行分析,识别所述图像中存在的目标,提取所述目标的属性和/或位置信息;并将所述目标的属性和/或位置信息发送到集成芯片300。
集成芯片300可以根据预设插入模式,将所述目标的属性和/或位置信息插入至所述数字视频流数据中,得到混合数据;或者,集成芯片300可以根据预设插入模式,将抓取的图像、所述目标的属性和/或位置信息插入至所述数字视频流数据中,得到混合数据。
这种实施方式中,集成芯片300在将抓取的图像发送给图像分析芯片200后,可以将所抓取的图像进行压缩,得到压缩后的图像;并根据预设插入模式,将所述压缩后的图像、所述目标的属性和/或位置信息插入至所述数字视频流数据中,得到混合数据。这样,混合数据占用的空间更小,模拟摄像机对混合数据进行发送时,占用的数据带宽更小。
在上述几种实施方式中,集成芯片在数字视频流数据中插入的数据或有不同,比如,一些实施方式中,仅插入目标的属性和/或位置信息;在另一些实施方式中,插入目标的属性和/或位置信息、以及抓取的图像;在另一些实施方式中,插入目标的属性和/或位置信息、以及压缩后的图像;为了方便描述,可以将这些插入的数据都称为图像分析数据。
将图像分析数据插入数字视频流数据的方式有多种,比如,将图像分析数据插入数字视频流数据中的消隐区,或者,采用前半帧数字视频流数据、后半帧图像分析数据的插入模式,或者,采用n帧数字视频流数据、一帧图像分析数据的插入模式,等等,具体不做限定。
作为一种实施方式,集成芯片在将图像分析数据插入数字视频流数据后,还可以根据预设排列方式,对插入所述消隐区的图像分析数据进行排列,得到排列后数据;将所述排列后数据转换为预设格式的数据;将所述数字视频 流数据转换为相同格式的数据。
举例来说,可以将图像分析数据进行结构化排列,比如,可以确定图像分析数据的数据量的大小、数据类型等信息,将这些信息作为头部信息添加至图像分析数据的前面,添加了头部信息的图像分析数据即为排列后的数据。结构化排列的方式有多种,具体不做限定。
然后将排列后的数据转换为BT1120格式的数据;将数字视频流数据转换为BT1120格式的数据。或者,也可以将这两种数据转换为其他格式的数据,具体不做限定。
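上述"添加头部信息"的结构化排列过程,可以用如下最小示意说明(Python;其中头部格式取"4字节数据量+1字节数据类型、大端序",该格式为本示意假设的示例,并非BT1120等规范的约定):

```python
import struct

# 假设的头部格式:4 字节数据量 + 1 字节数据类型,大端序(仅为示意)
HEADER_FMT = ">IB"

def arrange(analysis_data: bytes, data_type: int) -> bytes:
    """将头部信息添加至图像分析数据之前,得到排列后数据。"""
    header = struct.pack(HEADER_FMT, len(analysis_data), data_type)
    return header + analysis_data

def parse(arranged: bytes):
    """从排列后数据中解析出数据类型和图像分析数据。"""
    size, data_type = struct.unpack_from(HEADER_FMT, arranged, 0)
    offset = struct.calcsize(HEADER_FMT)
    return data_type, arranged[offset:offset + size]
```

接收侧据头部中的数据量即可确定图像分析数据的边界,无需依赖分隔符。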
对于数字视频流数据来说,通常需要将其转换为模拟数据进行发送;而对于图像分析数据来说,也就是上述分析得到的“目标的属性和/或位置信息”、或者“目标的属性和/或位置信息、及抓取的图像”、或者“目标的属性和/或位置信息、及压缩后的图像”,可以将图像分析数据转换为模拟数据进行发送,也可以不进行数模转换,发送数字数据。
作为一种实施方式,发送芯片400可以将混合数据转换为模拟混合数据,发送该模拟混合数据。
这种实施方式中,将数字视频流数据及图像分析数据都进行数模转换,发送转换后的模拟数据。
作为另一种实施方式,发送芯片400可以将混合数据中的数字视频流数据转换为模拟数字视频流数据;将混合数据中的图像分析数据转换为低频数字数据;发送该模拟数字视频流数据及该低频数字数据,所述图像分析数据包括所述目标的属性和/或位置信息。
根据前面内容描述,图像分析数据可以包括:“目标的属性和/或位置信息”、或者“目标的属性和/或位置信息、及抓取的图像”、或者“目标的属性和/或位置信息、及压缩后的图像”。
应用这种实施方式,一方面仅将数字视频流数据进行数模转换,不对图像分析数据进行数模转换,这样,可以避免对图像分析数据进行数模转换、 模数转换造成的数据丢失、或其他未知错误。另一方面,将图像分析数据转换为低频数字数据进行传输,消耗能量较小。
在本申请实施例中,集成芯片300可以包含多个集成设置的芯片,作为一种实施方式,可以如图2所示,发送芯片400集成设置在集成芯片300中,或者说,集成芯片300包含发送芯片400。
或者,在其他实施方式中,发送芯片400也可以单独设置,具体不做限定。
作为一种实施方式,可以如图3所示,集成芯片300中包含图像处理芯片500和插入芯片600,
图像处理芯片500,用于对所述数字视频流数据进行色彩和/或亮度处理;
插入芯片600,用于根据预设插入模式,将所述目标的属性和/或位置信息插入至处理后的数字视频流数据中,得到混合数据。
举例来说,图像处理芯片500可以为ISP(Image Signal Processing,图像信号处理)芯片,或者,也可以为MCU(Microcontroller Unit,微控制单元)。
图像处理芯片500对数字视频流数据进行色彩和/或亮度处理,可以包括:AEC(Automatic Exposure Control,自动曝光控制)、AGC(Automatic Gain Control,自动增益控制)、AWB(Automatic white balance,自动白平衡)、色彩校正等处理,具体不做限定。
集成芯片中还可以集成其他芯片,或者说,集成芯片还可以包含其他芯片,具体不做限定。将多个芯片集成设置,可以减少占用的模拟摄像机中的空间,使模拟摄像机结构更紧凑,体积更小。
本申请实施例还提供一种模拟摄像机侧的数据处理方法,如图4所示,该方法包括:
S401:采集数字视频流数据。
S402:对所述数字视频流数据进行分析,识别所述数字视频流数据中存在的目标,提取所述目标的属性和/或位置信息。
S403:根据预设插入模式,将所述目标的属性和/或位置信息插入至所述 数字视频流数据中,得到混合数据。
S404:将所述混合数据、或者所述混合数据中的数字视频流数据转换为模拟数据,得到转换后的混合数据。
S405:发送所述转换后的混合数据。
应用本申请图4所示实施例,模拟摄像机对采集到的数字视频流数据进行分析,识别数字视频流数据中存在的目标,提取目标的属性和/或位置信息;根据预设插入模式,将目标的属性和/或位置信息插入至数字视频流数据中,得到混合数据;可见,本方案中的模拟摄像机实现了对采集到的图像进行分析处理。
下面对图4所示实施例进行详细说明:
S401:采集数字视频流数据。
模拟摄像机中通常设置有图像采集芯片,图像采集芯片可以将光信号转化为图像数字信号,得到数字视频流数据。
S402:对所述数字视频流数据进行分析,识别所述数字视频流数据中存在的目标,提取所述目标的属性和/或位置信息。
举例来说,可以利用人脸识别,识别出数字视频流数据中的人脸区域,或者,可以利用车牌识别,识别出数字视频流数据中的车牌,等等,具体分析方式不做限定。识别出目标后,可以提取目标的属性,比如,人脸特征、车牌号等,或者,也可以确定目标在视频图像中的位置信息。
S403:根据预设插入模式,将所述目标的属性和/或位置信息插入至所述数字视频流数据中,得到混合数据。
作为一种实施方式,在S402之后,还可以在所述数字视频流数据中抓取携带有该目标的图像。该目标与图像分析数据中的属性和/或位置信息指向的目标为同一目标。
举例来说,在S401之后,可以将采集到的数字视频流数据进行复制,得到两份数字视频流数据。这样,S402包括:对所述两份数字视频流数据中的一份数字视频流数据进行分析,识别该一份数字视频流数据中存在的目标, 提取所述目标的属性和/或位置信息。另外,在所述两份数字视频流数据中的另一份数字视频流数据中抓取携带有所述目标的图像。
这种实施方式中,S403包括:根据预设插入模式,将所抓取的图像、所述目标的属性和/或位置信息插入至所述数字视频流数据中,得到混合数据。
可见,本实施方式中,混合数据中不仅包含目标的属性和/或位置信息,还包含携带有目标的图像,混合数据中携带的与目标相关联的信息更丰富。
作为一种实施方式,在数字视频流数据中抓取携带有该目标的图像之后,还可以对所抓取的图像进行压缩,得到压缩后的图像;这种实施方式中,S403包括:根据预设插入模式,将所述压缩后的图像、所述目标的属性和/或位置信息插入至所述数字视频流数据中,得到混合数据。
应用这种实施方式,将图像进行压缩,混合数据占用的空间更小,模拟摄像机对混合数据进行发送时,占用的数据带宽更小。
为了方便描述,将上述分析得到的“目标的属性和/或位置信息”、或者“目标的属性和/或位置信息、及抓取的图像”、或者“目标的属性和/或位置信息、及压缩后的图像”都称为图像分析数据。上述不同的实施方式中,图像分析数据不同。这样,S403包括:根据预设插入模式,将图像分析数据插入至所述数字视频流数据中,得到混合数据。
插入模式有多种,比如,将图像分析数据插入数字视频流数据中的消隐区,或者,采用前半帧数字视频流数据、后半帧图像分析数据的插入模式,或者,采用n帧数字视频流数据、一帧图像分析数据的插入模式,等等,具体不做限定。
S404:将所述混合数据、或者所述混合数据中的数字视频流数据转换为模拟数据,得到转换后的混合数据。
对于数字视频流数据来说,通常需要将其转换为模拟数据进行发送;而对于图像分析数据来说,可以将其转换为模拟数据进行发送,也可以不进行 数模转换,发送数字数据。因此,在本实施例中,可以仅将混合数据中的数字视频流数据转换为模拟数据,也可以将混合数据全部转换为模拟数据。
作为一种实施方式,S404可以包括:将所述混合数据中的数字视频流数据转换为模拟视频流数据;将所述混合数据中的图像分析数据转换为低频数字数据。所述图像分析数据也就是上述分析得到的“目标的属性和/或位置信息”、或者“目标的属性和/或位置信息、及抓取的图像”、或者“目标的属性和/或位置信息、及压缩后的图像”。
应用这种实施方式,一方面仅将数字视频流数据进行数模转换,不对图像分析数据进行数模转换,这样,可以避免对图像分析数据进行数模转换、模数转换造成的数据丢失、或其他未知错误。另一方面,将图像分析数据转换为低频数字数据进行传输,消耗能量较小。
作为一种实施方式,S404可以包括:
根据预设排列方式,对插入所述消隐区的图像分析数据进行排列,得到排列后数据;将所述排列后数据、以及所述数字视频流数据转换为预设格式的数据;将所述预设格式的数据进行数模转换,得到转换后的混合数据。
举例来说,可以将图像分析数据进行结构化排列,比如,可以确定图像分析数据的数据量的大小、数据类型等信息,将这些信息作为头部信息添加至图像分析数据的前面,添加了头部信息的图像分析数据即为排列后的数据。结构化排列的方式有多种,具体不做限定。
然后将排列后的数据转换为BT1120格式的数据;将数字视频流数据转换为BT1120格式的数据。或者,也可以将这两种数据转换为其他格式的数据,具体不做限定。
S405:发送所述转换后的混合数据。
作为一种实施方式,模拟摄像机可以将上述得到的转换后的混合数据发送至相连的服务器。
本申请实施例还提供了多种数据传输方法,以传输上述转换后的混合数据。下面通过具体实施例,对第一种数据传输方法进行详细说明。如图5所示,第一种数据传输方法包括如下步骤:S501~S502:
S501:连续发送预设数量个第二种图像帧。第二种图像帧为携带视频流数据且不携带目标数据的图像帧,目标数据包括上述得到的目标的属性和/或位置信息。
模拟摄像机发送的图像帧可以如图6所示,该图像帧包括有效图像区和消隐区。本实施例中,第一种图像帧与第二种图像帧为不同种类的图像帧,第二种图像帧携带视频流数据且不携带目标数据,而第一种图像帧有效图像区携带目标数据。有效图像区也可以称为数据区。
本实施例中,视频流数据也就是图像数据。模拟摄像机可以采集图像数据(视频流数据)并将采集的图像数据发送至服务器。
同轴数据为服务器和模拟摄像机之间交互的通知信息。该同轴数据也可以称为PTZ数据。当模拟摄像机和服务器分别为视频生成端和视频接收端时,视频生成端和视频接收端之间交互的同轴数据可以包括视频生成端向视频接收端发送的同轴发送数据,也可以包括视频接收端向视频生成端发送的同轴接收数据。例如,视频接收端为硬盘录像机(Digital Video Recorder,DVR)时,同轴数据可以包括模拟摄像机向DVR发送的拍摄模式信息、升级时准备就绪的信息等,也可以包括模拟摄像机与DVR之间的握手数据,该握手数据可以用于发送表示设备类型、图像分辨率的数据等;同轴数据还可以包括DVR向模拟摄像机发送的控制信息,该控制信息可以包括针对摄像机的控制指令,例如图像参数调节指令、摄像机光圈调节指令、摄像机旋转调节指令、分辨率切换指令、远程升级数据指令等。同轴数据还可以包括图像数据在图像帧中的位置以及目标数据在图像帧中的位置等信息。
图像帧中可以包括同轴数据,也可以不包括同轴数据。
S502:将目标数据作为第一种图像帧的有效图像区携带的数据,采用有效图像区的数据发送方式,发送目标数据;其中,所述第一种图像帧与所述第二种图像帧占用同一传输通道。
上述“转换后的混合数据”包括视频流数据和目标数据,因此,通过S501和S502,实现了对“转换后的混合数据”的发送。
其中,传输通道可以为同轴线缆、双绞线或其他传输材料,本申请对此不做具体限定。同一传输通道可以理解为同一根线。
在本实施例中,模拟摄像机与服务器之间可以交替地发送第一种图像帧和第二种图像帧。第一种图像帧的有效图像区用于发送目标数据,第二种图像帧的有效图像区用于发送图像数据,也就是视频流数据。这样,能够通过同一传输通道,实现对视频流数据和目标数据的发送。
模拟摄像机与服务器之间传输视频流数据时可以使用同轴线缆或双绞线。为了发送目标数据,同时避免另外布线,可以通过与传输视频流数据时使用的同一传输通道进行传输。
由上述内容可知,本实施例可以获取待发送的目标数据,通过与传输视频流数据时使用的同一传输通道,将目标数据作为第一种图像帧的有效图像区携带的数据,采用有效图像区的数据发送方式,发送该目标数据。其中,第一种图像帧的种类与携带视频流数据且不携带目标数据的第二种图像帧的种类不同。因此,本实施例提供的方案,能够实现模拟摄像机将目标数据通过与传输视频流数据时使用的同一传输通道发送至服务器,无需另外布线,从而能够节省设备成本。
在本实施例中,第一种图像帧的消隐区也可以用来传输目标数据。例如,当目标数据的数据量很大时,第一种图像帧的有效图像区无法发送全部的目标数据时,可以通过消隐区中的部分位置发送剩余的目标数据。该部分位置可以为消隐区中除了用于发送同轴数据的位置之外的位置。第一种图像帧和第二种图像帧可以是按照预设规律间隔性地发送的,也可以是随机发送的。详见以下实施例。
在本实施例中,S502可以包括:确定是否已经连续发送预设数量个第二种图像帧,当确定已经连续发送预设数量个第二种图像帧时,将所述目标数据作为第一种图像帧的有效图像区携带的数据,采用有效图像区的数据发送方式,发送所述目标数据;其中,所述第一种图像帧与所述第二种图像帧占用同一传输通道。
本实施例中,用于发送目标数据的第一种图像帧的数量可以是预先设定的,也可以是根据目标数据的数据量确定的。用于发送目标数据的第一种图像帧的数量,可以为一个,也可以为多个。
当目标数据发送完成时,还可以连续发送预设数量个第二种图像帧。
在本实施例中,可以每发送N个携带图像数据的第二种图像帧,发送M个携带目标数据的第一种图像帧,再发送N个携带图像数据的第二种图像帧,依此重复。其中,N的值可以固定也可以不固定,M的值可以固定也可以不固定。
第一种情况中,N为固定的值,M为不固定的值。具体的,M可以为:目标数据的数据量除以单个第一种图像帧的有效图像区的数据量之后得到的数量。例如,已知单个第一种图像帧的有效图像区可以发送2000行数据,每行可以传输100字节数据,则单个第一种图像帧的有效图像区大概可以传输0.2MB的数据。当目标数据的数据量为2MB时,则传输这些目标数据需要采用M个第一种图像帧,M=2MB/0.2MB=10。因此,需要10个第一种图像帧传输这些目标数据。
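上述M的计算可以示意为(参数取文中示例值;MB按10^6字节近似计算,数据量不能整除单帧容量时向上取整):

```python
import math

def frames_needed(target_bytes: int, rows_per_frame: int, bytes_per_row: int) -> int:
    """计算传输目标数据所需的第一种图像帧数量 M(向上取整)。"""
    frame_capacity = rows_per_frame * bytes_per_row  # 单帧有效图像区可传输的数据量
    return math.ceil(target_bytes / frame_capacity)

# 文中示例:每帧 2000 行、每行 100 字节,单帧约可传输 0.2MB;
# 目标数据为 2MB 时,M = 2MB / 0.2MB = 10。
```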
作为一个例子,图7a为第一种图像帧和第二种图像帧在发送时的一种排列示意图。其中,空心矩形代表第二种图像帧,实心矩形代表第一种图像帧。N的值是固定的5帧,M的值是不固定的,可能为2、1、4、1、3等值。
第二种情况中,N为固定的值,M为固定的值。在这种实施方式中,图像数据和目标数据以固定的间隔进行发送。采用这种方式发送的第一种图像帧和第二种图像帧是均匀排列的,在服务器接收图像数据时,可以保证图像预览时的流畅度。
作为一个例子,图7b为第一种图像帧和第二种图像帧在发送时的另一种排列示意图。其中,空心矩形代表第二种图像帧,实心矩形代表第一种图像帧。N的值是固定的5帧,M的值为固定的2帧。
第三种情况中,第二种图像帧的数量N和第一种图像帧的数量M均可以是随机的,即N为不固定的值,M也为不固定的值。在这种情况中,获取目标数据的时刻可以是不固定的,每次获取的目标数据的数据量也可以是不固定的。两种图像帧的排列可以是杂乱无章的。
作为一个例子,图7c为第一种图像帧和第二种图像帧在发送时的又一种排列示意图。其中,空心矩形代表第二种图像帧,实心矩形代表第一种图像帧。可见,该图7c中N和M的值均为随机的。
第四种情况中,N为不固定的值,M为固定的值。在这种情况中,获取目标数据的时刻可以是不固定的,但每次获取的目标数据的数据量可以是固定的。
作为一个例子,图7d为第一种图像帧和第二种图像帧在发送时的再一种排列示意图。其中,空心矩形代表第二种图像帧,实心矩形代表第一种图像帧。可见,该图7d中N的值为不固定的,M的值为固定的2。
当N和/或M的值为不固定的时,可以通过在图像帧中添加同轴数据的方式标识该图像帧为第一种图像帧,以使服务器从接收的图像帧中确定第一种图像帧。具体的,同轴数据可以包括表示图像帧的有效图像区中的数据为目标数据的数据,根据该数据可以确定该图像帧为第一种图像帧。
在本申请的另一实施例中,发送所述转换后的混合数据,可以包括:
判断待发送的目标数据的数据量是否达到预设数据量阈值;所述目标数据包括所述目标的属性和/或位置信息;
如果达到,将所述目标数据作为第一种图像帧的有效图像区携带的数据,采用有效图像区的数据发送方式,发送所述目标数据;
如果未达到,发送第二种图像帧,所述第二种图像帧为携带视频流数据且不携带目标数据的图像帧,其中,所述第一种图像帧与所述第二种图像帧占用同一传输通道。
其中,预设数据量阈值可以是预设值,例如可以是单个图像帧的有效图像区能够存储的最大数据量,也可以为该最大数据量的a倍,a为正整数。或者,预设数据量阈值也可以为其他任意值。
当预设数据量阈值为最大数据量时,可以每次采用一个第一种图像帧发送目标数据。当预设数据量阈值为最大数据量的a倍时,可以每次连续发送a个第一种图像帧。如果a值大于指定数值,为了保证服务器侧图像预览的流畅性,不能长时间不发送第二种图像帧,这种情况下第一种图像帧可以与第二种图像帧间隔性发送。
目标数据在有效图像区中的位置,可以是固定的,也可以是不固定的。当目标数据在有效图像区中的位置不固定时,可以通过目标数据携带的表示目标数据起始位置的头标识,以及表示目标数据末尾位置的尾标识来确定目标数据的位置。其中,头标识可以为第一预设比特串,尾标识可以为第二预设比特串。目标数据的前第一预设数量个比特位可以为第一预设比特串,目标数据的后第二预设数量个比特位可以为第二预设比特串。
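上述通过头标识、尾标识定位目标数据的方式可以示意为(标识的具体比特串为假设值,且此示意假设目标数据本身不包含尾标识):

```python
HEAD = b"\xaa\x55"   # 第一预设比特串(头标识),示例值
TAIL = b"\x55\xaa"   # 第二预设比特串(尾标识),示例值

def embed(region: bytearray, target: bytes, offset: int) -> None:
    """将带头尾标识的目标数据写入有效图像区中的任意位置。"""
    payload = HEAD + target + TAIL
    region[offset:offset + len(payload)] = payload

def extract(region: bytes) -> bytes:
    """依据头标识与尾标识,在有效图像区中定位并提取目标数据。"""
    start = region.index(HEAD) + len(HEAD)
    end = region.index(TAIL, start)
    return region[start:end]
```

接收侧无需预先约定目标数据的固定位置,仅依靠两个标识即可完成提取。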
在本申请的另一实施例中,在发送所述转换后的混合数据之前,该方法还可以包括:获取待发送的同轴数据;这样,将同轴数据作为与目标数据所在的第一种图像帧的消隐区携带的数据,采用消隐区的数据发送方式,发送同轴数据。
作为一个例子,图7e为模拟摄像机向服务器发送两种图像帧的一种传输架构图。其中,第一种图像帧包括目标数据和同轴数据,第二种图像帧包括视频流数据(图像数据)和同轴数据。其中,同轴数据是可选的,这两种图像帧也可以不包括同轴数据。
可见,本实施例可以将同轴数据和目标数据均通过第一种图像帧发送至服务器,能够使传输的数据种类更丰富,数据传输效率更高。
在上述实施例中,同轴数据还可以包括:表示图像帧的有效图像区中的数据为目标数据的同轴数据标识。这样,服务器可以根据同轴数据中的同轴数据标识确定图像帧的有效图像区的数据为目标数据。在另一实施方式中,同轴数据也可以包括表示图像帧的有效图像区的数据不为目标数据的标识。有效图像区的数据不为目标数据时,有效图像区的数据可以为图像数据,也可以为其他数据或没有数据等。例如,同轴数据的指定位置的数据为1时,表示图像帧的有效图像区的数据为目标数据,当该指定位置的数据为0时,表示图像帧的有效图像区的数据不为目标数据。当服务器接收到图像帧时,可以从该图像帧中获取同轴数据,根据同轴数据的指定位置的数据确定图像 帧的有效图像区的数据是否为目标数据。
当第一种图像帧和第二种图像帧排列不规则时,即N和/或M的值不固定时,为了能够确定哪个图像帧携带目标数据,可以根据同轴数据是否包含表示图像帧的有效图像区中的数据为目标数据的数据来判断,当确定包含表示图像帧的有效图像区中的数据为目标数据的数据时,认为图像帧为第一种图像帧,否则,认为图像帧为第二种图像帧。
其中,表示图像帧的有效图像区中的数据为目标数据的同轴数据标识,可以以指定标识来实现,也可以通过其他方式实现,本申请对此不作具体限定。
在上述实施例中,同轴数据还可以包括表示目标数据在有效图像区的位置的数据。目标数据可以占满有效图像区,也可以不占满有效图像区。当目标数据占满有效图像区时,同轴数据中表示目标数据在有效图像区的位置的数据可以为有效图像区的位置。当目标数据没有占满有效图像区时,同轴数据中表示目标数据在有效图像区的位置的数据可以为目标数据的实际位置。服务器可以根据同轴数据中的数据确定有效图像区中目标数据的位置,从而更准确地获取目标数据。
可见,本实施例中同轴数据可以包括表示图像帧的有效图像区中的数据为目标数据的同轴数据标识,根据该同轴数据标识确定图像帧属于第一种图像帧或第二种图像帧,能够使服务器更准确地获取目标数据。
本实施例中,为了能够实现采用完整的一图像帧传输目标数据,可以提高传输参数,以使保证对图像数据和目标数据的传输。例如,原来的传输参数为2MP25,该参数可以理解为2百万像素的图像每秒传输25帧。原来以2MP25传输参数可以实现对图像数据的传输。为了传输目标数据,可以将2MP25提高到2MP30。这样,每传输30个携带图像数据的图像帧,就会有5个图像帧是不需要存储图像数据的。这些闲置的图像帧就可以用来发送目标数据。
作为一个例子,图7f中给出了分别携带视频流数据和目标数据的图像帧的一种发送方式。
本申请实施例还提供了第二种数据传输方法,以传输上述转换后的混合数据。下面通过具体实施例,对第二种数据传输方法进行详细说明。如图8所示,该方法包括如下步骤:S801~S802:
S801:确定视频流数据在图像帧的有效图像区中的第一位置,以及确定目标数据在图像帧的有效图像区中的第二位置,所述目标数据包括所述目标的属性和/或位置信息。
其中,目标数据为不同于视频流数据和同轴数据的数据。同轴数据为服务器和模拟摄像机之间交互的通知信息,该同轴数据也可以称为PTZ数据。当模拟摄像机和服务器分别为视频生成端和视频接收端时,视频生成端和视频接收端之间交互的同轴数据可以包括视频生成端向视频接收端发送的同轴发送数据,也可以包括视频接收端向视频生成端发送的同轴接收数据。例如,当视频接收端为硬盘录像机(Digital Video Recorder,DVR)时,同轴数据可以包括模拟摄像机向DVR发送的拍摄模式信息、升级时准备就绪的信息等,也可以包括模拟摄像机与DVR之间的握手数据,该握手数据可以用于发送表示设备类型、图像分辨率的数据等;同轴数据还可以包括DVR向模拟摄像机发送的控制信息,该控制信息可以包括针对模拟摄像机的控制指令,例如图像参数调节指令、摄像机光圈调节指令、摄像机旋转调节指令、分辨率切换指令、远程升级数据指令等。同轴数据还可以包括视频流数据在图像帧中的位置以及目标数据在图像帧中的位置等信息。
图像帧中可以包括同轴数据,也可以不包括同轴数据。
S802:将视频流数据以及目标数据作为同一图像帧的数据,采用有效图像区的数据发送方式,按照第一位置发送视频流数据,按照第二位置发送目标数据。
上述“转换后的混合数据”包括视频流数据和目标数据,因此,通过S801和S802,实现了对“转换后的混合数据”的发送。
在S801中,确定视频流数据在图像帧的有效图像区中的第一位置时,可以将第一预设位置确定为视频流数据在图像帧的有效图像区中的第一位置,预设位置包括起始位置和末尾位置。也可以根据视频流数据的数据量,确定 视频流数据在图像帧的有效图像区中的第一位置。
根据视频流数据的数据量,确定视频流数据在图像帧的有效图像区中的第一位置时,具体可以为,将视频流数据在图像帧的有效图像区中的第一位置确定为:从第一预设初始位置开始,到第一末尾位置;其中,第一末尾位置为:第一预设初始位置加视频流数据的数据量后得到的位置。第一预设初始位置和第一末尾位置均位于有效图像区。
在S801中,确定目标数据在有效图像区中的第二位置时,可以将第二预设位置确定为目标数据在有效图像区中的第二位置。也可以根据目标数据的数据量,确定目标数据在有效图像区中的第二位置。
根据目标数据的数据量,确定目标数据在有效图像区中的第二位置时,具体可以为,将目标数据在有效图像区中的第二位置确定为:从第二预设初始位置开始,到第二末尾位置;其中,第二末尾位置为:第二预设初始位置加目标数据的数据量后得到的位置。第二预设初始位置和第二末尾位置均位于有效图像区。
上述第二位置可以是图像帧有效图像区的固定位置,也可以是不固定的位置。当有效图像区中视频流数据的位置是固定位置时,目标数据在有效图像区的第二位置可以是有效图像区中除视频流数据的位置之外的位置。例如,第二位置可以为有效图像区中除视频流数据的位置之外的部分区域的位置,也可以为有效图像区中除视频流数据的位置之外的全部区域的位置,本申请对此不做具体限定。
本实施例中,视频流数据也就是图像数据,模拟摄像机可以采集图像数据(视频流数据)并将采集的图像数据发送至服务器。
为了能够在有效图像区中同时传输图像数据和目标数据,可以提高模拟摄像机与服务器传输的图像帧的分辨率规格,可以使用高规格的传输频率来传输低规格的分辨率。例如,可以使用4MP30的传输参数来传输原来用2MP30的传输参数传输的数据。其中,传输参数xMPy可以理解为,每秒传输y帧x百万像素的数据。在使用4MP30的传输参数传输数据时,有效图像区传输的行数增加了,每一行能够传输的数据量也增加了,因此在传输图像数据时, 可以使有效图像区出现大片的闲置区,这些闲置区可以用来存储目标数据。
当第二位置为固定的位置时,服务器和模拟摄像机可以预先约定第二位置,从而使服务器可以按照约定的第二位置获取目标数据,提高获取目标数据时的准确性。
当第二位置为不固定的位置时,目标数据可以包括表示目标数据的起始位置的头标识,以及表示目标数据的末尾位置的尾标识。这样,当服务器在获取目标数据时,可以根据上述头标识确定目标数据的起始位置,根据上述尾标识确定目标数据的末尾位置,根据确定的目标数据的起始位置和目标数据的末尾位置,从有效图像区获取目标数据,从而在第二位置不固定时能够准确地获取目标数据。这种实施方式中目标数据发送时的灵活性比较大。
其中,头标识可以为第一预设比特串,尾标识可以为第二预设比特串。目标数据的前第一预设数量个比特位可以为第一预设比特串,目标数据的后第二预设数量个比特位可以为第二预设比特串。
在图像帧中,有效图像区的数据量远大于消隐区的数据量。如果将有效图像区的一部分分配给目标数据,则目标数据可以为数据量很大的数据。
在本实施例中,将视频流数据和目标数据通过同一图像帧发送至后端的电子设备,可以通过同一线缆实现对上述“转换后的混合数据”的发送。
在发送上述图像帧时,可以将图像帧中的数据逐行地以数据流的形式进行发送,也可以将图像帧中的数据逐列地以数据流的形式进行发送。
由上述内容可知,本实施例确定视频流数据在图像帧的有效图像区中的第一位置,以及确定目标数据在图像帧的有效图像区中的第二位置,将视频流数据以及目标数据作为同一图像帧的数据,按照第一位置发送视频流数据,按照第二位置发送目标数据。因此,本实施例可以实现将上述“转换后的混合数据”以同一图像帧发送至电子设备,无需另外布线,能够节省设备成本。
在模拟摄像机向服务器发送的大量图像帧中,可以在所有图像帧中均携带目标数据,也可以在部分图像帧中携带目标数据。当获取到目标数据时,可以将有效图像区携带视频流数据和目标数据的图像帧发送至服务器。当没有获取到目标数据时,可以将有效图像区携带视频流数据、有效图像区不携 带目标数据的图像帧发送至服务器。
在本申请的另一实施例中,在发送视频流数据和目标数据之前,该方法还可以包括以下步骤1~步骤2:
步骤1:获取待发送的同轴数据,以及确定同轴数据在图像帧的消隐区的第三位置。
本实施例中,图像帧中包括视频流数据、目标数据和同轴数据。
确定同轴数据在图像帧的消隐区的第三位置时,可以包括:将图像帧的消隐区中的第三预设位置确定为同轴数据的第三位置。例如,可以将消隐区中预设的第2行到第4行的位置确定为第三位置。在另一种实施方式中,消隐区中还可以包括服务器向模拟摄像机发送的同轴数据的位置,该位置可以为消隐区中不同于第三位置的位置。
消隐区包括场消隐区和行消隐区。第三位置可以位于场消隐区,也可以位于行消隐区,也可以一部分在场消隐区,另一部分在行消隐区。在一种具体实施方式中,场消隐区的可存储数据量大于行消隐区的可存储数据量,因此可以从场消隐区中确定第三位置,以提高可存储的同轴数据的数据量。
当第二位置不是固定位置时,获取待发送的同轴数据的步骤,可以包括:获取待发送的包括表示第二位置的数据的同轴数据。
由于同轴数据的第三位置可以为固定位置,服务器和模拟摄像机可以预先约定第三位置,从而在服务器获取同轴数据时,可以按照约定的第三位置获取。在获取同轴数据之后,可以根据同轴数据中包括的表示第二位置的数据,确定第二位置,从第二位置中获取目标数据,提高获取目标数据时的准确性。
在本实施例中,同轴数据还可以包括除了表示第二位置的数据之外的其他数据,本申请对此不作具体限定。
步骤2:将同轴数据作为与视频流数据和目标数据所在图像帧的数据,采用消隐区的数据发送方式,按照第三位置发送同轴数据。
作为一个例子,图9a为图像帧的有效图像区和消隐区所对应的数据的一种示意图。其中,中间实线矩形框区为有效图像区,有效图像区由虚线分割成两部分,其中一部分为视频流数据的位置所在区域,另一部分为目标数据的位置所在区域。有效图像区之外的部分为消隐区,消隐区为同轴数据的位置所在区域。
作为一个例子,图9b为模拟摄像机向服务器发送图像帧的一种传输架构图。其中,一个图像帧中包括视频流数据、同轴数据和目标数据。
可见,本实施例可以获取同轴数据,将视频流数据、目标数据和同轴数据作为同一图像帧的数据发送至服务器,使服务器通过同一图像帧接收待发送视频流数据、目标数据和同轴数据,提高数据传输效率。
在上述实施例的一种具体实施方式中,表示第二位置的数据在图像帧的消隐区中的位置,可以位于第二位置之前。
本实施例具体可以为,第三位置位于第二位置之前,这样可以保证表示第二位置的数据在图像帧的消隐区的位置位于第二位置之前。
例如,在图9a中,同轴数据可以位于有效图像区上方的消隐区中,这样表示第二位置的数据在图像帧的消隐区中的位置位于有效图像区中第二位置之前。
如果服务器在接收图像帧携带的数据时,按照从前向后的数据获取方式获取数据,则使表示第二位置的数据在图像帧的消隐区中的位置位于第二位置之前,能够使服务器先获取同轴数据中表示第二位置的数据,然后可以根据该表示第二位置的数据从有效图像区中获取目标数据,从而提高获取目标数据的效率。
在相关技术中,视频流数据可以以模拟信号的方式叠加在有效图像区,而同轴数据可以以数字信号的方式叠加在消隐区。每种规格的分辨率图像均存在其对应的传输参数,较高分辨率的图像可以使用较高的传输参数。传输参数可以为2MP30,该参数可以理解为每秒传输30帧200万像素的图像。在本实施例中,可以使用更高规格的传输参数来传输较低分辨率的图像,例如可以使用4MP30的传输参数来传输原来可以用2MP30的传输参数来传输的图像帧。这样,图像帧的有效图像区会存在大片的闲置区,这些闲置区可以用来 传输目标数据。
在本实施例中,同轴数据仍然保持原来的传输方式不变,而视频流数据占用有效图像区的一部分区域,目标数据占用有效图像区的另一部分区域。模拟摄像机可以与服务器约定目标数据在图像帧中的存储位置。或者,也可以将目标数据在图像帧中的存储位置放置在同轴数据中。服务器可以比较容易地从同轴数据中解析出该存储位置。
本实施例提供的数据传输方式,可以使每一图像帧均存在相应的目标数据,因此数据的实时性和同步性较好,可存储的数据量也比较大。本实施例也不限制传输材料,传输材料可以为同轴线缆、双绞线或其他材料,无需另外布线,减少了设备成本。
本申请实施例还提供了第三种数据传输方法,以传输上述转换后的混合数据。下面通过具体实施例,对第三种数据传输方法进行详细说明。如图10所示,该方法包括如下步骤:S1001~S1002:
S1001:确定视频流数据在图像帧的有效图像区中的第一位置,以及确定目标数据在图像帧的消隐区中的第二位置,所述目标数据包括所述目标的属性和/或位置信息。
其中,目标数据为不同于视频流数据和同轴数据的数据。同轴数据为服务器和模拟摄像机之间交互的通知信息,该同轴数据也可以称为PTZ数据。当模拟摄像机和服务器分别为视频生成端和视频接收端时,视频生成端和视频接收端之间交互的同轴数据可以包括视频生成端向视频接收端发送的同轴发送数据,也可以包括视频接收端向视频生成端发送的同轴接收数据。例如,视频接收端为硬盘录像机(Digital Video Recorder,DVR)时,同轴数据可以包括模拟摄像机向DVR发送的拍摄模式信息、升级时准备就绪的信息等,也可以包括模拟摄像机与DVR之间的握手数据,该握手数据可以用于发送表示设备类型、图像分辨率的数据等。同轴数据还可以包括DVR向模拟摄像机发送的控制信息等,该控制信息可以包括针对摄像机的控制指令,例如图像参数调节指令、摄像机光圈调节指令、摄像机旋转调节指令、分辨率切换指令、 远程升级数据指令等。同轴数据还可以包括视频流数据在图像帧中的位置以及目标数据在图像帧中的位置等信息。
图像帧中可以包括同轴数据,也可以不包括同轴数据。
S1002:将视频流数据以及目标数据作为同一图像帧的数据,采用有效图像区的数据发送方式,按照第一位置发送视频流数据,以及采用消隐区的数据发送方式,按照第二位置发送目标数据。
上述“转换后的混合数据”包括视频流数据和目标数据,因此,通过S1001和S1002,实现了对“转换后的混合数据”的发送。
在S1001中,确定视频流数据在图像帧的有效图像区中的第一位置时,可以将第一预设位置确定为视频流数据在图像帧的有效图像区中的第一位置,预设位置包括起始位置和末尾位置。也可以根据视频流数据的数据量,确定视频流数据在图像帧的有效图像区中的第一位置。
根据视频流数据的数据量,确定视频流数据在图像帧的有效图像区中的第一位置时,具体可以为,将视频流数据在图像帧的有效图像区中的第一位置确定为:从第一预设初始位置开始,到第一末尾位置;其中,第一末尾位置为:第一预设初始位置加视频流数据的数据量后得到的位置。第一预设初始位置和第一末尾位置均位于有效图像区。
在S1001中,确定目标数据在图像帧的消隐区中的第二位置时,可以将第二预设位置确定为目标数据在图像帧的消隐区中的第二位置。也可以根据目标数据的数据量,确定目标数据在图像帧的消隐区中的第二位置。
根据目标数据的数据量,确定目标数据在图像帧的消隐区中的第二位置时,具体可以为,将目标数据在图像帧的消隐区中的第二位置确定为:从第二预设初始位置开始,到第二末尾位置;其中,第二末尾位置为:第二预设初始位置加目标数据的数据量后得到的位置。第二预设初始位置和第二末尾位置均位于消隐区。
上述第二位置可以是图像帧消隐区的固定位置,也可以是不固定的位置。消隐区中同轴数据的位置是固定位置,因此目标数据在消隐区的第二位置可以是消隐区中除同轴数据的位置之外的位置。例如,第二位置可以为消隐区 中除同轴数据的位置之外的部分区域的位置,也可以为消隐区中除同轴数据的位置之外的全部区域的位置,本申请对此不做具体限定。
消隐区包括场消隐区和行消隐区。第二位置可以位于场消隐区,也可以位于行消隐区,也可以一部分在场消隐区,另一部分在行消隐区。在一种具体实施方式中,场消隐区的可存储数据量大于行消隐区的可存储数据量,因此可以从场消隐区中确定第二位置,以提高可存储的目标数据的数据量。
在一种具体实施方式中,同轴数据和目标数据的位置均可以在场消隐区中。例如,已知1920*1080图像帧的场消隐区包含36行。同轴数据的数据量一般很小,为字节级别的数据量,可以为同轴数据分配场消隐区的2~4行,剩余的32行可以分配给目标数据,因此消隐区可存储的目标数据的数据量大概在几百个字节数量级别。
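上述场消隐区行数分配及可存储数据量的估算可以示意为(每行可写入目标数据的字节数为假设值,仅用于说明"几百个字节"的量级):

```python
def target_rows_in_field_blanking(total_rows: int = 36, coax_rows: int = 4) -> int:
    """场消隐区中扣除同轴数据占用后,可分配给目标数据的行数。"""
    return total_rows - coax_rows

def capacity_bytes(rows: int, bytes_per_row: int = 16) -> int:
    """估算可存储的目标数据量;每行可写入的字节数 bytes_per_row 为假设值。"""
    return rows * bytes_per_row

# 文中示例:1920*1080 图像帧的场消隐区共 36 行,同轴数据占 2~4 行,
# 剩余约 32 行可分配给目标数据,可存储数据量在几百字节量级。
```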
为了提高消隐区的可存储数据量,可以提高传输的图像帧的规格。例如,从200万像素的图像帧提高到300万像素的图像帧。
当第二位置为固定的位置时,服务器和模拟摄像机可以预先约定第二位置,从而使服务器可以按照约定的第二位置获取目标数据,提高获取目标数据时的准确性。
当第二位置为不固定的位置时,目标数据可以包括表示目标数据的起始位置的头标识,以及表示目标数据的末尾位置的尾标识。这样,当服务器在获取目标数据时,可以根据上述头标识确定目标数据的起始位置,根据上述尾标识确定目标数据的末尾位置,根据确定的目标数据的起始位置和目标数据的末尾位置,从消隐区获取目标数据,从而在第二位置不固定时能够准确地获取目标数据。这种实施方式中目标数据发送时的灵活性比较大。
其中,头标识可以为第一预设比特串,尾标识可以为第二预设比特串。目标数据的前第一预设数量个比特位可以为第一预设比特串,目标数据的后第二预设数量个比特位可以为第二预设比特串。
在发送上述图像帧时,可以将图像帧中的数据逐行地以数据流的形式进行发送,也可以将图像帧中的数据逐列地以数据流的形式进行发送。
服务器在接收到模拟摄像机发送的图像帧时,可以按照预设的数据存储 规则,对图像帧进行解析,获得图像帧中的视频流数据和目标数据。
由上述内容可知,本实施例确定视频流数据在图像帧的有效图像区中的第一位置,以及确定目标数据在图像帧的消隐区中的第二位置,将视频流数据以及目标数据作为同一图像帧的数据,采用有效图像区的数据发送方式,按照第一位置发送视频流数据,以及采用消隐区的数据发送方式,按照第二位置发送目标数据。因此,本申请实施例可以实现将上述“转换后的混合数据”以同一图像帧发送至电子设备,无需另外布线,能够节省设备成本。
在模拟摄像机向服务器发送的大量图像帧中,可以在所有图像帧中均携带目标数据,也可以在部分图像帧中携带目标数据。当获取到目标数据时,可以将有效图像区携带视频流数据、消隐区携带目标数据的图像帧发送至服务器。当没有获取到目标数据时,可以将有效图像区携带视频流数据、消隐区不携带目标数据的图像帧发送至服务器。
在本申请的另一实施例中,在发送视频流数据和目标数据之前,该方法还可以包括以下步骤1~步骤2:
步骤1:获取待发送的同轴数据,以及确定同轴数据在图像帧的消隐区的第三位置。
本实施例中,图像帧中包括视频流数据、目标数据和同轴数据。
确定同轴数据在图像帧的消隐区的第三位置时,可以包括:将第三预设位置确定为同轴数据在图像帧的消隐区中的第三位置。例如,可以将消隐区中预设的第2行到第4行的位置确定为第三位置。在另一种实施方式中,消隐区中还可以包括服务器向模拟摄像机发送的同轴数据的位置,该位置可以为消隐区中不同于第三位置的位置。
在消隐区中,第三位置可以在第二位置之前,也可以在第二位置之后。由于场消隐区的数据量远大于行消隐区的数据量,可以从场消隐区确定第三位置。
当第二位置不是固定位置时,获取待发送的同轴数据的步骤,可以包括:获取待发送的包括表示第二位置的数据的同轴数据。
由于同轴数据的第三位置可以为固定位置,服务器和模拟摄像机可以预先约定第三位置,从而在服务器获取同轴数据时,可以按照约定的第三位置获取。在获取同轴数据之后,可以根据同轴数据中包括的表示第二位置的数据,确定第二位置,从第二位置中获取目标数据,提高获取目标数据时的准确性。
在本实施例中,同轴数据还可以包括除了表示第二位置的数据之外的其他数据,本申请对此不作具体限定。
步骤2:将同轴数据作为与视频流数据和目标数据所在图像帧的数据,采用消隐区的数据发送方式,按照第三位置发送同轴数据。
作为一个例子,图11a为图像帧的有效图像区和消隐区所对应的数据的一种示意图。其中,竖线阴影区为有效图像区,竖线阴影区之外的空白区为消隐区。有效图像区上下方的消隐区为场消隐区,有效图像区左右两侧的消隐区为行消隐区。在图11a所示例子中,同轴数据的存储位置位于上方的场消隐区,目标数据的存储位置位于下方的场消隐区。
作为一个例子,图9b为模拟摄像机向服务器发送图像帧的一种传输架构图。其中,一个图像帧中包括视频流数据、同轴数据和目标数据。
可见,本实施例可以获取同轴数据,将视频流数据、目标数据和同轴数据作为同一图像帧的数据发送至服务器,使服务器通过同一图像帧接收视频流数据、目标数据和同轴数据,提高数据传输效率。
在上述实施例的一种具体实施方式中,表示第二位置的数据在图像帧的消隐区中的位置,可以位于第二位置之前。
本实施例具体可以为,第三位置位于第二位置之前,这样可以保证表示第二位置的数据在图像帧的消隐区的位置位于第二位置之前。
如果服务器在接收图像帧携带的数据时,按照从前向后的数据获取方式获取数据,则使表示第二位置的数据在图像帧的消隐区中的位置位于第二位置之前,能够使服务器先获取同轴数据中表示第二位置的数据,然后可以根据该表示第二位置的数据从消隐区中获取目标数据,从而提高获取目标数据的效率。
在相关技术中,视频流数据可以以模拟信号的方式叠加在有效图像区,而同轴数据可以以数字信号的方式叠加在消隐区。同轴数据的数据量很小,大概为字节级别(可以为6~24个字节),因此还有大量空闲的消隐区可以用于存储目标数据。在本实施例中,可以将目标数据存储在闲置的消隐区中。
在本实施例中,视频流数据和同轴数据仍然保持原来的传输方式不变,将未曾使用的一些消隐区用于填充目标数据。模拟摄像机可以与服务器约定目标数据在图像帧中的存储位置。或者,也可以将目标数据在图像帧中的存储位置放置在同轴数据中。服务器可以比较容易地从同轴数据中解析出该存储位置。
本实施例提供的数据传输方式,可以使每一图像帧均存在相应的目标数据,因此数据的实时性和同步性较好。本实施例也不限制传输材料,传输材料可以为同轴线缆、双绞线或其他材料,无需另外布线,减少了设备成本。
本申请实施例还提供一种应用于服务器侧的数据处理方法,该服务器与模拟摄像机相连接。下面通过具体实施例,对该数据处理方法进行详细说明。
图12为本申请实施例提供的应用于服务器侧的数据处理方法的第一种流程示意图,包括:
S1201:接收待处理模拟摄像机发送的数据。
在本实施例中,为了方便描述,将向服务器发送数据的模拟摄像机称为待处理模拟摄像机。
S1202:从所接收到的数据中分离得到视频流数据及图像分析数据。
服务器与模拟摄像机可以预先约定视频流数据及图像分析数据的叠加方式及分离方式,比如,如果模拟摄像机在视频流数据的消隐区中叠加图像分析数据,服务器则在消隐区中读取图像分析数据。
具体来说,服务器可以根据消隐区标识,在所接收到的数据中确定消隐区和图像区;读取所述消隐区中的数据,提取图像分析数据;读取所述图像区中的数据,提取视频流数据。
或者,模拟摄像机与服务器也可以约定第几行为视频流数据,第几行为图像分析数据,这样,服务器也可以从所接收到的数据中分离得到视频流数据及图像分析数据,具体分离方式不做限定。
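按行号约定分离两种数据的过程可以示意为(行号约定为本示意的假设,实际以服务器与模拟摄像机的预先约定为准):

```python
def separate(frame_rows, analysis_row_ids):
    """按约定的行号,把一帧数据分离为视频流数据与图像分析数据。

    frame_rows:       一帧中按顺序排列的各行数据
    analysis_row_ids: 预先约定的、携带图像分析数据的行号集合(假设)
    """
    analysis, video = [], []
    for idx, row in enumerate(frame_rows):
        (analysis if idx in analysis_row_ids else video).append(row)
    return video, analysis
```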
S1203:对分离得到的视频流数据及所述图像分析数据分别进行处理。
服务器分离得到视频流数据及图像分析数据后,对这两种数据分别进行处理。比如,对视频流数据可以进行编码、存储等处理,具体不做限定。
图像分析数据为模拟摄像机对采集到的图像进行分析后得到的数据。作为一种实施方式,图像分析数据中可以包含目标的属性和/或位置信息,所述目标为所述视频流数据中存在的目标。这种实施方式中,对分离得到图像分析数据后,可以在图像分析数据中读取目标的属性和/或位置信息,所述目标为所述视频流数据中存在的目标。
图像分析数据中还可以包含其他内容,比如携带有所述目标的图像等,具体不做限定。
举例来说,模拟摄像机可以对图像进行人脸识别,将识别出的人脸特征作为图像分析数据发送给服务器。服务器分离得到人脸特征后,可以将该人脸特征与数据库中存储的人脸特征进行对比,进而确定出该人脸对应的身份信息。
或者,模拟摄像机对图像进行人脸识别后,确定人脸在图像中的位置信息,将该位置信息作为图像分析数据发送给服务器。服务器分离得到该位置信息后,可以通过该位置信息直接在图像中获取到人脸区域。
再举一例,模拟摄像机可以对图像进行车牌识别,将识别出的车牌号作为图像分析数据发送给服务器。服务器分离得到该车牌号后,可以确定该车牌号对应的车主信息,或者,在数据库中查找该车牌号对应的行驶轨迹等。
或者,模拟摄像机对图像进行车牌识别后,确定车牌在图像中的位置信息,将该位置信息作为图像分析数据发送给服务器。服务器分离得到该位置信息后,可以通过该位置信息直接在图像中获取到车牌区域。
服务器对视频流数据及图像分析数据的处理方式有多种,可以根据实际 情况进行设定,具体不做限定。
作为一种实施方式,具有图像分析功能的模拟摄像机在发送视频流数据及图像分析数据时,可以采用多种不同的发送方式,比如:一、将视频流数据、图像分析数据都转换为模拟数据进行发送;二、仅将视频流数据转换为模拟数据,发送模拟视频流数据、以及数字图像分析数据;三、将视频流数据转换为模拟数据,将图像分析数据转换为低频数字数据,发送模拟视频流数据、以及该低频数字数据,等等,具体发送方式不做限定。
如果为第一种发送方式,则服务器接收到的数据为模拟数据,服务器可以对所接收到的数据进行模数转换,得到转换后的数字数据;从所述转换后的数字数据中分离得到数字视频流数据及数字图像分析数据。或者,服务器也可以从所接收到的数据中分离得到模拟视频流数据及模拟图像分析数据;将所述模拟视频流数据及模拟图像分析数据进行模数转换,得到数字视频流数据及数字图像分析数据。
如果为第二种发送方式,则服务器从所接收到的数据中分离得到模拟视频流数据及数字图像分析数据;将所述模拟视频流数据进行模数转换,得到数字视频流数据。
如果为第三种发送方式,则服务器从所接收到的数据中分离得到模拟视频流数据及数字图像分析数据;将所述模拟视频流数据进行模数转换,得到数字视频流数据;对所述数字图像分析数据进行低频采样,得到预设格式的数据,并对所述预设格式的数据进行标记。
该预设格式可以为BT1120格式,或者也可以为其他格式,具体不做限定。可以对预设格式的数据进行标记,也可以对数字视频流数据进行标记,以便在后续处理过程中,对这两种数字数据进行区分。
图13为本申请实施例提供的应用于服务器侧的数据处理方法的第二种流程示意图,包括:
S1301:接收待处理模拟摄像机发送的数据。
在本实施例中,为了方便描述,将向服务器发送数据的模拟摄像机称为待处理模拟摄像机。该待处理模拟摄像机可以为具有图像分析功能的设备, 也可以为不具有图像分析功能的设备。
S1302:判断该待处理模拟摄像机是否具有图像分析功能;如果是,执行S1303,如果否,执行S1304。
在本申请实施例中,服务器中存在两类模式:第一类处理模式和第二类处理模式。其中,第一类处理模式用于处理具有图像分析功能的模拟摄像机发送的数据,第二类处理模式用于处理不具有图像分析功能的模拟摄像机发送的数据。
不具有图像分析功能的模拟摄像机发送的数据中仅包含采集的视频流数据,不包含图像分析数据;而具有图像分析功能的模拟摄像机发送的数据中既包含采集的视频流数据,又包含图像分析数据。因此,第一类处理模式可以理解为:针对S1301接收到的数据中包含视频流数据及图像分析数据的情况进行处理的模式,第二类处理模式可以理解为:针对S1301接收到的数据中不包含图像分析数据的情况进行处理的模式。
S1303:触发第一类处理模式:从所接收到的数据中分离得到视频流数据及图像分析数据;对分离得到的视频流数据及所述图像分析数据分别进行处理。
服务器与模拟摄像机可以预先约定视频流数据及图像分析数据的叠加方式及分离方式,比如,如果模拟摄像机在视频流数据的消隐区中叠加图像分析数据,服务器则在消隐区中读取图像分析数据。
具体来说,服务器可以根据消隐区标识,在所接收到的数据中确定消隐区和图像区;读取所述消隐区中的数据,提取图像分析数据;读取所述图像区中的数据,提取视频流数据。
或者,模拟摄像机与服务器也可以约定第几行为视频流数据,第几行为图像分析数据,这样,服务器也可以从所接收到的数据中分离得到视频流数据及图像分析数据,具体分离方式不做限定。
服务器分离得到视频流数据及图像分析数据后,对这两种数据分别进行处理。比如,对视频流数据可以进行编码、存储等处理,具体不做限定。
图像分析数据为模拟摄像机对采集到的图像进行分析后得到的数据。作为一种实施方式,图像分析数据中可以包含目标的属性和/或位置信息,所述目标为所述视频流数据中存在的目标。这种实施方式中,对分离得到图像分析数据后,可以在图像分析数据中读取目标的属性和/或位置信息,所述目标为所述视频流数据中存在的目标。
图像分析数据中还可以包含其他内容,比如携带有所述目标的图像等,具体不做限定。
举例来说,模拟摄像机可以对图像进行人脸识别,将识别出的人脸特征作为图像分析数据发送给服务器。服务器分离得到人脸特征后,可以将该人脸特征与数据库中存储的人脸特征进行对比,进而确定出该人脸对应的身份信息。
或者,模拟摄像机对图像进行人脸识别后,确定人脸在图像中的位置信息,将该位置信息作为图像分析数据发送给服务器。服务器分离得到该位置信息后,可以通过该位置信息直接在图像中获取到人脸区域。
再举一例,模拟摄像机可以对图像进行车牌识别,将识别出的车牌号作为图像分析数据发送给服务器。服务器分离得到该车牌号后,可以确定该车牌号对应的车主信息,或者,在数据库中查找该车牌号对应的行驶轨迹等。
或者,模拟摄像机对图像进行车牌识别后,确定车牌在图像中的位置信息,将该位置信息作为图像分析数据发送给服务器。服务器分离得到该位置信息后,可以通过该位置信息直接在图像中获取到车牌区域。
服务器对视频流数据及图像分析数据的处理方式有多种,可以根据实际情况进行设定,具体不做限定。
作为一种实施方式,具有图像分析功能的模拟摄像机在发送视频流数据及图像分析数据时,可以采用多种不同的发送方式,比如:一、将视频流数据、图像分析数据都转换为模拟数据进行发送;二、仅将视频流数据转换为模拟数据,发送模拟视频流数据、以及数字图像分析数据;三、将视频流数据转换为模拟数据,将图像分析数据转换为低频数字数据,发送模拟视频流数据、以及该低频数字数据,等等,具体发送方式不做限定。
如果为第一种发送方式,则服务器接收到的数据为模拟数据,服务器可以对所接收到的数据进行模数转换,得到转换后的数字数据;从所述转换后的数字数据中分离得到数字视频流数据及数字图像分析数据。或者,服务器也可以从所接收到的数据中分离得到模拟视频流数据及模拟图像分析数据;将所述模拟视频流数据及模拟图像分析数据进行模数转换,得到数字视频流数据及数字图像分析数据。
如果为第二种发送方式,则服务器从所接收到的数据中分离得到模拟视频流数据及数字图像分析数据;将所述模拟视频流数据进行模数转换,得到数字视频流数据。
如果为第三种发送方式,则服务器从所接收到的数据中分离得到模拟视频流数据及数字图像分析数据;将所述模拟视频流数据进行模数转换,得到数字视频流数据;对所述数字图像分析数据进行低频采样,得到预设格式的数据,并对所述预设格式的数据进行标记。
该预设格式可以为BT1120格式,或者也可以为其他格式,具体不做限定。可以对预设格式的数据进行标记,也可以对数字视频流数据进行标记,以便在后续处理过程中,对这两种数字数据进行区分。
S1304:触发第二类处理模式:对接收到的视频流数据进行分析,得到图像分析数据;对接收到的视频流数据、以及分析得到的图像分析数据分别进行处理。
不具有图像分析功能的模拟摄像机发送的数据中不包含图像分析数据,服务器对该视频流数据进行分析,得到图像分析数据。具体的,该分析可以为人脸识别、车牌识别、特征提取等等,具体不做限定。
服务器对接收到的视频流数据、以及分析得到的图像分析数据分别进行处理。比如,对视频流数据可以进行编码、存储等处理,具体不做限定。
举例来说,如果服务器对该视频流数据进行人脸识别,得到人脸特征后,可以将该人脸特征与数据库中存储的人脸特征进行对比,进而确定出该人脸对应的身份信息。再举一例,如果服务器对该视频流数据进行车牌识别,得到车牌号后,可以确定该车牌号对应的车主信息,或者,在数据库中查找该 车牌号对应的行驶轨迹等。
服务器对视频流数据及图像分析数据的处理方式有多种,可以根据实际情况进行设定,具体不做限定。
图14为本申请实施例提供的应用于服务器侧的数据处理方法的第三种流程示意图,包括:
S1401:向系统中的模拟摄像机发送属性请求指令,并接收系统中的模拟摄像机反馈的设备属性。
S1402:对接收到的模拟摄像机反馈的设备属性进行记录。
S1403:接收待处理模拟摄像机发送的数据。
S1404:在所记录的设备属性中,查找该待处理模拟摄像机反馈的设备属性。
S1405:根据查找到的设备属性,判断该待处理模拟摄像机是否具有图像分析功能;如果是,执行S1406,如果否,执行S1407。
S1406:触发第一类处理模式:从所接收到的数据中分离得到视频流数据及图像分析数据;对分离得到的视频流数据及所述图像分析数据分别进行处理。
S1407:触发第二类处理模式:对接收到的视频流数据进行分析,得到图像分析数据;对接收到的视频流数据、以及分析得到的图像分析数据分别进行处理。
在图14所示实施例中,服务器可以向系统中模拟摄像机发送属性请求指令,也可以仅在检测到系统中接入新的模拟摄像机后,向该新的模拟摄像机发送属性请求指令。
模拟摄像机接收到该属性请求指令后,将自身设备属性发送给服务器。举例来说,该设备属性可以包括:设备型号、或者设备硬件性能等,服务器可以根据设备型号、或者设备硬件性能,判断该模拟摄像机是否具有图像分析功能,如果具有,则触发第一类处理模式,如果不具有,则触发第二类处理模式。
再举一例,该设备属性也可以包括能够直接反应模拟摄像机是否具有图像分析功能的信息,服务器可以直接根据该信息,判断该模拟摄像机是否具有图像分析功能,如果具有,则触发第一类处理模式,如果不具有,则触发第二类处理模式。
再举一例,该设备属性也可以包括模拟摄像机的名称、或ID等,服务器中存储有模拟摄像机的名称或ID对应的设备功能信息,服务器可以根据该模拟摄像机的名称或ID,判断该模拟摄像机是否具有图像分析功能,如果具有,则触发第一类处理模式,如果不具有,则触发第二类处理模式。
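根据设备属性选择处理模式的判断逻辑可以示意为(属性字段名 has_image_analysis 为假设,实际可由设备型号、硬件性能等信息推断得到):

```python
def choose_mode(device_attrs: dict) -> str:
    """根据记录的设备属性判断待处理模拟摄像机应触发的处理模式。"""
    if device_attrs.get("has_image_analysis"):
        # 数据中既包含视频流数据又包含图像分析数据,直接分离处理
        return "第一类处理模式"
    # 数据中不包含图像分析数据,由服务器自行分析视频流数据
    return "第二类处理模式"
```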
应用本申请图14所示实施例,服务器预先获取系统中模拟摄像机的设备属性,根据设备属性判断模拟摄像机是否具有图像分析功能,针对功能不同的模拟摄像机采用不同的处理模式,应用本方案,服务器既能够对具有图像分析功能的模拟摄像机发送的数据进行处理,也能够对不具有图像分析功能的模拟摄像机发送的数据进行处理,解决了系统的不兼容问题。
本申请实施例还提供了一种监控系统,如图15所示,包括模拟摄像机10和服务器20,其中,
模拟摄像机10包括:图像采集芯片100、图像分析芯片200、集成芯片300和发送芯片400,图像采集芯片100分别与图像分析芯片200、集成芯片300相连接,图像分析芯片200与集成芯片300相连接,发送芯片400与集成芯片300相连接;
图像采集芯片100,用于采集数字视频流数据;
图像分析芯片200,用于对所述数字视频流数据进行分析,识别所述数字视频流数据中存在的目标,提取所述目标的属性和/或位置信息,作为图像分析数据;并将所述图像分析数据发送到集成芯片300;
集成芯片300,用于根据预设插入模式,将所述图像分析数据插入至所述数字视频流数据中,得到混合数据,并将所述混合数据发送到发送芯片400;
发送芯片400,用于将所述混合数据、或者所述混合数据中的数字视频流数据转换为模拟数据,得到转换后的混合数据;将所述转换后的混合数据发送到服务器20;
服务器20,用于接收所述混合数据,根据所述预设插入模式对应的分离模式,将所述混合数据分离为视频流数据和图像分析数据。
本申请实施例提供的监控系统中的模拟摄像机可以为本申请实施例提供的任一种模拟摄像机,不再赘述。
在上述模拟摄像机的各种实施方式中,集成芯片300在数字视频流数据中插入的数据或有不同,比如,一些实施方式中,仅插入目标的属性和/或位置信息;在另一些实施方式中,插入目标的属性和/或位置信息、以及抓取的图像;在另一些实施方式中,插入目标的属性和/或位置信息、以及压缩后的图像;为了方便描述,将这些插入的数据都称为图像分析数据。
服务器20与模拟摄像机10可以预先约定视频流数据及图像分析数据的叠加方式(插入模式)及分离方式,比如,如果模拟摄像机10在数字视频流数据的消隐区中叠加图像分析数据,服务器20则在消隐区中读取图像分析数据。
具体来说,服务器20可以根据消隐区标识,在所接收到的数据中确定消隐区和图像区;读取所述消隐区中的数据,作为图像分析数据;读取所述图像区中的数据,作为视频流数据。
或者,模拟摄像机10与服务器20也可以约定第几行为视频流数据,第几行为图像分析数据,这样,服务器20也可以从所接收到的数据中分离得到视频流数据及图像分析数据,具体分离方式不做限定。
服务器20分离得到视频流数据及图像分析数据后,对这两种数据分别进行处理。比如,对视频流数据可以进行编码、存储等处理,具体不做限定。
举例来说,假设服务器20分离得到的图像分析数据中包含人脸特征,则服务器20可以将该人脸特征与数据库中存储的人脸特征进行对比,进而确定出该人脸对应的身份信息。
再举一例,假设服务器20分离得到的图像分析数据中包含车牌号,则服务器20可以确定该车牌号对应的车主信息,或者,在数据库中查找该车牌号对应的行驶轨迹等。
再举一例,假设服务器20分离得到的图像分析数据中包含图像、以及人脸在该图像中的位置信息,则服务器可以根据该位置信息,在该图像中确定出人脸区域,服务器可以将该人脸区域发送给显示设备进行显示。或者,服务器也可以在确定出人脸区域后,提取人脸特征等。
服务器20对视频流数据及图像分析数据的处理方式有多种,可以根据实际情况进行设定,具体不做限定。
可见,在本实施例中,模拟摄像机将图像分析数据发送给服务器,服务器根据该图像分析数据,直接得到目标的关联信息,比如,属性、位置信息、图像等,服务器不需要在图像中进行目标识别,减少了服务器的运算量。
在一种实施方式中,发送芯片400可以将混合数据中的数字视频流数据转换为模拟视频流数据;将混合数据中的图像分析数据转换为低频数字数据;将该模拟视频流数据及该低频数字数据发送到服务器20。
这种情况下,服务器20可以对分离得到的视频流数据进行模数转换,得到第一数字数据;对分离得到的图像分析数据进行低频采样,得到第二数字数据。然后服务器20再对第一数字数据和第二数字数据分别进行处理。
分离过程与转换过程可以同时进行,比如,服务器20根据与模拟摄像机10的约定,确定读取的数据为视频流数据,则可以对读取的数据进行模数转换;服务器20根据与模拟摄像机10的约定,确定读取的数据为图像分析数据,则可以对读取的数据进行低频采样。
或者,也可以先进行分离过程,再进行转换过程,具体不做限定。
在另一种实施方式中,发送芯片400将混合数据转换为模拟混合数据,将模拟混合数据发送到服务器20。这种情况下,服务器20可以先将接收到的混合数据分离为视频流数据和图像分析数据,再对分离得到的视频流数据和图像分析数据都进行模数转换,得到数字数据。或者,服务器也可以先进行模数转换,得到数字混合数据,再对数字混合数据进行分离,得到数字视频流数据和图像分析数据。或者,也可以同时进行分离过程与模数转换过程,具体不做限定。
作为一种实施方式,服务器20可以将数字视频流数据、图像分析数据均转换为预设格式的数据,比如BT656格式的数据等,具体不做限定。
为了方便描述,本实施例中,将视频流数据转换的数字数据称为第一数字数据,将图像分析数据转换的数字数据称为第二数字数据。作为一种实施方式,服务器20可以对第一数字数据和/或第二数字数据进行标记,以便在后续处理过程中,对这两种数字数据进行区分。
具体的标记方式不做限定,比如,可以在数据头部增加特殊标记位等,如果第一数字数据与第二数字数据都进行标记,则标记方式不同。
作为一种实施方式,模拟摄像机10与服务器20可以通过同轴线缆相连接,或者,也可以通过其他方式通信连接,比如,通过双绞线相连接、或者无线连接等,具体不做限定。
应用本申请实施例提供的模拟摄像机,模拟摄像机中包含图像分析芯片,该图像分析芯片对采集到的数字视频流数据进行分析,识别数字视频流数据中存在的目标,提取目标的属性和/或位置信息;并将目标的属性和/或位置信息发送到集成芯片,集成芯片根据预设插入模式,将目标的属性和/或位置信息插入至数字视频流数据中,得到混合数据;可见,本方案中的模拟摄像机实现了对采集到的图像进行分析处理。
应用本申请实施例提供的监控系统,模拟摄像机将图像分析数据发送给服务器,服务器根据该图像分析数据,直接得到目标的关联信息,比如,属性、位置信息、图像等,服务器不需要在图像中进行目标识别,减少了服务器的运算量。
图16为本申请实施例提供的一种服务器的结构示意图,包括:
传感信号接收芯片110、图像信号接收芯片120和信号处理芯片130,传感信号接收芯片110及图像信号接收芯片120分别与信号处理芯片130相连接;
传感信号接收芯片110,用于接收无线传感器发送的传感信号,并将传感信号发送至信号处理芯片130;
图像信号接收芯片120,用于接收模拟摄像机发送的图像信号,并将图像 信号发送至信号处理芯片130;
信号处理芯片130,用于对传感信号及图像信号进行关联处理。
应用本申请图16所示实施例,服务器中的传感信号接收芯片接收无线传感器发送的传感信号,图像信号接收芯片接收模拟摄像机发送的图像信号,信号处理芯片对该传感信号及该图像信号进行关联处理;可见,本方案中的服务器实现了将同一场景中的传感信号及该图像信号进行关联处理。
传感信号接收芯片预先与一个或多个无线传感器进行配对连接,这样,传感信号接收芯片便可以接收并处理无线传感器发射的传感信号。作为一种实施方式,服务器中可以包含多个传感信号接收芯片,每个传感信号接收芯片分别与一个无线传感器进行配对连接;或者,作为另一种实施方式,服务器中可以包含一个传感信号接收芯片,该传感信号接收芯片与一个或多个无线传感器配对连接。
作为一种实施方式,如图17所示,传感信号接收芯片110包括电磁接收天线1101和处理器1102,电磁接收天线1101,用于接收无线发射传感器发射的电磁波形式的传感信号,并将该电磁波形式的传感信号发送至处理器1102;处理器1102,用于将该电磁波形式的传感信号转换为电信号形式的传感信号。
可以理解,无线传感器一般以电磁波的形式发送传感信号,传感信号接收芯片110接收到该电磁波后,需要将该电磁波转换为自身可处理的电信号。
作为一种实施方式,如图18所示,传感信号接收芯片110包括电磁接收天线1101、处理器1102和分类器1103,电磁接收天线1101,用于接收无线发射传感器发射的电磁波形式的传感信号,并将该电磁波形式的传感信号发送至处理器1102;处理器1102,用于将该电磁波形式的传感信号转换为电信号形式的传感信号;分类器1103,用于接收处理器1102发送的电信号形式的传感信号,确定该电信号形式的传感信号的类别信息,将该电信号形式的传感信号及该类别信息发送到信号处理芯片130。
本申请实施例中的无线传感器可以包括以下任意一种或多种:门磁传感器、红外探测器、烟雾传感器、温度传感器、湿度传感器,或者,也可以为其他,具体不做限定。不同类别的无线传感器,采集不同类别的环境信息, 比如,门磁传感器可以采集门磁与磁体之间是否发生位移的信息,红外探测器可以采集预设区域中是否存在人员的信息,烟雾传感器可以采集环境中的烟雾信息,温度传感器可以采集环境中的温度信息,湿度传感器可以采集环境中的湿度信息,等等,不再一一列举。
相对应的,这些不同类别的无线传感器将采集到的各类别的环境信息转换为不同类别的传感信号,比如,门磁传感器将门磁与磁体之间是否发生位移的信息转换为门磁传感信号,红外探测器将预设区域中是否存在人员的信息转换为红外传感信号,烟雾传感器将环境中的烟雾信息转换为烟雾传感信号,温度传感器将环境中的温度信息转换为温度传感信号,湿度传感器将环境中的湿度信息转换为湿度传感信号,等等,不再一一列举。
对于服务器来说,可以对不同类别的传感信号进行分类处理。图18中的分类器1103即可以对处理器1102发送的传感信号进行分类标记,也就是确定传感信号的类别信息,并将该类别信息与传感信号一并发送至信号处理芯片130。
如果信号处理芯片130接收到的图像信号为模拟信号,则信号处理芯片130先将该图像信号进行模数转换,得到数字图像信号,再将该数字图像信号进行格式转换,得到预设格式的数字图像信号。举例来说,可以将该数字图像信号转换为标准并行数据格式后,再将其与传感信号进行关联处理。
信号处理芯片130对传感信号及图像信号进行关联处理,举例来说,假设信号处理芯片130对图像信号进行分析,分析结果表示当前场景中存在一个模糊的目标,但仅根据图像信号并不能确定该目标是人体还是物体;此外,信号处理芯片130接收到红外探测器发送的红外传感信号,对该红外传感信号进行分析,分析结果表示当前场景中存在一个人体目标;信号处理芯片130可以对图像信号分析结果中的模糊目标与传感信号分析结果中的人体目标进行位置匹配,如果匹配成功,表示图像信号分析结果中的模糊目标为一人体。这样,信号处理芯片130实现了对传感信号及图像信号的关联处理。
再举一例,假设场景中发生火灾,同时存在较大烟雾,由于烟雾影响,模拟摄像机采集到的图像信号画面内容不清楚,信号处理芯片130对图像信号进行分析,仅根据图像信号的分析结果,只能判断场景中失火了,并不能判 断当前火势大小;此外,信号处理芯片130接收到温度传感器发送的温度传感信号,对该温度传感信号进行分析,得到当前场景中的环境温度;信号处理芯片130结合图像信号分析结果、温度传感信号分析结果,可以较准确地对火势大小进行判断。
再举一例,假设信号处理芯片130接收到烟雾传感器发送的烟雾传感信号,对该烟雾传感信号进行分析,分析结果表示当前场景中存在烟雾,但仅根据烟雾传感信号并不能确定场景中有人吸烟还是场景中有火灾;此外,信号处理芯片130接收到图像信号,对该图像信号进行分析,分析结果表示当前场景中不存在火灾,而存在一个在吸烟的人体目标;信号处理芯片130结合图像信号分析结果、烟雾传感信号分析结果,可以确定并未发生火灾,只是一个人在吸烟。
作为一种实施方式,如图19所示,服务器中还可以包括显示器140,显示器140与信号处理芯片130相连接。或者,也可以在图17、图18的基础上增加显示器140,具体不做限定。
在本实施方式中,信号处理芯片130对传感信号及图像信号进行关联处理后,将关联处理结果发送至显示器140,显示器140对接收到的关联处理结果进行显示。
比如,上述例子中,信号处理芯片130对图像信号分析结果中的模糊目标与传感信号分析结果中的人体目标进行位置匹配,如果匹配成功,则关联处理结果可以为:某某区域内有人员进入,或者类似的信息;信号处理芯片130将该关联处理结果发送至显示器140进行显示。
再比如,上述例子中,信号处理芯片130结合图像信号分析结果、温度传感信号分析结果,判断当前火势大小,则关联处理结果可以为当前火势等级;信号处理芯片130将该关联处理结果发送至显示器140进行显示。
再比如,上述例子中,信号处理芯片130结合图像信号分析结果、烟雾传感信号分析结果,确定一个人在吸烟,则关联处理结果可以为提示禁止吸烟的信息,或者类似的信息;信号处理芯片130将该关联处理结果发送至显示器140进行显示。
作为一种实施方式,如图20所示,服务器中还可以包括报警器150,报警器150与信号处理芯片130相连接。在本实施方式中,信号处理芯片130,还可以用于根据关联处理结果,控制报警器150进行报警。或者,也可以在图17、图18、图19的基础上增加报警器150,具体不做限定。
该报警器可以为闪烁灯、蜂鸣器等,相对应的,报警方式可以为灯光闪烁、蜂鸣声等,具体不做限定。
服务器中可以预先存储报警条件,比如,当指定区域内有人员进入时进行报警,或者,当场景中有火情时进行报警,或者,当场景中有人吸烟时进行报警,等等。信号处理芯片130根据上述关联处理结果,判断是否符合预先存储的报警条件,如果符合,控制报警器150进行报警。
或者,报警器也可以为一台独立的设备,报警器与服务器通信连接,信号处理芯片130根据关联处理结果,判断是否符合报警条件,如果符合,向报警器发送报警信息,报警器接收到报警信息后进行报警。
或者,报警器也可以与该模拟摄像机一体设置,信号处理芯片130根据关联处理结果,判断是否符合报警条件,如果符合,向该模拟摄像机发送报警信息,该模拟摄像机接收到报警信息后控制报警器进行报警。
或者,报警器也可以与该无线传感器一体设置,信号处理芯片130根据关联处理结果,判断是否符合报警条件,如果符合,向该无线传感器发送报警信息,该无线传感器接收到报警信息后控制报警器进行报警。
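根据关联处理结果判断是否触发报警的逻辑可以示意为(报警条件取文中列举的示例,匹配方式为假设):

```python
# 预先存储的报警条件(取文中示例:人员闯入、火情、吸烟)
ALARM_CONDITIONS = ("有人员进入", "有火情", "有人吸烟")

def should_alarm(correlation_result: str) -> bool:
    """判断关联处理结果是否符合预先存储的任一报警条件。"""
    return any(cond in correlation_result for cond in ALARM_CONDITIONS)
```

符合条件时,可由服务器直接控制报警器,或向独立报警器、模拟摄像机、无线传感器发送报警信息。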
应用本申请图16所示实施例,服务器中的传感信号接收芯片接收无线传感器发送的传感信号,图像信号接收芯片接收模拟摄像机发送的图像信号,信号处理芯片对该传感信号及该图像信号进行关联处理;可见,本方案中的服务器实现了将同一场景中的传感信号及该图像信号进行关联处理。
本申请实施例还提供一种模拟摄像机,如图21所示,包括处理器2101和存储器2102;
存储器2102,用于存放计算机程序;
处理器2101,用于执行存储器2102上所存放的程序时,实现上述任一种 应用于模拟摄像机侧的数据传输方法。
本申请实施例还提供一种服务器,如图22所示,包括处理器2201和存储器2202;
存储器2202,用于存放计算机程序;
处理器2201,用于执行存储器2202上所存放的程序时,实现上述任一种应用于服务器侧的数据处理方法。
需要说明的是,在本文中,诸如第一和第二等之类的关系术语仅仅用来将一个实体或者操作与另一个实体或操作区分开来,而不一定要求或者暗示这些实体或操作之间存在任何这种实际的关系或者顺序。而且,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者设备所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括所述要素的过程、方法、物品或者设备中还存在另外的相同要素。
本说明书中的各个实施例均采用相关的方式描述,各个实施例之间相同相似的部分互相参见即可,每个实施例重点说明的都是与其他实施例的不同之处。尤其,对于监控系统实施例而言,由于其基本相似于模拟摄像机实施例,所以描述的比较简单,相关之处参见模拟摄像机实施例的部分说明即可。
以上所述仅为本申请的较佳实施例而已,并不用以限制本申请,凡在本申请的精神和原则之内,所做的任何修改、等同替换、改进等,均应包含在本申请保护的范围之内。

Claims (31)

  1. 一种模拟摄像机,其特征在于,包括:图像采集芯片、图像分析芯片、集成芯片和发送芯片,所述图像采集芯片分别与所述图像分析芯片、所述集成芯片相连接,所述图像分析芯片与所述集成芯片相连接,所述集成芯片与所述发送芯片相连接;
    所述图像采集芯片,用于采集数字视频流数据;
    所述图像分析芯片,用于对所述数字视频流数据进行分析,识别所述数字视频流数据中存在的目标,提取所述目标的属性和/或位置信息;并将所述目标的属性和/或位置信息发送到所述集成芯片;
    所述集成芯片,用于根据预设插入模式,将所述目标的属性和/或位置信息插入至所述数字视频流数据中,得到混合数据,并将所述混合数据发送到所述发送芯片;
    所述发送芯片,用于将所述混合数据、或者所述混合数据中的数字视频流数据转换为模拟数据,得到转换后的混合数据;发送所述转换后的混合数据。
  2. 根据权利要求1所述的模拟摄像机,其特征在于,
    所述图像采集芯片,还用于将所述数字视频流数据进行复制,得到两份数字视频流数据;并将一份数字视频流数据发送到所述图像分析芯片,将另一份数字视频流数据发送到所述集成芯片;
    所述集成芯片,具体用于:
    在所述数字视频流数据中抓取携带有所述目标的图像;
    根据预设插入模式,将所抓取的图像、所述目标的属性和/或位置信息插入至所述数字视频流数据中,得到混合数据,并将所述混合数据发送到所述发送芯片。
  3. The analog camera according to claim 1, wherein
    the image acquisition chip is further configured to copy the digital video stream data to obtain two copies of digital video stream data, send one copy to the image analysis chip, and send the other copy to the integrated chip;
    the integrated chip is specifically configured to:
    capture, in the digital video stream data, an image carrying the target;
    compress the captured image to obtain a compressed image;
    insert the compressed image and the attribute and/or position information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data, and send the mixed data to the sending chip.
  4. The analog camera according to claim 1, wherein
    the image acquisition chip is further configured to send the digital video stream data to the integrated chip;
    the integrated chip is further configured to capture an image in the digital video stream data and send the captured image to the image analysis chip;
    the image analysis chip is specifically configured to:
    receive the image sent by the integrated chip, analyze the image, identify a target present in the image, extract attribute and/or position information of the target, and send the attribute and/or position information of the target to the integrated chip.
  5. The analog camera according to claim 1, wherein the image acquisition chip is further configured to send the digital video stream data to the integrated chip;
    the integrated chip is further configured to capture an image in the digital video stream data and send the captured image to the image analysis chip;
    the image analysis chip is specifically configured to:
    receive the image sent by the integrated chip, analyze the image, identify a target present in the image, extract attribute and/or position information of the target, and send the attribute and/or position information of the target to the integrated chip;
    the integrated chip is further configured to compress the captured image to obtain a compressed image, and insert the compressed image and the attribute and/or position information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data.
  6. The analog camera according to claim 5, wherein the integrated chip is further configured to:
    insert the compressed image and the attribute and/or position information of the target into a blanking area of the digital video stream data;
    arrange, according to a preset arrangement, the compressed image and the attribute and/or position information of the target inserted into the blanking area, to obtain arranged data;
    convert the arranged data into data in a preset format;
    convert the digital video stream data into data in the preset format.
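The arrangement step in claim 6 (place the compressed image and the target's attribute/position information into the blanking area in a preset order, then convert to a preset format) can be sketched as a simple byte packing. The TLV-style record layout (one-byte type tag, big-endian length prefix) is an assumption made purely for illustration; the claim only requires some preset arrangement and format.

```python
# Sketch of blanking-area packing; the TLV layout is an assumed example.
import struct

TYPE_IMAGE, TYPE_ATTR, TYPE_POS = 0x01, 0x02, 0x03

def arrange_blanking(compressed_image, attributes, position):
    """Arrange the inserted items in a preset order and convert them
    to a preset TLV-style byte format for the blanking area."""
    out = bytearray()
    for tag, payload in ((TYPE_IMAGE, compressed_image),
                         (TYPE_ATTR, attributes),
                         (TYPE_POS, position)):
        # Each record: 1-byte type tag, 2-byte big-endian length, payload.
        out += struct.pack(">BH", tag, len(payload)) + payload
    return bytes(out)

blob = arrange_blanking(b"\xff\xd8...", b"car;white", b"x=120,y=44")
```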
  7. The analog camera according to claim 1, wherein the sending chip is specifically configured to:
    convert the digital video stream data in the mixed data into analog video stream data, convert image analysis data in the mixed data into low-frequency digital data, the image analysis data comprising the attribute and/or position information of the target, and send the analog video stream data and the low-frequency digital data.
  8. The analog camera according to claim 1, wherein the integrated chip comprises a plurality of integrally arranged chips, and the integrated chip comprises the sending chip.
  9. The analog camera according to claim 1, wherein the integrated chip comprises a plurality of integrally arranged chips, the integrated chip comprising an image processing chip and an insertion chip;
    the image processing chip is configured to perform color and/or brightness processing on the digital video stream data;
    the insertion chip is configured to insert the attribute and/or position information of the target into the processed digital video stream data according to a preset insertion mode to obtain mixed data.
  10. A data transmission method, applied to an analog camera, the method comprising:
    acquiring digital video stream data;
    analyzing the digital video stream data, identifying a target present in the digital video stream data, and extracting attribute and/or position information of the target;
    inserting the attribute and/or position information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data;
    converting the mixed data, or the digital video stream data in the mixed data, into analog data to obtain converted mixed data; and sending the converted mixed data.
  11. The method according to claim 10, wherein after extracting the attribute and/or position information of the target, the method further comprises:
    capturing, in the digital video stream data, an image carrying the target;
    wherein inserting the attribute and/or position information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data comprises:
    inserting the captured image and the attribute and/or position information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data.
  12. The method according to claim 11, wherein after capturing, in the digital video stream data, the image carrying the target, the method further comprises:
    compressing the captured image to obtain a compressed image;
    wherein inserting the captured image and the attribute and/or position information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data comprises:
    inserting the compressed image and the attribute and/or position information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data.
  13. The method according to claim 11 or 12, wherein after acquiring the digital video stream data, the method further comprises:
    copying the digital video stream data to obtain two copies of digital video stream data;
    wherein analyzing the digital video stream data comprises:
    analyzing one of the two copies of digital video stream data;
    and capturing, in the digital video stream data, the image carrying the target comprises:
    capturing the image carrying the target in the other of the two copies of digital video stream data.
  14. The method according to claim 10, wherein inserting the attribute and/or position information of the target into the digital video stream data according to a preset insertion mode to obtain mixed data comprises:
    inserting the attribute and/or position information of the target into a blanking area of the digital video stream data to obtain mixed data.
  15. The method according to claim 14, wherein converting the mixed data, or the digital video stream data in the mixed data, into analog data to obtain converted mixed data comprises:
    arranging, according to a preset arrangement, the attribute and/or position information of the target inserted into the blanking area, to obtain arranged data;
    converting the arranged data and the digital video stream data into data in a preset format;
    performing digital-to-analog conversion on the data in the preset format to obtain the converted mixed data.
  16. The method according to claim 10, wherein converting the mixed data, or the digital video stream data in the mixed data, into analog data to obtain converted mixed data comprises:
    converting the digital video stream data in the mixed data into analog video stream data;
    converting image analysis data in the mixed data into low-frequency digital data, the image analysis data comprising the attribute and/or position information of the target.
  17. The method according to claim 10, wherein sending the converted mixed data comprises:
    continuously sending a preset number of image frames of a second type, an image frame of the second type being an image frame carrying video stream data and not carrying target data, the target data comprising the attribute and/or position information of the target;
    taking the target data as data carried in an effective image area of an image frame of a first type, and sending the target data by using a data sending manner of the effective image area; wherein the image frame of the first type and the image frame of the second type occupy the same transmission channel.
  18. The method according to claim 10, wherein sending the converted mixed data comprises:
    determining whether a data amount of target data to be sent reaches a preset data amount threshold, the target data comprising the attribute and/or position information of the target;
    if it does, taking the target data as data carried in an effective image area of an image frame of a first type, and sending the target data by using a data sending manner of the effective image area;
    if it does not, sending an image frame of a second type, an image frame of the second type being an image frame carrying video stream data and not carrying target data, wherein the image frame of the first type and the image frame of the second type occupy the same transmission channel.
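The threshold test in claim 18 can be sketched as a simple frame-type decision: target data rides in the effective image area of a first-type frame only once enough of it has accumulated; otherwise an ordinary second-type video frame goes out on the shared channel. The frame representation and the threshold value below are illustrative assumptions.

```python
# Sketch of the claim-18 decision; frame dicts and threshold are assumed.
DATA_THRESHOLD = 1024  # preset data amount threshold (bytes), assumed value

def next_frame(pending_target_data, video_payload):
    """Decide which frame type to send next on the shared transmission channel."""
    if len(pending_target_data) >= DATA_THRESHOLD:
        # First-type frame: the effective image area carries target data.
        return {"type": 1, "effective_area": bytes(pending_target_data)}
    # Second-type frame: carries video stream data, no target data.
    return {"type": 2, "effective_area": video_payload}

frame = next_frame(bytearray(2048), b"video-line-data")
```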
  19. The method according to claim 17 or 18, wherein before sending the converted mixed data, the method further comprises:
    obtaining coaxial data to be sent;
    taking the coaxial data as data carried in a blanking area of the image frame of the first type in which the target data is located, and sending the coaxial data by using a data sending manner of the blanking area.
  20. The method according to claim 19, wherein the coaxial data further comprises:
    a coaxial data identifier indicating that the data in the effective image area of an image frame is target data.
  21. The method according to claim 10, wherein sending the converted mixed data comprises:
    determining a first position of video stream data in an effective image area of an image frame, and determining a second position of target data in the effective image area of the image frame, the target data comprising the attribute and/or position information of the target;
    taking the video stream data and the target data as data of the same image frame, and, by using a data sending manner of the effective image area, sending the video stream data according to the first position and sending the target data according to the second position.
  22. The method according to claim 21, wherein before sending the converted mixed data, the method further comprises:
    obtaining coaxial data to be sent, and determining a third position of the coaxial data in a blanking area of the image frame;
    taking the coaxial data as data of the image frame in which the video stream data and the target data are located, and sending the coaxial data according to the third position by using a data sending manner of the blanking area.
  23. The method according to claim 10, wherein sending the converted mixed data comprises:
    determining a first position of video stream data in an effective image area of an image frame, and determining a second position of target data in a blanking area of the image frame, the target data comprising the attribute and/or position information of the target;
    taking the video stream data and the target data as data of the same image frame, sending the video stream data according to the first position by using a data sending manner of the effective image area, and sending the target data according to the second position by using a data sending manner of the blanking area.
  24. A data processing method, applied to a server in a monitoring system, the monitoring system further comprising an analog camera coaxially connected to the server, the method comprising:
    receiving data sent by an analog camera to be processed;
    separating the received data to obtain video stream data and image analysis data;
    processing the separated video stream data and the image analysis data respectively.
  25. The method according to claim 24, wherein after receiving the data sent by the analog camera to be processed, the method further comprises:
    determining whether the analog camera to be processed has an image analysis function;
    if yes, performing the step of separating the received data to obtain video stream data and image analysis data;
    if no, analyzing the received video stream data to obtain image analysis data, and processing the received video stream data and the image analysis data obtained by analysis respectively; wherein the image analysis data comprises attribute and/or position information of a target, the target being a target present in the video stream data.
  26. The method according to claim 25, wherein before receiving the data sent by the analog camera to be processed, the method further comprises:
    sending an attribute request instruction to the analog cameras in the system, and receiving device attributes fed back by the analog cameras in the system;
    recording the received device attributes fed back by the analog cameras;
    wherein determining whether the analog camera to be processed has an image analysis function comprises:
    searching, in the recorded device attributes, for the device attribute fed back by the analog camera to be processed;
    determining, according to the found device attribute, whether the analog camera to be processed has an image analysis function.
  27. The method according to claim 24, wherein separating the received data to obtain video stream data and image analysis data comprises:
    determining a blanking area and an image area in the received data according to a blanking area identifier;
    reading data in the blanking area and extracting the image analysis data;
    reading data in the image area and extracting the video stream data.
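The separation step in claim 27 amounts to walking the received byte stream, using the blanking-area identifier to tell blanking areas from image areas, and routing each region's bytes to the right output. The marker bytes below are purely illustrative assumptions; the claim does not specify the identifier's encoding.

```python
# Sketch of server-side separation; the <BLK> markers are assumed, not real.
BLANK_START, BLANK_END = b"<BLK>", b"</BLK>"

def separate(received):
    """Split received coaxial data into video stream data (image areas)
    and image analysis data (blanking areas marked by the identifier)."""
    analysis = bytearray()
    video = bytearray()
    pos = 0
    while True:
        start = received.find(BLANK_START, pos)
        if start == -1:
            video += received[pos:]  # trailing image-area data
            break
        end = received.find(BLANK_END, start)
        video += received[pos:start]                       # image area
        analysis += received[start + len(BLANK_START):end]  # blanking area
        pos = end + len(BLANK_END)
    return bytes(video), bytes(analysis)

video, analysis = separate(b"AAAA<BLK>meta1</BLK>BBBB<BLK>meta2</BLK>CC")
```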
  28. An analog camera, comprising a processor and a memory;
    the memory is configured to store a computer program;
    the processor is configured to, when executing the program stored in the memory, implement the method steps of any one of claims 10-23.
  29. A server, comprising a processor and a memory;
    the memory is configured to store a computer program;
    the processor is configured to, when executing the program stored in the memory, implement the method steps of any one of claims 24-27.
  30. A monitoring system, comprising the analog camera according to any one of claims 1-9 and a server, wherein
    the analog camera sends the converted mixed data to the server.
  31. The system according to claim 30, wherein the server comprises: a sensing signal receiving chip, an image signal receiving chip, and a signal processing chip, the sensing signal receiving chip and the image signal receiving chip each being connected to the signal processing chip;
    the sensing signal receiving chip is configured to receive a sensing signal sent by a wireless sensor and send the sensing signal to the signal processing chip;
    the image signal receiving chip is configured to receive an image signal sent by an analog camera and send the image signal to the signal processing chip;
    the signal processing chip is configured to perform association processing on the sensing signal and the image signal.
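The association processing performed by the signal processing chip in claim 31 can be sketched as pairing each sensing signal with the image frame nearest to it in time. The timestamp-window approach below is one illustrative assumption; the claim does not prescribe a specific association method.

```python
# Sketch of sensor/image association; the time-window heuristic is assumed.
def associate(sensor_events, image_frames, window=0.5):
    """Pair each (timestamp, reading) sensor event with the image frame
    whose timestamp is nearest, if within `window` seconds."""
    pairs = []
    for ts, reading in sensor_events:
        # Nearest frame in time, or None if there are no frames at all.
        best = min(image_frames, key=lambda f: abs(f[0] - ts), default=None)
        if best is not None and abs(best[0] - ts) <= window:
            pairs.append({"sensor": reading, "frame": best[1]})
    return pairs

result = associate([(10.0, "smoke"), (99.0, "pir")],
                   [(9.8, "frame_a"), (10.3, "frame_b")])
```

An association result like this is what the server would then test against its pre-stored alarm conditions.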
PCT/CN2018/092146 2017-10-20 2018-06-21 Analog camera, server, monitoring system, and data transmission and processing methods WO2019076076A1 (zh)

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
CN201710985836.5 2017-10-20
CN201710985130.9 2017-10-20
CN201710985838.4 2017-10-20
CN201710985838.4A CN109698933B (zh) 2017-10-20 2017-10-20 Data transmission method, camera, electronic device, and computer-readable storage medium
CN201710984275.7A CN109698932B (zh) 2017-10-20 2017-10-20 Data transmission method, camera, and electronic device
CN201721357044.5 2017-10-20
CN201710985839.9A CN109698895A (zh) 2017-10-20 2017-10-20 Analog camera, monitoring system, and data sending method
CN201710984275.7 2017-10-20
CN201710985839.9 2017-10-20
CN201710985130.9A CN109698900B (zh) 2017-10-20 2017-10-20 Data processing method and apparatus, and monitoring system
CN201721357044.5U CN207766402U (zh) 2017-10-20 2017-10-20 Server and monitoring system
CN201710985836.5A CN109698923B (zh) 2017-10-20 2017-10-20 Data transmission method, camera, and electronic device

Publications (1)

Publication Number Publication Date
WO2019076076A1 true WO2019076076A1 (zh) 2019-04-25

Family

ID=66173132

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/092146 WO2019076076A1 (zh) 2017-10-20 2018-06-21 Analog camera, server, monitoring system, and data transmission and processing methods

Country Status (1)

Country Link
WO (1) WO2019076076A1 (zh)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060066722A1 (en) * 2004-09-28 2006-03-30 Objectvideo, Inc. View handling in video surveillance systems
CN101540835A (zh) * 2008-03-20 2009-09-23 海南三基科技有限公司 高清数字ip摄像机
CN104038711A (zh) * 2013-03-05 2014-09-10 派视尔株式会社 图像传感器及包括它的监视系统
CN104065923A (zh) * 2014-06-23 2014-09-24 苏州阔地网络科技有限公司 一种在线同步课堂跟踪控制方法及系统
CN104519318A (zh) * 2013-09-27 2015-04-15 三星泰科威株式会社 图像监控系统和监视摄像机
CN105898207A (zh) * 2015-01-26 2016-08-24 杭州海康威视数字技术股份有限公司 视频数据的智能处理方法及系统


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220130151A1 (en) * 2019-07-23 2022-04-28 Zhejiang Xinsheng Electronic Technology Co., Ltd. Surveillance systems and methods
JP2022542789A (ja) Surveillance system and method


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18868652

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18868652

Country of ref document: EP

Kind code of ref document: A1