WO2018170685A1 - Video stitching method and system for monitoring cloud platform - Google Patents

Video stitching method and system for monitoring cloud platform

Info

Publication number
WO2018170685A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
video files
time
cloud platform
monitoring
Prior art date
Application number
PCT/CN2017/077332
Other languages
French (fr)
Chinese (zh)
Inventor
张北江
胡君健
Original Assignee
华平智慧信息技术(深圳)有限公司
Priority date
2017-03-20
Filing date
2017-03-20
Publication date
2018-09-27
Application filed by 华平智慧信息技术(深圳)有限公司
Priority to PCT/CN2017/077332
Publication of WO2018170685A1

Abstract

Disclosed in the present invention is a video stitching method for a monitoring cloud platform, the method comprising the following steps: a monitoring cloud platform obtains multiple video files, and extracts multiple feature points from the multiple video files and the times of the video files; the monitoring cloud platform extracts at least two video files that have the same feature points and whose times fall within a set range; and the monitoring cloud platform stitches the at least two video files together in chronological order, so that a stitched video is obtained. The technical solution provided by the present invention has the advantage of being convenient for users to watch.

Description

Video stitching method and system for monitoring cloud platform

Technical field

The present invention relates to the field of monitoring, and in particular, to a video stitching method and system for monitoring a cloud platform.

Background art

The monitoring system consists of five parts: camera, transmission, control, display, and recording. The camera transmits the video image to the control host through a coaxial video cable; the control host distributes the video signal to each monitor and recording device, and at the same time feeds the voice signal to be recorded into the recorder. Through the control host, the operator can issue commands to control the up, down, left, and right movement of the pan/tilt and the focus and zoom of the lens, and can switch among multiple cameras and pan/tilts via the control host. With a special recording processing mode, images can be recorded, played back, and processed, so that the recording effect is optimal.

The videos of the existing monitoring system are separate videos, and related videos are not spliced together, which is inconvenient for the user to watch.

Technical problem

The present application provides a video splicing method for monitoring a cloud platform, which solves the problem that the technical solutions of the prior art are inconvenient for the user to watch.

Technical solution

In one aspect, a video splicing method for monitoring a cloud platform is provided. The method includes the following steps: the monitoring cloud platform acquires multiple video files, and extracts multiple feature points from the multiple video files and the times of the video files; the monitoring cloud platform extracts at least two video files that have the same feature points and whose times fall within a set range; and the monitoring cloud platform splices the at least two video files together in chronological order to obtain a spliced video.

Optionally, the method further includes:

The monitoring cloud platform uses the earliest time and the latest time of at least two video files as the time period of the stitched video.

Optionally, the method further includes:

The monitoring cloud platform marks the splicing points of the spliced video.

In a second aspect, a video splicing system for monitoring a cloud platform is provided, the system comprising:

a receiving unit, configured to acquire multiple video files;

a processing unit, configured to extract a plurality of feature points from the plurality of video files and the times of the video files, extract at least two video files that have the same feature points and whose times fall within a set range, and splice the at least two video files together in chronological order to obtain a stitched video.

Optionally, the system further includes:

The processing unit is configured to use the earliest time and the latest time of the at least two video files as the time period of the stitched video.

Optionally, the system further includes:

A processing unit for marking the splicing points of the spliced video.

In a third aspect, a monitoring system is provided, including: a processor, a wireless transceiver, a memory, and a bus, wherein the processor, the wireless transceiver, and the memory are connected by the bus, and the wireless transceiver is configured to acquire multiple video files;

The processor is configured to extract a plurality of feature points from the plurality of video files and the times of the video files, extract at least two video files that have the same feature points and whose times fall within a set range, and splice the at least two video files together in chronological order to obtain a stitched video.

Optionally, the processor is specifically configured to use the earliest time and the latest time of the at least two video files as the time period of the stitched video.

Optionally, the processor is configured to mark a splicing point of the spliced video.

Beneficial effect

By splicing together a plurality of videos that have the same feature points and fall within the set time period, the technical solution provided by the present invention has the advantage of being convenient for the user to view.

DRAWINGS

In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can also obtain other drawings based on these drawings without creative effort.

FIG. 1 is a flowchart of a video splicing method for monitoring a cloud platform according to a first preferred embodiment of the present invention;

FIG. 2 is a structural diagram of a video splicing system for monitoring a cloud platform according to a second preferred embodiment of the present invention.

FIG. 3 is a hardware structural diagram of a monitoring system according to a second preferred embodiment of the present invention.

Embodiments of the invention

The technical solutions in the embodiments of the present invention are clearly and completely described in the following with reference to the accompanying drawings in the embodiments of the present invention. It is obvious that the described embodiments are a part of the embodiments of the present invention, but not all embodiments. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative efforts are within the scope of the present invention.

Please refer to FIG. 1. FIG. 1 shows a video splicing method for monitoring a cloud platform according to a first preferred embodiment of the present invention. As shown in FIG. 1, the method includes the following steps:

Step S101: The monitoring cloud platform acquires a plurality of video files, and extracts a plurality of feature points from the plurality of video files and the times of the video files.

Step S102: The monitoring cloud platform extracts at least two video files that have the same feature points and whose times fall within a set range.

Step S103: The monitoring cloud platform splices the at least two video files together in chronological order to obtain a stitched video.
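For illustration only (this sketch is not part of the original disclosure), the following minimal Python example shows one way steps S101 to S103 could be realized, assuming the feature points have already been extracted from each file by some detector (the patent does not name one) and that the byte-level concatenation is delegated to an external tool such as ffmpeg's concat demuxer. The names VideoFile, select_related, and splice are hypothetical.

    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import FrozenSet, List


    @dataclass(frozen=True)
    class VideoFile:
        path: str
        start_time: datetime            # time of the video file (step S101)
        feature_points: FrozenSet[str]  # feature points extracted from its frames (step S101)


    def select_related(videos: List[VideoFile], window: timedelta) -> List[VideoFile]:
        """Step S102: keep files that share at least one feature point with
        another file and whose times fall within the set range (window)."""
        ordered = sorted(videos, key=lambda v: v.start_time)
        selected: List[VideoFile] = []
        for i, a in enumerate(ordered):
            for b in ordered[i + 1:]:
                if b.start_time - a.start_time > window:
                    break  # sorted by time, so later files are even further apart
                if a.feature_points & b.feature_points:
                    for v in (a, b):
                        if v not in selected:
                            selected.append(v)
        return selected


    def splice(videos: List[VideoFile], list_path: str) -> List[VideoFile]:
        """Step S103: order the selected files chronologically and write a
        concat list; the actual stream copy would be performed externally."""
        ordered = sorted(videos, key=lambda v: v.start_time)
        with open(list_path, "w") as f:
            for v in ordered:
                f.write(f"file '{v.path}'\n")  # ffmpeg concat-demuxer list entry
        return ordered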

By splicing together a plurality of videos that have the same feature points and fall within the set time period, the technical solution provided by the present invention has the advantage of being convenient for the user to view.

Optionally, the monitoring cloud platform uses the earliest time and the latest time of the at least two video files as the time period of the stitched video.

Optionally, the monitoring cloud platform marks the splicing points of the spliced video.
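Continuing the sketch above (again for illustration only; stitched_metadata is a hypothetical helper and VideoFile is the dataclass from the previous sketch), the two optional features can be recorded as metadata of the stitched video: the time period spans the earliest and latest times of the selected files, and every boundary between two consecutive source files is marked as a splicing point.

    from typing import Dict, List


    def stitched_metadata(ordered: List[VideoFile]) -> Dict[str, object]:
        """Optional features: the stitched video's time period runs from the
        earliest to the latest time of the selected files, and each boundary
        between consecutive source files is marked as a splicing point.
        Assumes at least two files, as guaranteed by step S102."""
        return {
            "time_period": (ordered[0].start_time, ordered[-1].start_time),
            "splice_points": [v.start_time for v in ordered[1:]],  # one mark per file boundary
        }

A caller on the monitoring cloud platform would build one VideoFile per stored clip, run select_related(clips, timedelta(minutes=5)), pass the result to splice, and attach stitched_metadata(...) to the stitched output; the five-minute window is only an example of the set range.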

Please refer to FIG. 2. FIG. 2 shows a video splicing system for monitoring a cloud platform according to a second preferred embodiment of the present invention. As shown in FIG. 2, the system includes:

The receiving unit 201 is configured to acquire multiple video files.

The processing unit 202 is configured to extract a plurality of feature points from the plurality of video files and the times of the video files, extract at least two video files that have the same feature points and whose times fall within a set range, and splice the at least two video files together in chronological order to obtain the stitched video.

By splicing together a plurality of videos that have the same feature points and fall within the set time period, the technical solution provided by the present invention has the advantage of being convenient for the user to view.

Optionally, the processing unit 202 is further configured to use the earliest time and the latest time of the at least two video files as the time period of the stitched video.

Optionally, the processing unit 202 is configured to mark the splicing point of the spliced video.

Referring to FIG. 3, FIG. 3 shows a monitoring system 30, including: a processor 301, a wireless transceiver 302, a memory 303, and a bus 304. The wireless transceiver 302 is configured to transmit data to and receive data from an external device. The number of processors 301 may be one or more. In some embodiments of the present application, the processor 301, the transceiver 302, and the memory 303 may be connected by the bus 304 or in other ways. The monitoring system 30 can be used to perform the steps of FIG. 1. For the meaning and examples of the terms involved in this embodiment, reference may be made to the corresponding embodiment of FIG. 1; details are not described herein again.

The wireless transceiver 302 is configured to acquire a plurality of video files.

The program code is stored in the memory 303. The processor 301 is configured to call the program code stored in the memory 303 to perform the following operations:

The processor 301 is configured to extract a plurality of feature points from the plurality of video files and the times of the video files, extract at least two video files that have the same feature points and whose times fall within a set range, and splice the at least two video files together in chronological order to obtain the stitched video.

It should be noted that the processor 301 herein may be a single processing element or a collective term for a plurality of processing elements. For example, the processing element may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application, for example, one or more digital signal processors (DSPs) or one or more field-programmable gate arrays (FPGAs).

The memory 303 may be a single storage device or a collective term for a plurality of storage elements, and is used to store executable program code or the parameters, data, and the like required for the running of the application device. The memory 303 may include a random access memory (RAM), and may also include a non-volatile memory, such as a disk memory or a flash memory (Flash).

The bus 304 may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, only one thick line is shown in FIG. 3, but this does not mean that there is only one bus or only one type of bus.

The terminal may further include an input/output device connected to the bus 304, so as to be connected to other parts such as the processor 301 via the bus. The input/output device may provide an input interface for the operator so that the operator can select control items through the input interface, and may also be another interface through which other external devices can be connected.

It should be noted that, for brevity, the foregoing method embodiments are all described as a series of action combinations, but those skilled in the art should understand that the present invention is not limited by the described order of actions, because certain steps may be performed in other orders or concurrently in accordance with the present invention. In addition, those skilled in the art should also understand that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.

In the above embodiments, the description of each embodiment has its own emphasis; for the parts that are not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.

A person skilled in the art may understand that all or part of the steps of the foregoing embodiments may be completed by a program instructing related hardware. The program may be stored in a computer-readable storage medium, and the storage medium may include: a flash drive, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

The video splicing method for monitoring a cloud platform and the related device and system provided by the embodiments of the present invention are described in detail above. Specific examples are used herein to explain the principles and implementation manners of the present invention, and the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. Meanwhile, a person of ordinary skill in the art may, according to the idea of the present invention, make changes to the specific implementation manners and the scope of application. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (9)

  1.  A video splicing method for monitoring a cloud platform, characterized in that the method comprises the following steps:
    The monitoring cloud platform acquires multiple video files, and extracts multiple feature points from the multiple video files and the times of the video files;
    The monitoring cloud platform extracts at least two video files that have the same feature points and whose times fall within a set range;
    The monitoring cloud platform splices the at least two video files together in chronological order to obtain a stitched video.
  2. The method of claim 1 further comprising:
    The monitoring cloud platform uses the earliest time and the latest time of at least two video files as the time period of the stitched video.
  3. The method of claim 1, wherein the method further comprises:
    The monitoring cloud platform marks the splicing points of the spliced video.
  4. A video splicing system for monitoring a cloud platform, the system comprising:
    a receiving unit, configured to acquire multiple video files;
    a processing unit, configured to extract a plurality of feature points from the plurality of video files and the times of the video files, extract at least two video files that have the same feature points and whose times fall within a set range, and splice the at least two video files together in chronological order to obtain a stitched video.
  5. The system of claim 4, wherein the system further comprises:
    The processing unit is configured to use the earliest time and the latest time of the at least two video files as the time period of the stitched video.
  6. The system of claim 5, wherein the system further comprises:
    A processing unit for marking the splicing points of the spliced video.
  7. A monitoring system includes: a processor, a wireless transceiver, a memory, and a bus, wherein the processor, the wireless transceiver, and the memory are connected by a bus, wherein
    The wireless transceiver is configured to acquire a plurality of video files;
    The processor is configured to extract a plurality of feature points from the plurality of video files and the times of the video files, extract at least two video files that have the same feature points and whose times fall within a set range, and splice the at least two video files together in chronological order to obtain a stitched video.
  8. The monitoring system according to claim 7, wherein the processor is specifically configured to use the earliest time and the latest time of the at least two video files as the time period of the stitched video.
  9. The monitoring system according to claim 7, wherein the processor is configured to mark a splicing point of the spliced video.
PCT/CN2017/077332 2017-03-20 2017-03-20 Video stitching method and system for monitoring cloud platform WO2018170685A1 (en)

Priority Applications (1)

Application Number: PCT/CN2017/077332 (WO2018170685A1, en) · Priority Date: 2017-03-20 · Filing Date: 2017-03-20 · Title: Video stitching method and system for monitoring cloud platform

Applications Claiming Priority (1)

Application Number: PCT/CN2017/077332 (WO2018170685A1, en) · Priority Date: 2017-03-20 · Filing Date: 2017-03-20 · Title: Video stitching method and system for monitoring cloud platform

Publications (1)

Publication Number: WO2018170685A1 (en) · Publication Date: 2018-09-27

Family

ID=63583968

Family Applications (1)

Application Number: PCT/CN2017/077332 (WO2018170685A1, en) · Priority Date: 2017-03-20 · Filing Date: 2017-03-20 · Title: Video stitching method and system for monitoring cloud platform

Country Status (1)

Country Link
WO (1) WO2018170685A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101000859B1 (en) * 2009-05-25 2010-12-13 연세대학교 산학협력단 HDRI-Panorama image generation apparatus and method with fisheye lens
CN102354449A (en) * 2011-10-09 2012-02-15 昆山市工业技术研究院有限责任公司 Internet of vehicles-based method for realizing image information sharing and device and system thereof
CN102737509A (en) * 2012-06-29 2012-10-17 惠州天缘电子有限公司 Method and system for realizing image information sharing based on internet of vehicles
CN106060479A (en) * 2016-07-13 2016-10-26 三峡大学 Smart pasture monitoring system based on beyond visual range video technology
CN106954030A (en) * 2017-03-20 2017-07-14 华平智慧信息技术(深圳)有限公司 Monitor the video-splicing method and system of cloud platform

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 17901511

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN EP: public notification in the EP bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 20.01.2020)