WO2014100966A1 - Method, terminal and system for playing video - Google Patents
Method, terminal and system for playing video
- Publication number
- WO2014100966A1 (PCT/CN2012/087391)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19665—Details related to the storage of video surveillance data
- G08B13/19669—Event triggers storage or change of storage policy
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/87—Regeneration of colour television signals
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19654—Details concerning communication with a camera
- G08B13/19656—Network used to communicate with a camera, e.g. WAN, LAN, Internet
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19665—Details related to the storage of video surveillance data
- G08B13/19671—Addition of non-video data, i.e. metadata, to video stream
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
Definitions
- The present invention relates to the field of video surveillance, and in particular to a method, a terminal, and a system for playing video in the field of video surveillance.
Background Art
- Existing video surveillance clients generally play the video pictures of multiple cameras at the same time, and as video resolution increases, the total resolution of the pictures from multiple cameras often exceeds the resolution of the client monitor. Taking a 22-inch display as an example, the maximum supported resolution is typically 1920*1080, which means that only one 1080p picture can be played at full size; if multiple 1080p pictures are played simultaneously on one monitor, each picture can only be scaled down.
- A typical video surveillance client also has, in addition to the playback window, a number of auxiliary function panels such as a title bar, a camera list, and a pan/tilt control panel, which further reduce the space available for the video picture. Thus the playback window plays a picture that is much smaller than the original picture.
- When an event occurs in the video picture (such as an intelligent-analysis trigger event) and the picture has been scaled down for playback, the area showing the event becomes even smaller, which is inconvenient for the user to view. When personnel monitor the picture, the reduced picture makes it difficult for the observer to notice details, so key information may be missed.
- Embodiments of the present invention provide a method, a terminal, and a system for playing video, which can improve the user experience.
- An embodiment of the present invention provides a method for playing video, the method comprising: dividing an original play picture into at least two regions of interest; determining a first region of interest, among the at least two regions of interest, that has a trigger event; acquiring decoded data of a first video picture displayed in the first region of interest; and rendering the decoded data of the first video picture to a specified play window for playing.
- the method further includes: determining a correspondence between each of the at least two regions of interest and a specified play window;
- The rendering of the decoded data to the specified play window for playing includes: rendering the decoded data of the first video picture, according to the correspondence, to the specified play window corresponding to the first region of interest for playing.
- The determining of the first region of interest having a trigger event among the at least two regions of interest includes: detecting a trigger operation by the user on a region of interest in the original play picture, the trigger operation including a click operation, a double-click operation, or a selection operation on the region of interest; and determining the region of interest having the trigger operation as the first region of interest.
- The determining of the first region of interest having a trigger event among the at least two regions of interest includes: acquiring coordinate metadata of the trigger event occurrence point in the original play picture; and determining, according to the coordinate metadata, the region of interest to which the trigger event occurrence point belongs as the first region of interest.
- The acquiring of the decoded data of the first video picture displayed in the first region of interest includes: acquiring decoded data of the original play picture; and determining the decoded data of the first video picture according to the decoded data of the original play picture.
- The rendering of the decoded data of the first video picture to the specified play window for playing includes: rendering the decoded data of the first video picture to the specified play window for enlarged playing, wherein the specified play window is larger than the first region of interest.
- The rendering of the decoded data of the first video picture to the specified play window for playing includes: popping up an independent play window; and rendering the decoded data of the first video picture to the independent play window for playing.
- An embodiment of the present invention provides a terminal for playing video, the terminal including: a dividing module, configured to divide an original play picture into at least two regions of interest; a first determining module, configured to determine a first region of interest having a trigger event among the at least two regions of interest divided by the dividing module; an acquiring module, configured to acquire decoded data of a first video picture displayed in the first region of interest determined by the first determining module; and a playing module, configured to render the decoded data of the first video picture acquired by the acquiring module to a specified play window for playing.
- the terminal further includes: a second determining module, configured to determine a correspondence between each of the at least two regions of interest and the specified play window;
- The playing module is further configured to render, according to the correspondence determined by the second determining module, the decoded data of the first video picture acquired by the acquiring module to the specified play window corresponding to the first region of interest for playing.
- The first determining module includes: a first determining unit, configured to detect a trigger operation by the user on a region of interest in the original play picture, the trigger operation including a click operation, a double-click operation, or a selection operation on the region of interest; and a second determining unit, configured to determine the region of interest having the trigger operation detected by the first determining unit as the first region of interest.
- The first determining module includes: a first acquiring unit, configured to acquire coordinate metadata of the trigger event occurrence point in the original play picture; and a third determining unit, configured to determine, according to the coordinate metadata acquired by the first acquiring unit, the region of interest to which the trigger event occurrence point belongs as the first region of interest.
- The acquiring module includes: a second acquiring unit, configured to acquire decoded data of the original play picture; and a third determining unit, configured to determine the decoded data of the first video picture according to the decoded data of the original play picture acquired by the second acquiring unit.
- The playing module is further configured to render the decoded data of the first video picture to the specified play window for enlarged playing, wherein the specified play window is larger than the first region of interest.
- The playing module includes: a pop-up unit, configured to pop up an independent play window; and a playing unit, configured to render the decoded data of the first video picture to the independent play window popped up by the pop-up unit for playing.
- An embodiment of the present invention provides a system for playing video, the system comprising: a terminal according to the second aspect of the present invention; a video capture system, configured to collect video images and generate a media stream by encoding the video images; a server, configured to obtain the media stream generated by the video capture system and provide the media stream to the terminal; and a storage device, configured to store the media stream obtained by the server. The terminal includes: a dividing module, configured to divide an original play picture into at least two regions of interest; a first determining module, configured to determine a first region of interest having a trigger event among the at least two regions of interest divided by the dividing module; an acquiring module, configured to acquire decoded data of the first video picture displayed in the first region of interest determined by the first determining module; and a playing module, configured to render the decoded data of the first video picture acquired by the acquiring module to a specified play window for playing.
- According to the method, terminal, and system for playing video in the embodiments of the present invention, by dividing the original play picture into a plurality of regions of interest and separately displaying the picture of a region of interest having a trigger event, on the one hand the user can observe clearer picture details of the region of interest, and on the other hand the user can simultaneously track the picture details of multiple regions of interest, thereby significantly improving the user experience.
- FIG. 1 is a schematic structural diagram of an exemplary application scenario of an embodiment of the present invention.
- FIG. 2 is a schematic flowchart of a method for playing a video according to an embodiment of the present invention.
- FIG. 3 is another schematic flowchart of a method for playing a video according to an embodiment of the present invention.
- FIG. 4 is a schematic flowchart of a method of dividing a region of interest according to an embodiment of the present invention.
- FIG. 5 is another schematic flowchart of a method of dividing a region of interest according to an embodiment of the present invention.
- FIG. 6 is a schematic flow diagram of a method of determining a region of interest with a triggering event, in accordance with an embodiment of the present invention.
- FIG. 7 is another schematic flow diagram of a method of determining a region of interest with a triggering event in accordance with an embodiment of the present invention.
- FIG. 8 is a schematic flowchart of a method of acquiring decoded data of a region of interest according to an embodiment of the present invention.
- FIG. 9 is a schematic flow diagram of a method of playing a picture of a region of interest, in accordance with an embodiment of the present invention.
- FIG. 10 is a schematic flowchart of a method for playing a video according to another embodiment of the present invention.
- FIG. 11A and FIG. 11B are another schematic flowchart of a method of playing a video according to another embodiment of the present invention.
- FIG. 12A and FIG. 12B are schematic diagrams of playing a region of interest in accordance with an embodiment of the present invention.
- FIG. 13 is a schematic block diagram of a terminal according to an embodiment of the present invention.
- FIG. 14 is another schematic block diagram of a terminal according to an embodiment of the present invention.
- FIG. 15 is a schematic block diagram of a first determining module in accordance with an embodiment of the present invention.
- FIG. 16 is another schematic block diagram of a first determining module according to an embodiment of the present invention.
- FIG. 17 is a schematic block diagram of an acquisition module in accordance with an embodiment of the present invention.
- FIG. 18 is a schematic block diagram of a playback module in accordance with an embodiment of the present invention.
- FIG. 19 is a schematic block diagram of a system in accordance with an embodiment of the present invention.
- FIG. 20 is a schematic block diagram of a terminal according to another embodiment of the present invention.
Detailed Description
- FIG. 1 shows a schematic architectural diagram of an exemplary application scenario of an embodiment of the present invention.
- A video surveillance system suitable for applying the embodiments of the present invention may include: a video capture device, a central server, a storage device, and a terminal running a client. The video capture device may be used to collect video images and may encode the video images to generate a media stream for network transmission.
- The video capture device may include: a network camera, an analog camera, an encoder, a digital video recorder (Digital Video Recorder, "DVR" for short), and the like.
- After connecting to the central server, the client can request the video stream, decode and display it, and provide users with live video viewing.
- The central server may include a management server and a media server. The media server may be responsible for receiving the media stream, recording the media stream data on the storage device, and forwarding the media stream to the client for on-demand playback; the management server may be responsible for user login, authentication, service scheduling, and other functions. The central server can also accept access from multiple clients and manage the connections within the video surveillance system through the network.
- The storage device may be, for example, a disk array responsible for storing the video data; it may use network attached storage (Network Attached Storage, "NAS" for short), a storage area network (Storage Area Network, "SAN" for short), or the server's own storage.
- FIG. 1 shows only one scenario suitable for applying the method of the present invention and is not intended to impose any limitation on the use or functionality of the present invention; the present invention should not be construed as having any requirement relating to any one or combination of the components of the video surveillance system shown. However, to explain the present invention more clearly, the following description takes this video surveillance system application scenario as an example, although the present invention is not limited thereto.
- The video data transmission solution in the embodiments of the present invention may use various communication networks or communication systems, for example: the Global System for Mobile Communications ("GSM"), Code Division Multiple Access ("CDMA") systems, Wideband Code Division Multiple Access ("WCDMA") systems, the General Packet Radio Service ("GPRS"), Long Term Evolution ("LTE") systems, LTE Frequency Division Duplex ("FDD") systems, LTE Time Division Duplex ("TDD") systems, the Universal Mobile Telecommunications System, and so on.
- the method 100 includes:
- The device playing the video may first divide the original play picture into multiple regions of interest.
- According to the method for playing video in the embodiment of the present invention, by dividing the original play picture into a plurality of regions of interest and separately displaying the picture of the region of interest having the trigger event, on the one hand the user can observe clearer picture details of the region of interest, and on the other hand the user can simultaneously track the picture details of multiple regions of interest, thereby significantly improving the user experience.
- The video may include a video file and may also include a real-time video stream.
- For ease of description, the embodiments of the present invention are described using real-time video stream playback as an example, but the embodiments of the present invention are not limited thereto.
- the method 100 further includes:
- The rendering of the decoded data of the first video picture to the specified play window for playing includes:
- rendering the decoded data of the first video picture to the specified play window corresponding to the first region of interest for playing. That is, each region of interest may be associated with one or more play windows used to play the picture of that region of interest when it has a trigger event.
- The specified window may be the maximum play window of the display device or part of the maximum play window; it may be a currently existing play window, part of an existing play window, or a newly popped-up or newly created play window. The embodiment of the present invention is not limited thereto.
- Dividing the original play picture into at least two regions of interest includes: dividing the original play picture into the at least two regions of interest in an equal-division manner or a free-division manner.
- Multiple regions of interest may be pre-divided at the client; the sizes of the regions of interest may be equal or unequal, and a region of interest may even be set as an irregular region.
- the correspondence between different regions of interest and the play window may also be determined.
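The correspondence between regions of interest and play windows described above can be sketched as a simple lookup table. The window identifiers and the pop-up fallback below are illustrative assumptions, not names from the patent:

```python
# Hypothetical mapping from region-of-interest index to a play-window id,
# as might be saved in the client configuration.
WINDOW_FOR_REGION = {0: "win_0", 1: "win_1", 2: "win_2", 3: "win_3"}

def resolve_play_window(region_index, mapping=WINDOW_FOR_REGION):
    """Return the play window configured for a region of interest,
    falling back to an independent pop-up window (assumed behaviour)."""
    return mapping.get(region_index, "popup")
```

A region with no configured window would then trigger the independent pop-up window path described later.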
- the division of the area of interest can be manually split by the user, or automatically configured by the client software, and saved by the client.
- The picture can be divided into equal parts or divided freely.
- the specific configuration process is shown in Figures 4 and 5, for example.
- the method of dividing the region of interest in an equally divided manner includes:
- the method for dividing the region of interest by means of free segmentation may include, for example:
- The original play picture in which the regions of interest are divided may be the entire picture in the maximum play window of the display device, or one or more of multiple pictures played simultaneously in the maximum play window; embodiments of the present invention are not limited thereto.
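The equal-division case above can be sketched as follows; the function name and the (x, y, w, h) rectangle convention are illustrative assumptions:

```python
def divide_equally(width, height, rows, cols):
    """Divide an original play picture of width x height into a
    rows x cols grid of equal regions of interest, each returned
    as an (x, y, w, h) rectangle in picture coordinates."""
    region_w, region_h = width // cols, height // rows
    return [(c * region_w, r * region_h, region_w, region_h)
            for r in range(rows) for c in range(cols)]

# A 1920x1080 original picture split into four equal regions of interest:
rois = divide_equally(1920, 1080, 2, 2)
```

Free division would instead let the user draw arbitrary (possibly irregular) regions, which the client then saves alongside the window correspondence.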
- The device playing the video determines a first region of interest having a trigger event among the at least two regions of interest, so as to display the picture of the first region of interest in a separate play window, thereby improving the display of picture details.
- The event may be triggered manually by the user, with the region of interest determined accordingly, or a trigger generated automatically by an event may be detected and the region of interest determined from it; both are described below in conjunction with FIG. 6 and FIG. 7.
- The determining of the first region of interest having a trigger event among the at least two regions of interest includes:
- detecting a trigger operation by the user on a region of interest in the original play picture, where the trigger operation includes: a click operation, a double-click operation, or a selection operation on the region of interest.
- The user may operate the interface, for example by triggering a region of interest in the original play picture, to play the picture of the region of interest in which the event occurs to a previously specified play window, or to display it through a popped-up independent play window; when multiple regions of interest have events, multiple windows can be triggered for display.
- The trigger operation is, for example, a click operation, a double-click operation, or a selection operation on the region of interest; the embodiment of the present invention is not limited thereto.
- FIG. 7 illustrates another schematic flow diagram of a method of determining a region of interest with a triggering event in accordance with an embodiment of the present invention.
- The determining of the first region of interest having a trigger event among the at least two regions of interest includes:
- the user can pre-configure an area that needs to automatically detect an event, and configure event detection rules, such as motion detection or intelligent analysis detection.
- The client software may determine the corresponding pre-configured region of interest according to the coordinate metadata of the trigger event occurrence point, and can thereby play the corresponding picture to the previously specified play window or display it by popping up an independent play window; when multiple regions of interest have events, multiple windows can be triggered for display.
- A trigger event may cover multiple regions of interest; in that case, the multiple regions of interest may all be determined as first regions of interest having the trigger event, and the embodiment of the present invention is not limited thereto.
- The device playing the video may determine whether there is a trigger event in a region of interest by means of motion detection, intelligent analysis detection, or the like.
- Alternatively, the central server may perform the detection to determine whether there is a trigger event in a region of interest; upon detecting a trigger event, it may feed the coordinate metadata of the trigger event occurrence point back to the device playing the video, so that the device can determine the first region of interest having the trigger event according to the coordinate metadata.
- the embodiment of the invention is not limited thereto.
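Mapping the coordinate metadata of a trigger event to its region of interest amounts to a point-in-rectangle hit test. A minimal sketch, assuming regions are stored as (x, y, w, h) rectangles (an assumption of this example, not the patent's data layout):

```python
def region_for_point(regions, x, y):
    """Given regions of interest as (x0, y0, w, h) rectangles, return
    the index of the region containing the trigger-event point (x, y)
    from the coordinate metadata, or None if it lies outside all."""
    for i, (x0, y0, w, h) in enumerate(regions):
        if x0 <= x < x0 + w and y0 <= y < y0 + h:
            return i
    return None

# Four equal regions of a 1920x1080 picture; an event at (1200, 300)
# falls in the top-right region of interest:
regions = [(0, 0, 960, 540), (960, 0, 960, 540),
           (0, 540, 960, 540), (960, 540, 960, 540)]
hit = region_for_point(regions, 1200, 300)
```

An event covering several regions would repeat this test per region and collect every index that matches.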
- the device playing the video acquires the decoded data of the first video picture displayed in the first region of interest, so as to play the first video picture in the specified play window.
- The acquiring of the decoded data of the first video picture displayed in the first region of interest includes:
- S131: Acquire decoded data of the original play picture.
- S132: Determine the decoded data of the first video picture according to the decoded data of the original play picture.
- After the device playing the video receives an event manually triggered by the user (including a mouse click, a double-click, a toolbar-button click, or a shortcut-key trigger), or determines the first region of interest having the trigger event according to the coordinate metadata, the device may intercept the data content belonging to the region of interest from the decoded YUV data of the original play window, and may play that content in a pre-defined play pane according to the pre-configured correspondence (or in a popped-up independent play window). Since the multiple play windows use the same YUV data source, the device does not need to pull or add additional video streams.
- For example, take the resolution of the original play picture to be Width x Height; let the start point of the region of interest have abscissa StartX and ordinate StartY, and the end point have abscissa EndX and ordinate EndY. The YUV data of the original play picture is stored in the array Org[Width x Height], and the YUV data of the region of interest in Dst[ROIWidth x ROIHeight], where n is any point in the region of interest. The YUV data of the region of interest can then be determined according to the following equations:
- ROIWidth = EndX - StartX
- ROIHeight = EndY - StartY
- Dst[n] = Org[(StartY + n / ROIWidth) * Width + StartX + n mod ROIWidth]
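The interception step can be sketched as below for a single flat plane (Y samples only, in the row-major layout assumed above); a real client would also crop the subsampled U and V planes, which this example omits:

```python
def crop_roi_luma(org, width, start_x, start_y, end_x, end_y):
    """Copy the region-of-interest samples out of a flat luma plane Org
    into Dst, row by row, following ROIWidth = EndX - StartX and
    ROIHeight = EndY - StartY."""
    roi_width = end_x - start_x
    roi_height = end_y - start_y
    dst = []
    for row in range(roi_height):
        # Offset of this ROI row inside the original plane.
        base = (start_y + row) * width + start_x
        dst.extend(org[base:base + roi_width])
    return dst

# A tiny 4x4 "plane" with samples 0..15; crop the 2x2 region (1,1)-(3,3):
plane = list(range(16))
roi = crop_roi_luma(plane, 4, 1, 1, 3, 3)
```

Because the crop reads from the already-decoded plane, every play window can share one decoded source, matching the point that no extra video streams are needed.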
- the device playing the video renders the decoded data of the first video picture to a designated play window for playing.
- The device playing the video may play the first video picture in a pop-up window, in a newly created play window, or in the original play window, and may digitally zoom the first video picture to fit the size of the play window.
- the specified window may be a maximum play window of the display device, or may be part of the maximum play window; the specified window may be a currently existing play window or a part of the existing play window.
- a play window may also be newly popped up or newly created; the specified window may be one window or more than one window, and the embodiment of the present invention is not limited thereto.
- the rendering of the decoded data of the first video picture to the specified play window for playing includes:
- the decoded data of the first video picture is rendered to the designated play window for enlarged play, wherein the designated play window is larger than the first region of interest.
- the rendering of the decoded data of the first video picture to the specified play window for playing includes:
- the specified window is a pop-up independent play window
- the pop-up independent play window may be larger than the first region of interest so as to perform enlarged play of the first video picture, but the embodiment of the present invention is not limited thereto.
- the independent play window can also be smaller than or equal to the first region of interest.
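The digital scaling mentioned above, fitting the region of interest to a play window that is larger (or smaller) than the region itself, can be sketched with a minimal nearest-neighbour resampler. The patent does not specify a scaling algorithm, so this is an assumption for illustration; `scale_nearest` and its parameter names are not from the patent.

```python
def scale_nearest(src, src_w, src_h, dst_w, dst_h):
    """Nearest-neighbour resample of a row-major plane: an illustrative
    stand-in for the digital zoom applied when the specified play window
    differs in size from the region of interest."""
    dst = [0] * (dst_w * dst_h)
    for y in range(dst_h):
        sy = y * src_h // dst_h          # source row for this output row
        for x in range(dst_w):
            sx = x * src_w // dst_w      # source column for this output column
            dst[y * dst_w + x] = src[sy * src_w + sx]
    return dst

roi = [1, 2,
       3, 4]                              # 2x2 region-of-interest plane
window = scale_nearest(roi, 2, 2, 4, 4)   # enlarged play in a 4x4 window
```

A production player would typically delegate this to the GPU or a library scaler (e.g. bilinear filtering) rather than a per-pixel Python loop.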
- B corresponding to A means that B is associated with A and can be determined according to A. However, it should also be understood that determining B according to A does not mean determining B only according to A; B can also be determined according to A and/or other information.
- the size of the sequence numbers of the above processes does not imply an order of execution; the order of execution of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation processes of the embodiments of the present invention.
- the method for playing a video in the embodiment of the present invention divides the original playback picture into multiple regions of interest and separately displays the picture of the region of interest in which the trigger event occurs. On the one hand, this enables the user to observe the picture details of the region of interest more clearly and intuitively; on the other hand, it enables the user to simultaneously track the picture details of multiple regions of interest, thereby significantly improving the user experience. Further, the embodiment of the present invention uses the original decoded data to play the picture of the region of interest, without adding additional video streams.
- the method 200 for playing a video may be performed by a device that plays a video, such as a terminal or a client.
- the method 200 may include:
- a graphical user interface (GUI)
- Figure 11A shows a schematic flow diagram of a method 300 of manually triggering a region of interest, which may be performed by a device that plays video, such as a terminal or client, in accordance with an embodiment of the present invention.
- the method 300 can include:
- the region-of-interest YUV data is rendered to the specified play window for playing. For example, as shown in FIG. 12A, the entire play window includes an original play picture window and three specified play windows of the same size as the original play picture window, where the original play picture is divided into 16 regions of interest and the picture of the region of interest in which the manually triggered event occurs is enlarged and played in one of the specified play windows;
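The pre-configured correspondence between triggered regions and the three specified play windows of FIG. 12A could be as simple as a round-robin assignment. The following is an illustrative sketch under that assumption; `on_trigger`, `pane_to_roi`, and the round-robin policy are not from the patent.

```python
# Illustrative pane routing for the FIG. 12A layout: 16 regions of interest,
# three specified play windows. The round-robin policy is an assumption.
NUM_PANES = 3
pane_to_roi = {}   # specified play window index -> region of interest shown
_next_pane = 0

def on_trigger(roi_index):
    """Route a manually triggered region of interest to a specified pane,
    cycling through the panes so the newest triggers replace the oldest."""
    global _next_pane
    pane = _next_pane % NUM_PANES
    pane_to_roi[pane] = roi_index
    _next_pane += 1
    return pane

assert on_trigger(5) == 0   # first trigger fills the first specified window
assert on_trigger(9) == 1   # second trigger fills the second
```

A deployment could equally use a fixed mapping (each region bound to one pane) as suggested by the "pre-configured correspondence" wording.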
- Figure 11B shows a schematic flowchart of a method 400 in which an event automatically triggers a region of interest, in accordance with an embodiment of the present invention.
- the method 400 can include:
- the picture is rendered to the specified play window for playing; for example, as shown in FIG. 12B, the entire play window includes an original play picture window and three specified play windows of the same size as the original play picture window, where the original play picture is divided into 16 regions of interest and the picture of the region of interest with the trigger event is enlarged and played in one of the specified play windows;
- the size of the sequence numbers of the above processes does not imply an order of execution; the order of execution of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation processes of the embodiments of the present invention.
- the method for playing a video in the embodiment of the present invention divides the original play picture into multiple regions of interest and separately displays the picture of the region of interest with the trigger event. On the one hand, this enables the user to observe the picture details of the region of interest more clearly; on the other hand, it also enables the user to simultaneously track the picture details of multiple regions of interest, thereby significantly improving the user experience.
- the method of playing a video according to an embodiment of the present invention is described in detail above with reference to FIG. 1 to FIG. 12B.
- a terminal and system for playing a video according to an embodiment of the present invention will be described in detail below with reference to FIGS. 13 to 20.
- FIG. 13 shows a schematic block diagram of a terminal 500 in accordance with an embodiment of the present invention. As shown in FIG. 13, the terminal 500 includes:
- a dividing module 510 configured to divide the original playing picture into at least two regions of interest; the first determining module 520 is configured to determine, in the at least two regions of interest divided by the dividing module 510, the first interest having a trigger event Area
- the obtaining module 530 is configured to acquire decoded data of the first video picture displayed in the first region of interest determined by the first determining module 520;
- the playing module 540 is configured to render the decoded data of the first video frame acquired by the obtaining module 530 to a specified playing window for playing.
- the terminal that plays the video according to the embodiment of the present invention divides the original play picture into multiple regions of interest and separately displays the picture of the region of interest with the trigger event. On the one hand, this enables the user to observe the picture details of the region of interest more clearly; on the other hand, it also enables the user to simultaneously track the picture details of multiple regions of interest, thereby significantly improving the user experience.
- the terminal that plays the video may play a video file or a real-time video stream; the embodiment of the present invention is described by using the example of the terminal playing a real-time video stream, but the embodiment of the present invention is not limited thereto.
- the terminal 500 further includes: a second determining module 550, configured to determine a correspondence between each of the at least two regions of interest and a specified play window;
- the playing module 540 is further configured to: according to the correspondence determined by the second determining module 550, render the decoded data of the first video picture acquired by the obtaining module 530 to the specified play window corresponding to the first region of interest for playing.
- the dividing module 510 is further configured to: divide the original play picture into the at least two regions of interest in an equal-division manner or a free-division manner.
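The equal-division case can be sketched as a simple grid partition of the picture. This is an illustrative sketch only; `divide_equally` and its tuple layout are assumptions, and free (irregular) division would instead take user-drawn rectangles.

```python
def divide_equally(width, height, rows, cols):
    """Divide the original playback picture into rows x cols equal regions
    of interest, each as a (start_x, start_y, end_x, end_y) tuple."""
    regions = []
    for r in range(rows):
        for c in range(cols):
            regions.append((c * width // cols, r * height // rows,
                            (c + 1) * width // cols, (r + 1) * height // rows))
    return regions

# the 16-region split of FIG. 12A on an assumed 1920x1080 picture
rois = divide_equally(1920, 1080, 4, 4)
assert len(rois) == 16
```

Using `(c + 1) * width // cols` for the right edge (rather than adding a fixed cell width) keeps the grid gap-free even when the resolution is not divisible by the grid size.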
- the first determining module 520 includes: a first determining unit 521, configured to determine a trigger operation performed by the user on a region of interest in the original play picture, where the trigger operation includes: a click operation, a double-click operation, or a selection operation on the region of interest; and a second determining unit 522, configured to determine the region of interest having the trigger operation determined by the first determining unit 521 as the first region of interest.
- the first determining module 520 includes: a first acquiring unit 523, configured to acquire coordinate metadata of a trigger event occurrence point in the original playing screen;
- the third determining unit 524 is configured to determine, according to the coordinate metadata acquired by the first acquiring unit 523, the region of interest to which the trigger event occurrence point belongs as the first region of interest.
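Determining the region to which the trigger event's coordinate metadata belongs is a point-in-rectangle lookup. The sketch below is illustrative; `region_for_point` and the example region list are assumptions, not the patent's implementation.

```python
def region_for_point(x, y, regions):
    """Return the index of the region of interest containing the trigger
    event occurrence point (x, y), or None if the point lies outside all
    regions. Each region is (start_x, start_y, end_x, end_y)."""
    for i, (sx, sy, ex, ey) in enumerate(regions):
        if sx <= x < ex and sy <= y < ey:
            return i
    return None

# an assumed 2x2 equal split of a 1920x1080 picture
regions = [(0, 0, 960, 540), (960, 0, 1920, 540),
           (0, 540, 960, 1080), (960, 540, 1920, 1080)]
assert region_for_point(1000, 600, regions) == 3
```

For an equal grid the linear scan can be replaced by direct arithmetic (`col = x * cols // width`), but the scan also covers the free-division case with irregular regions.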
- the acquiring module 530 includes: a second acquiring unit 531, configured to acquire decoded data of the original playing screen;
- the third determining unit 532 is configured to determine, according to the decoded data of the original playing screen acquired by the second acquiring unit 531, the decoded data of the first video picture.
- the playing module 540 is further configured to: render the decoded data of the first video picture to the specified play window for enlarged playing, where the specified play window is larger than the first region of interest.
- the playing module 540 includes: a pop-up unit 541, configured to pop up an independent play window;
- a playing unit 542, configured to render the decoded data of the first video picture to the independent play window popped up by the pop-up unit 541 for playing.
- the terminal 500 for playing video may correspond to the device for playing video in the embodiment of the present invention, and the above and other operations and/or functions of the respective modules in the terminal 500 respectively implement the corresponding processes of the methods 100 to 400 in FIG. 1 to FIG. 12B; for brevity, details are not described herein again.
- the terminal that plays the video according to the embodiment of the present invention divides the original play picture into multiple regions of interest and separately displays the picture of the region of interest with the trigger event. On the one hand, this enables the user to observe the picture details of the region of interest more clearly; on the other hand, it also enables the user to simultaneously track the picture details of multiple regions of interest, thereby significantly improving the user experience.
- Figure 19 shows a schematic block diagram of a system 600 in accordance with an embodiment of the present invention. As shown in Figure 19, the system 600 includes:
- Terminal 610 according to an embodiment of the present invention.
- a video capture system 620 configured to collect a video image, and generate a media stream by encoding the video image
- a server 630, configured to obtain the media stream generated by the video capture system 620 and provide the media stream to the terminal 610;
- the storage device 640 is configured to store the media stream obtained by the server 630.
- the system 600 for playing video may include a terminal 610 corresponding to the terminal 500 for playing video in the embodiment of the present invention, and the above and other operations and/or functions of the respective modules in the terminal 610 respectively implement the corresponding processes of the methods 100 to 400 in FIG. 1 to FIG. 12B; for the sake of brevity, no further details are provided herein.
- the system for playing video divides the original play picture into multiple regions of interest and separately displays the picture of the region of interest with the trigger event. On the one hand, this enables the user to observe the picture details of the region of interest more clearly; on the other hand, it also enables the user to simultaneously track the picture details of multiple regions of interest, thereby significantly improving the user experience.
- the embodiment of the present invention further provides a terminal for playing video.
- the terminal 700 includes: a processor 710, a memory 720, and a bus system 730.
- the processor 710 and the memory 720 are connected by a bus system 730.
- the memory 720 is configured to store instructions
- the processor 710 is configured to execute the instructions stored in the memory 720, where the processor 710 is configured to: divide the original play picture into at least two regions of interest; determine, among the at least two regions of interest, a first region of interest having a trigger event; acquire decoded data of the first video picture displayed in the first region of interest; and render the decoded data of the first video picture to a specified play window for playing.
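The four processor steps (divide, determine, acquire, render) can be sketched end to end as follows. This is an illustrative composition under assumed names (`play_region_of_interest`, the grid parameters, the stubbed render step); it is not the patented implementation.

```python
def play_region_of_interest(frame, width, height, click_x, click_y,
                            rows=4, cols=4):
    """Illustrative end-to-end sketch of the four processor steps."""
    # 1. divide the original play picture into regions of interest
    regions = [(c * width // cols, r * height // rows,
                (c + 1) * width // cols, (r + 1) * height // rows)
               for r in range(rows) for c in range(cols)]
    # 2. determine the first region of interest from the trigger coordinates
    #    (raises StopIteration if the click falls outside the picture)
    sx, sy, ex, ey = next(rg for rg in regions
                          if rg[0] <= click_x < rg[2]
                          and rg[1] <= click_y < rg[3])
    # 3. acquire the decoded data of the first video picture (crop the plane)
    roi = [frame[y * width + x] for y in range(sy, ey) for x in range(sx, ex)]
    # 4. render to the specified play window (stubbed: return the data)
    return (sx, sy, ex, ey), roi

frame = list(range(8 * 8))     # stand-in for one decoded 8x8 plane
region, roi = play_region_of_interest(frame, 8, 8, click_x=5, click_y=1,
                                      rows=2, cols=2)
assert region == (4, 0, 8, 4)
```

Step 4 is where a real terminal would hand the cropped (and possibly scaled) data to the display pipeline; no extra decode or video stream is needed, matching the description above.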
- the terminal that plays the video according to the embodiment of the present invention divides the original play picture into multiple regions of interest and separately displays the picture of the region of interest with the trigger event. On the one hand, this enables the user to observe the picture details of the region of interest more clearly; on the other hand, it also enables the user to simultaneously track the picture details of multiple regions of interest, thereby significantly improving the user experience.
- the processor 710 may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
- the general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
- the memory 720 can include read only memory and random access memory and provides instructions and data to the processor 710. A portion of memory 720 may also include non-volatile random access memory. For example, the memory 720 can also store information of the device type.
- the bus system 730 can include, in addition to the data bus, a power bus, a control bus, and a status signal bus. However, for clarity of description, various buses are labeled as bus system 730 in the figure.
- each step of the above method may be completed by an integrated logic circuit of hardware in the processor 710 or an instruction in the form of software.
- the steps of the method disclosed in the embodiments of the present invention may be directly performed by a hardware processor, or performed by a combination of hardware and software modules in the processor.
- the software module may be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory or an electrically erasable programmable memory, or a register.
- the storage medium is located in the memory 720.
- the processor 710 reads the information in the memory 720 and combines the hardware to perform the steps of the above method. To avoid repetition, it will not be described in detail here.
- the processor 710 is further configured to: determine a correspondence between each of the at least two regions of interest and a specified play window; and the rendering, by the processor 710, of the decoded data of the first video picture to the specified play window for playing includes: according to the correspondence, rendering the decoded data of the first video picture to the specified play window corresponding to the first region of interest for playing.
- the dividing, by the processor 710, of the original play picture into the at least two regions of interest includes: dividing the original play picture into the at least two regions of interest in an equal-division manner or a free-division manner.
- the determining, by the processor 710, of the first region of interest having a trigger event among the at least two regions of interest includes: determining a trigger operation performed by the user on a region of interest in the original play picture, the trigger operation including a click operation, a double-click operation, or a selection operation on the region of interest; and determining the region of interest having the trigger operation as the first region of interest.
- the determining, by the processor 710, of the first region of interest having the trigger event among the at least two regions of interest includes: acquiring coordinate metadata of the trigger event occurrence point in the original play picture; and determining, according to the coordinate metadata, the region of interest to which the trigger event occurrence point belongs as the first region of interest.
- the acquiring, by the processor 710, of the decoded data of the first video picture displayed in the first region of interest includes: acquiring decoded data of the original play picture; and determining, according to the decoded data of the original play picture, the decoded data of the first video picture.
- the terminal 700 is further configured to: render the decoded data of the first video picture to the specified play window for enlarged playing, where the specified play window is larger than the first region of interest.
- the terminal 700 further includes a display 740, where the rendering, by the processor 710, of the decoded data of the first video picture to the specified play window for playing includes: popping up an independent play window; and rendering the decoded data of the first video picture to the independent play window for playing.
- the terminal 700 for playing video may correspond to the terminal 500 or the terminal 610 for playing video in the embodiment of the present invention, and the above and other operations and/or functions of the respective modules in the terminal 700 respectively implement the corresponding processes of the methods 100 to 400 in FIG. 1 to FIG. 12B; details are not described herein again.
- the terminal that plays the video according to the embodiment of the present invention divides the original play picture into multiple regions of interest and separately displays the picture of the region of interest with the trigger event. On the one hand, this enables the user to observe the picture details of the region of interest more clearly; on the other hand, it also enables the user to simultaneously track the picture details of multiple regions of interest, thereby significantly improving the user experience.
- the disclosed systems, devices, and methods may be implemented in other ways.
- the device embodiments described above are merely illustrative.
- the division of the units is only a logical function division; in actual implementation there may be other division manners. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
- the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, or an electrical, mechanical or other form of connection.
- the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the embodiments of the present invention.
- each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
- the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
- the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
- the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium.
- the software product includes a number of instructions to cause a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present invention.
- the foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Library & Information Science (AREA)
- Television Signal Processing For Recording (AREA)
- Studio Circuits (AREA)
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280005842.5A CN104081760B (zh) | 2012-12-25 | 2012-12-25 | 播放视频的方法、终端和系统 |
PCT/CN2012/087391 WO2014100966A1 (zh) | 2012-12-25 | 2012-12-25 | 播放视频的方法、终端和系统 |
CN201711339410.9A CN108401134A (zh) | 2012-12-25 | 2012-12-25 | 播放视频的方法、终端和系统 |
KR1020157017260A KR101718373B1 (ko) | 2012-12-25 | 2012-12-25 | 비디오 재생 방법, 단말 및 시스템 |
EP12878638.1A EP2768216A4 (en) | 2012-12-25 | 2012-12-25 | VIDEO GAME PROCEDURE, END USER AND SYSTEM |
JP2015549913A JP2016506167A (ja) | 2012-12-25 | 2012-12-25 | ビデオ再生方法、端末、およびシステム |
US14/108,180 US9064393B2 (en) | 2012-12-25 | 2013-12-16 | Video playback method, terminal, and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2012/087391 WO2014100966A1 (zh) | 2012-12-25 | 2012-12-25 | 播放视频的方法、终端和系统 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/108,180 Continuation US9064393B2 (en) | 2012-12-25 | 2013-12-16 | Video playback method, terminal, and system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014100966A1 true WO2014100966A1 (zh) | 2014-07-03 |
Family
ID=50974785
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2012/087391 WO2014100966A1 (zh) | 2012-12-25 | 2012-12-25 | 播放视频的方法、终端和系统 |
Country Status (6)
Country | Link |
---|---|
US (1) | US9064393B2 (zh) |
EP (1) | EP2768216A4 (zh) |
JP (1) | JP2016506167A (zh) |
KR (1) | KR101718373B1 (zh) |
CN (2) | CN108401134A (zh) |
WO (1) | WO2014100966A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113286196A (zh) * | 2021-05-14 | 2021-08-20 | 湖北亿咖通科技有限公司 | 一种车载视频播放系统及视频分屏显示方法及装置 |
Families Citing this family (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9449229B1 (en) | 2014-07-07 | 2016-09-20 | Google Inc. | Systems and methods for categorizing motion event candidates |
US10127783B2 (en) | 2014-07-07 | 2018-11-13 | Google Llc | Method and device for processing motion events |
US9082018B1 (en) | 2014-09-30 | 2015-07-14 | Google Inc. | Method and system for retroactively changing a display characteristic of event indicators on an event timeline |
US9501915B1 (en) | 2014-07-07 | 2016-11-22 | Google Inc. | Systems and methods for analyzing a video stream |
US9158974B1 (en) | 2014-07-07 | 2015-10-13 | Google Inc. | Method and system for motion vector-based video monitoring and event categorization |
US10140827B2 (en) | 2014-07-07 | 2018-11-27 | Google Llc | Method and system for processing motion event notifications |
USD782495S1 (en) | 2014-10-07 | 2017-03-28 | Google Inc. | Display screen or portion thereof with graphical user interface |
US9361011B1 (en) | 2015-06-14 | 2016-06-07 | Google Inc. | Methods and systems for presenting multiple live video feeds in a user interface |
CN105872816A (zh) * | 2015-12-18 | 2016-08-17 | 乐视网信息技术(北京)股份有限公司 | 一种放大视频图像的方法及装置 |
CN108139799B (zh) * | 2016-04-22 | 2022-01-14 | 深圳市大疆创新科技有限公司 | 基于用户的兴趣区(roi)处理图像数据的系统和方法 |
US10506237B1 (en) | 2016-05-27 | 2019-12-10 | Google Llc | Methods and devices for dynamic adaptation of encoding bitrate for video streaming |
US10957171B2 (en) | 2016-07-11 | 2021-03-23 | Google Llc | Methods and systems for providing event alerts |
US10380429B2 (en) | 2016-07-11 | 2019-08-13 | Google Llc | Methods and systems for person detection in a video feed |
US10192415B2 (en) | 2016-07-11 | 2019-01-29 | Google Llc | Methods and systems for providing intelligent alerts for events |
CN106802759A (zh) | 2016-12-21 | 2017-06-06 | 华为技术有限公司 | 视频播放的方法及终端设备 |
US11783010B2 (en) | 2017-05-30 | 2023-10-10 | Google Llc | Systems and methods of person recognition in video streams |
US10599950B2 (en) | 2017-05-30 | 2020-03-24 | Google Llc | Systems and methods for person recognition data management |
CN107580228B (zh) * | 2017-09-15 | 2020-12-22 | 威海元程信息科技有限公司 | 一种监控视频处理方法、装置及设备 |
US11134227B2 (en) | 2017-09-20 | 2021-09-28 | Google Llc | Systems and methods of presenting appropriate actions for responding to a visitor to a smart home environment |
US10664688B2 (en) | 2017-09-20 | 2020-05-26 | Google Llc | Systems and methods of detecting and responding to a visitor to a smart home environment |
TWI657697B (zh) | 2017-12-08 | 2019-04-21 | 財團法人工業技術研究院 | 搜尋視訊事件之方法、裝置、及電腦可讀取記錄媒體 |
US11012750B2 (en) * | 2018-11-14 | 2021-05-18 | Rohde & Schwarz Gmbh & Co. Kg | Method for configuring a multiviewer as well as multiviewer |
CN112866625A (zh) * | 2019-11-12 | 2021-05-28 | 杭州海康威视数字技术股份有限公司 | 一种监控视频的显示方法、装置、电子设备及存储介质 |
US11893795B2 (en) | 2019-12-09 | 2024-02-06 | Google Llc | Interacting with visitors of a connected home environment |
CN111258484A (zh) * | 2020-02-12 | 2020-06-09 | 北京奇艺世纪科技有限公司 | 一种视频播放方法、装置、电子设备及存储介质 |
CN111491203B (zh) * | 2020-03-16 | 2023-01-24 | 浙江大华技术股份有限公司 | 视频回放方法、装置、设备和计算机可读存储介质 |
CN111752655B (zh) * | 2020-05-13 | 2024-06-04 | 西安万像电子科技有限公司 | 数据处理系统及方法 |
CN111757162A (zh) * | 2020-06-19 | 2020-10-09 | 广州博冠智能科技有限公司 | 一种高清视频播放方法、装置、设备及存储介质 |
CN112235626B (zh) * | 2020-10-15 | 2023-06-13 | Oppo广东移动通信有限公司 | 视频渲染方法、装置、电子设备及存储介质 |
CN112422907B (zh) * | 2020-11-09 | 2023-10-13 | 西安万像电子科技有限公司 | 图像处理方法、装置及系统 |
US11800179B2 (en) | 2020-12-03 | 2023-10-24 | Alcacruz Inc. | Multiview video with one window based on another |
CN112911384A (zh) * | 2021-01-20 | 2021-06-04 | 三星电子(中国)研发中心 | 视频播放方法和视频播放装置 |
KR102500923B1 (ko) * | 2021-06-03 | 2023-02-17 | 주식회사 지미션 | 스트림 영상 재생 장치 및 스트림 영상 재생 시스템 |
CN114339371A (zh) * | 2021-12-30 | 2022-04-12 | 咪咕音乐有限公司 | 视频显示方法、装置、设备及存储介质 |
CN114666668B (zh) * | 2022-03-18 | 2024-03-15 | 上海艺赛旗软件股份有限公司 | 一种视频回放方法、系统、设备及存储介质 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1901642A (zh) * | 2005-07-20 | 2007-01-24 | 英业达股份有限公司 | 视频浏览系统及方法 |
CN101365117A (zh) * | 2008-09-18 | 2009-02-11 | 中兴通讯股份有限公司 | 一种自定义分屏模式的方法 |
CN101540858A (zh) * | 2008-03-18 | 2009-09-23 | 索尼株式会社 | 图像处理装置、方法以及记录介质 |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5724475A (en) * | 1995-05-18 | 1998-03-03 | Kirsten; Jeff P. | Compressed digital video reload and playback system |
JPH0918849A (ja) * | 1995-07-04 | 1997-01-17 | Matsushita Electric Ind Co Ltd | 撮影装置 |
US20110058036A1 (en) * | 2000-11-17 | 2011-03-10 | E-Watch, Inc. | Bandwidth management and control |
JP2004120341A (ja) * | 2002-09-26 | 2004-04-15 | Riosu Corp:Kk | 映像監視システム |
JP2006033380A (ja) * | 2004-07-15 | 2006-02-02 | Hitachi Kokusai Electric Inc | 監視システム |
JP2005094799A (ja) * | 2004-11-15 | 2005-04-07 | Chuo Electronics Co Ltd | 映像集約表示装置 |
US8228372B2 (en) * | 2006-01-06 | 2012-07-24 | Agile Sports Technologies, Inc. | Digital video editing system |
JP4714039B2 (ja) * | 2006-02-27 | 2011-06-29 | 株式会社東芝 | 映像再生装置及び映像再生方法 |
WO2008103929A2 (en) * | 2007-02-23 | 2008-08-28 | Johnson Controls Technology Company | Video processing systems and methods |
JP2009100259A (ja) * | 2007-10-17 | 2009-05-07 | Mitsubishi Electric Corp | 監視カメラおよび画像監視システム |
JP4921338B2 (ja) * | 2007-12-14 | 2012-04-25 | 株式会社日立製作所 | プラント監視制御システム |
KR101009881B1 (ko) * | 2008-07-30 | 2011-01-19 | 삼성전자주식회사 | 재생되는 영상의 타겟 영역을 확대 디스플레이하기 위한장치 및 방법 |
KR100883632B1 (ko) * | 2008-08-13 | 2009-02-12 | 주식회사 일리시스 | 고해상도 카메라를 이용한 지능형 영상 감시 시스템 및 그 방법 |
JP4715909B2 (ja) * | 2008-12-04 | 2011-07-06 | ソニー株式会社 | 画像処理装置及び方法、画像処理システム、並びに、画像処理プログラム |
CN101616281A (zh) * | 2009-06-26 | 2009-12-30 | 中兴通讯股份有限公司南京分公司 | 一种将手机电视播放画面局部放大的方法及移动终端 |
KR20110023634A (ko) * | 2009-08-31 | 2011-03-08 | (주)아이디스 | 썸네일 이미지 생성 장치 및 이를 이용한 썸네일 이미지 출력 방법 |
US20110316697A1 (en) * | 2010-06-29 | 2011-12-29 | General Electric Company | System and method for monitoring an entity within an area |
CN101951493A (zh) * | 2010-09-25 | 2011-01-19 | 中兴通讯股份有限公司 | 移动终端及其视频通话中对远端图像局部放大方法 |
-
2012
- 2012-12-25 WO PCT/CN2012/087391 patent/WO2014100966A1/zh active Application Filing
- 2012-12-25 EP EP12878638.1A patent/EP2768216A4/en not_active Ceased
- 2012-12-25 CN CN201711339410.9A patent/CN108401134A/zh active Pending
- 2012-12-25 CN CN201280005842.5A patent/CN104081760B/zh active Active
- 2012-12-25 JP JP2015549913A patent/JP2016506167A/ja active Pending
- 2012-12-25 KR KR1020157017260A patent/KR101718373B1/ko not_active Application Discontinuation
-
2013
- 2013-12-16 US US14/108,180 patent/US9064393B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1901642A (zh) * | 2005-07-20 | 2007-01-24 | 英业达股份有限公司 | 视频浏览系统及方法 |
CN101540858A (zh) * | 2008-03-18 | 2009-09-23 | 索尼株式会社 | 图像处理装置、方法以及记录介质 |
CN101365117A (zh) * | 2008-09-18 | 2009-02-11 | 中兴通讯股份有限公司 | 一种自定义分屏模式的方法 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113286196A (zh) * | 2021-05-14 | 2021-08-20 | 湖北亿咖通科技有限公司 | 一种车载视频播放系统及视频分屏显示方法及装置 |
CN113286196B (zh) * | 2021-05-14 | 2023-02-17 | 亿咖通(湖北)技术有限公司 | 一种车载视频播放系统及视频分屏显示方法及装置 |
Also Published As
Publication number | Publication date |
---|---|
US9064393B2 (en) | 2015-06-23 |
EP2768216A1 (en) | 2014-08-20 |
CN104081760B (zh) | 2018-01-09 |
CN104081760A (zh) | 2014-10-01 |
KR101718373B1 (ko) | 2017-03-21 |
EP2768216A4 (en) | 2015-10-28 |
US20140178033A1 (en) | 2014-06-26 |
KR20150090223A (ko) | 2015-08-05 |
CN108401134A (zh) | 2018-08-14 |
JP2016506167A (ja) | 2016-02-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014100966A1 (zh) | 播放视频的方法、终端和系统 | |
JP7126539B2 (ja) | ビデオ再生方法、端末、およびシステム | |
WO2017107441A1 (zh) | 截取视频动画的方法及装置 | |
EP3526964B1 (en) | Masking in video stream | |
US7839434B2 (en) | Video communication systems and methods | |
US9275604B2 (en) | Constant speed display method of mobile device | |
US9196306B2 (en) | Smart scaling and cropping | |
WO2021031850A1 (zh) | 图像处理的方法、装置、电子设备及存储介质 | |
JP2018519727A (ja) | ビデオ画像の上に情報を表示するための方法及びデバイス | |
JP2019110545A (ja) | ビデオ再生方法、端末、およびシステム | |
WO2020062684A1 (zh) | 视频处理方法、装置、终端和介质 | |
US20190050426A1 (en) | Automatic grouping based handling of similar photos | |
WO2022121731A1 (zh) | 图像拍摄方法、装置、电子设备和可读存储介质 | |
KR101652856B1 (ko) | Cctv에서 관제 이벤트에 기초한 사용자 인터페이스 화면 제공 장치 | |
US10692532B2 (en) | Systems and methods for video synopses | |
CN112948627B (zh) | 一种报警视频生成方法、显示方法和装置 | |
JP2023536365A (ja) | ビデオ処理方法及び装置 | |
JP2004179881A5 (zh) | ||
US20210051276A1 (en) | Method and apparatus for providing video in portable terminal | |
CN111835955B (zh) | 一种数据获取方法及装置 | |
CN108882004B (zh) | 视频录制方法、装置、设备及存储介质 | |
TWI564822B (zh) | 可預先篩選之視訊檔案回放系統及其方法與電腦程式產品 | |
WO2022061723A1 (zh) | 一种图像处理方法、设备、终端及存储介质 | |
CN113596582A (zh) | 一种视频预览方法、装置及电子设备 | |
US9413960B2 (en) | Method and apparatus for capturing video images including a start frame |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2012878638 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12878638 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015549913 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20157017260 Country of ref document: KR Kind code of ref document: A |