WO2014068629A1 - Video processing device, video transmission device, video reception device, video processing method, video transmission method, and video reception method
Video processing device, video transmission device, video reception device, video processing method, video transmission method, and video reception method
- Publication number
- WO2014068629A1 (PCT/JP2012/077824)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- reference signal
- information
- processing
- video information
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/44029—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display for generating different versions
Definitions
- the technical field relates to an apparatus for transmitting video.
- Patent Document 1 aims at “presenting a data transmission system and communication device that can effectively eliminate time mismatch” (see Patent Document 1 [0006]).
- Patent Document 1 discloses a data transmission system that transmits and receives data between a plurality of communication devices via a network, in which the transmitting side adds absolute time information to the data to be transmitted and outputs it to the network.
- The receiving side that receives data from the transmitting apparatus outputs each received data item after a predetermined delay determined according to that item's transmission delay time.
- This predetermined delay is added to the transmission delay of each data item sent from one or more transmitting sides so that the resulting delay times become substantially equal (see Patent Document 1, [0009]).
- However, Patent Document 1 has a problem in that the processing for simultaneously displaying video received from a plurality of video transmitting devices is complicated on the video receiving side.
- the present application includes a plurality of means for solving the above-described problems.
- For example, the present application includes an interface that transmits and receives information via a network and a video processing unit that processes video information received by the interface.
- When a plurality of pieces of video information are received by the interface, the video processing unit processes them based on a delay time that includes the transmission delay times of the pieces of video information.
- FIG. 1 shows an example of a video transmission system including a camera which is a video transmission device.
- 1a, 1b, and 1c are cameras.
- Reference numeral 2 denotes a LAN (Local Area Network)
- 3 denotes a reference time server
- The cameras 1a, 1b, and 1c and the reference time server 3 are connected to the LAN 2.
- Reference numerals 4a, 4b, and 4c denote cameras and 6 denotes a reference time server, and these are connected to the LAN 5.
- 8 is a center server
- 9 is a reference time server
- 10 is a controller
- 11 is a reference time server
- 12 is a display.
- The LAN 2, the LAN 5, the center server 8, and the controller 10 are connected via a WAN (Wide Area Network) 7.
- the system shown in FIG. 1 is a system that collects and monitors a plurality of images from a plurality of cameras installed in a remote place or a camera connected to a LAN installed in a moving body such as a car.
- the recorded images can be displayed side by side on the display 12.
- As the protocol used for data transfer between devices, for example, the data link protocol defined in the IEEE 802.3 standard may be used, with IP (Internet Protocol) as the network protocol and TCP (Transmission Control Protocol) or UDP (User Datagram Protocol) as the transport protocol.
- RTP (Real-time Transport Protocol) or HTTP (Hyper Text Transfer Protocol) may also be used as higher-layer protocols.
- The LAN 2 and the LAN 5 are connected to the WAN 7 via gateway devices (not shown), and the number of cameras is not limited to three; two or fewer, or four or more, cameras may be connected.
- The center server 8 receives the video and audio data distributed from each camera, performs signal processing such as composition of the plurality of camera videos, image recognition, and voice recognition, and transmits the result to the controller 10.
- The controller 10 receives the data transmitted from the center server 8 and the video and audio data distributed from each camera, and outputs video and audio to the display 12 and a speaker (not shown).
- the reference time servers 3, 6, 9, and 11 acquire accurate time using, for example, a GPS (Global Positioning System) or a radio clock, and provide the time information to other devices.
- the reference time servers 3 and 6 are connected to LAN2 and LAN5, respectively, and provide time information via the network.
- the reference time servers 9 and 11 supply time information by directly connecting to the center server 8 and the controller 10, for example. Similar to the reference time servers 3 and 6, time information may be supplied via a LAN.
- Because each of the reference time servers 3, 6, 9, and 11 acquires an accurate time, the time information provided to the LAN 2, the LAN 5, the center server 8, and the controller 10 can be synchronized.
- FIG. 2 is a diagram illustrating an example of an internal block configuration of the cameras 1a, 1b, 1c, 4a, 4b, and 4c that are video transmission apparatuses.
- Reference numeral 100 denotes a lens, 101 an image sensor, 102 a video compression circuit, 103 a video buffer, 104 a system encoder, 105 a packet buffer, 106 a reference signal generation circuit, 107 a LAN interface circuit, 108 a control circuit, and 109 a memory.
- The video signal obtained by the image sensor 101 through the lens 100 is input to the video compression circuit 102, where its color tone and contrast are corrected, and is stored in the video buffer 103.
- The video compression circuit 102 reads out the video data stored in the video buffer 103 and generates compressed encoded data conforming to, for example, the ISO/IEC 13818-2 (commonly known as MPEG-2 Video) MP@ML (Main Profile @ Main Level) standard as the video compression encoding system.
- As other video compression encoding methods, the H.264/AVC standard or the JPEG standard may be used.
- the generated compressed encoded video data is input to the system encoder 104.
- The reference signal generation circuit 106 supplies, for example, a frame pulse indicating a video signal frame boundary to the image sensor 101 and the video compression circuit 102 as a reference signal that serves as the reference for their processing timing. In accordance with this reference signal, image capture by the image sensor, compression of the captured video, and transmission of the compressed video (described later) are performed.
- The system encoder 104 and the video compression circuit 102 are described here as hardware, but these functions can also be realized in software by having the control circuit 108 load a program implementing each function into a memory (not shown) and execute it. In the following, for simplicity, the system encoder 104 and the video compression circuit 102 are described as performing these processes, including the case where the control circuit 108 executes the corresponding program.
- the video compression circuit 102, the video buffer 103, the system encoder 104, the packet buffer 105, the reference signal generation circuit 106, and the control circuit 108 are collectively referred to simply as a video processing unit.
- A mechanism for synchronizing time between the cameras connected to one LAN is provided, and this reference signal is a signal synchronized between the cameras connected to that LAN.
- As a method of synchronizing the time, for example, a method based on PTP (Precision Time Protocol), the Precision Clock Synchronization Protocol for Networked Measurement and Control Systems defined in IEEE 1588, can be used.
- Here, the reference time server serves as the server for time synchronization, and each camera serves as a client that aligns its time with the server side.
- FIG. 3 shows an example of the exchange of message packets performed between the server side and the client side for time synchronization.
- For time synchronization, the server side first transmits a packet for obtaining synchronization information to the client side at time T1.
- This packet is called a Sync packet, and the LAN interface circuit 107 in FIG. 2 that receives it passes the packet information to the reference signal generation circuit 106.
- the reference signal generation circuit 106 acquires the packet transmission time (T1) on the server side described in the Sync packet and the arrival time (T2) of this packet measured by the reference signal generation circuit 106. Next, the reference signal generation circuit 106 generates a packet (DelayReq) to be transmitted from the client to the server and sends it to the LAN interface circuit 107. At this time, the time (T3) at which this packet is transmitted is stored in the DelayReq packet and transmitted to the server side.
- the server reads the timing (T4) when the DelayReq packet arrives, describes this in the DelayResp packet, and transmits it to the client side.
- the DelayResp packet that has arrived at the client side is transmitted from the LAN interface circuit 107 to the reference signal generation circuit 106.
- the reference signal generation circuit 106 obtains time information of T1, T2, T3, and T4.
- The network delay Tnet can be obtained as Tnet = ((T2 - T1) + (T4 - T3)) / 2, and the offset of the client time relative to the server time as Toffset = (T2 - T1) - Tnet.
- The reference signal generation circuit 106 calculates Toffset by the above calculation each time the T1, T2, T3, and T4 information is obtained. The transmission and reception of the Sync, DelayReq, and DelayResp packets are repeated a plurality of times, Toffset is calculated repeatedly, and the operation clock supply source of the reference signal generation circuit 106 is controlled in the direction that brings Toffset toward 0. When the operation clock supply source is, for example, a VCXO (Voltage-Controlled Crystal Oscillator), this control is performed by lowering the control voltage when Toffset is positive (the client clock is running ahead) and raising it when Toffset is negative (the client clock is running behind). With this control, the client time comes to coincide with the server time.
- the client side can obtain the time synchronized with the server side.
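- As an illustration of the offset calculation described above, the following sketch (in Python, with hypothetical helper names not found in the patent) computes Tnet and Toffset from the four timestamps of one Sync/DelayReq/DelayResp exchange and derives the direction in which a VCXO control voltage would be adjusted; the proportional control gain is an assumption of this sketch.

```python
def ptp_offset(t1, t2, t3, t4):
    """Estimate one-way network delay (Tnet) and client clock offset (Toffset)
    from the four timestamps of one Sync / DelayReq / DelayResp exchange."""
    tnet = ((t2 - t1) + (t4 - t3)) / 2.0   # estimated one-way delay
    toffset = (t2 - t1) - tnet             # client time minus server time
    return tnet, toffset


def vcxo_correction(toffset, gain=0.1):
    """Return a control-voltage adjustment: negative (lower the voltage) when
    the client clock runs ahead, positive (raise it) when it runs behind."""
    return -gain * toffset


# Example: the client clock is 2 ms ahead and the one-way delay is 5 ms
tnet, toffset = ptp_offset(t1=0.000, t2=0.007, t3=0.010, t4=0.013)
print(tnet, toffset)             # 0.005 0.002
print(vcxo_correction(toffset))  # negative value -> lower the control voltage
```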
- The period of the reference signal can thus be matched among the plurality of cameras.
- The phase of the reference signal is aligned, for example, by having the center server 8 determine the phase information and transmit it to each camera via the network.
- The phase information can be expressed, for example, as the synchronized time of a rising edge of the reference signal, whose frequency is, for example, 30 Hz.
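- As a sketch of how a camera could use this phase information (hypothetical function and a 30 Hz frame rate assumed for illustration), the next rising edge of the local reference signal can be scheduled so that it stays aligned with the synchronized time of the edge announced by the center server 8:

```python
FRAME_PERIOD = 1.0 / 30.0  # assumed 30 Hz frame pulse


def next_reference_edge(now, phase_time, period=FRAME_PERIOD):
    """Return the time of the next reference-signal rising edge, given the
    synchronized time 'phase_time' of one announced (past) edge."""
    periods_elapsed = int((now - phase_time) // period) + 1
    return phase_time + periods_elapsed * period


# Example: an edge was announced at synchronized time 100.0 s
print(next_reference_edge(now=123.4567, phase_time=100.0))  # ~123.4667
```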
- the compressed encoded video data compressed by the video compression circuit 102 and input to the system encoder 104 is packetized as shown below.
- FIG. 4 shows an example of digital compression processing.
- 201 is an intra frame and 202 is an inter frame.
- the digital compressed video signal has a predetermined number of frames, for example, 15 frames as one sequence, the head of which is an intra frame, and the remaining frames are inter frames compressed using prediction from the intra frame.
- An intra frame may also be placed at a position other than the head.
- only the first frame may be an intra frame, and all subsequent frames may be inter frames, or all the frames may be intra frames.
- FIG. 5 shows an example of the configuration of the digital compressed video signal.
- 302 is a picture header added in units of frames
- 301 is a sequence header added in units of sequences.
- the sequence header 301 includes information such as a synchronization signal and a transmission rate.
- The picture header 302 includes a synchronization signal and identification information such as whether the frame is an intra frame or an inter frame. Usually, the length of each data item varies with the amount of information.
- This digital video compression signal is divided into transport packets to be described later to form a packet sequence.
- FIG. 6 is a configuration example of a transport packet of a digital video compression signal.
- Reference numeral 40 denotes a transport packet; one packet has a fixed length of, for example, 188 bytes and consists of a packet header 401 and packet information 402.
- the digital compressed video signal described with reference to FIG. 5 is divided and arranged in an area of packet information 402, and the packet header 401 includes information such as the type of packet information.
- the digital video compression signal packetized by the system encoder 104 is temporarily stored in the packet buffer 105, and the packet string read from the packet buffer 105 is input to the LAN interface circuit 107.
- the LAN interface circuit 107 in FIG. 2 packetizes the input packet sequence into, for example, a LAN packet conforming to the IEEE 802.3 standard and outputs the packet.
- FIG. 7 is a diagram illustrating an example of converting the packet sequence generated by the system encoder 104 into a LAN packet.
- One LAN packet 60 has a variable length of up to, for example, 1518 bytes and consists of a LAN packet header 601 and LAN packet information 602.
- The transport packets 40 generated by the system encoder 104 are stored in the LAN packet information 602 area together with a data error detection code in accordance with the network protocol described above; a LAN packet header 601 containing address information for identifying each camera on the LAN is added, and the result is output to the LAN as a LAN packet 60.
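- The packetization described above can be pictured with the following simplified sketch: the compressed stream is cut into fixed-length 188-byte transport packets, each carrying a small header in front of its payload. The 4-byte header with sync byte 0x47 is a simplification for illustration; real MPEG-2 TS headers carry further fields such as the PID.

```python
TS_PACKET_SIZE = 188
TS_HEADER_SIZE = 4  # simplified header size for this sketch


def packetize(stream: bytes):
    """Split a compressed video stream into 188-byte transport packets:
    a simplified 4-byte header followed by payload, padding the last packet."""
    payload_size = TS_PACKET_SIZE - TS_HEADER_SIZE
    packets = []
    for index, offset in enumerate(range(0, len(stream), payload_size)):
        payload = stream[offset:offset + payload_size].ljust(payload_size, b"\xff")
        header = bytes([0x47, 0x00, 0x00, index % 16])  # sync byte + placeholder fields
        packets.append(header + payload)
    return packets


packets = packetize(b"\x00" * 1000)
assert all(len(p) == TS_PACKET_SIZE for p in packets)  # 6 packets of 188 bytes
```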
- The LAN interface circuit 107 also exchanges control information with devices connected to the LAN. This is done by storing information such as instructions from the control circuit 108 in the LAN packet information 602 of LAN packets 60 to be transmitted, and by extracting information from the LAN packet information 602 of LAN packets 60 received from the LAN and passing it to the control circuit 108. The acquisition of the phase information of the reference signal is also performed here.
- FIG. 8 is a diagram illustrating an example of an internal block configuration of the center server 8.
- 801 is a LAN interface circuit
- 8021 to 8023 are system decoders
- 8031 to 8033 are video decompression circuits
- 804 is an image processing circuit
- 806 is a reference signal generation circuit
- 807 is a control circuit
- 808 is a video compression circuit
- 809 is a video buffer
- Reference numeral 810 denotes a system encoder
- 811 denotes a packet buffer.
- The system decoders 8021 to 8023, the video decompression circuits 8031 to 8033, the image processing circuit 804, the video compression circuit 808, and the system encoder 810 are described here as hardware, but these functions can also be realized in software by having the control circuit 807 load a program implementing each function into a memory (not shown) and execute it. In the following, these blocks are described as performing the processes, including the case where the control circuit 807 executes the corresponding program.
- The system decoders 8021 to 8023, the video decompression circuits 8031 to 8033, the image processing circuit 804, the reference signal generation circuit 806, the control circuit 807, the video compression circuit 808, the video buffer 809, the packet buffer 811, and the system encoder 810 are collectively referred to simply as a video processing unit.
- the LAN packet 60 generated by each camera is input to the LAN interface circuit 801.
- the LAN packet header 601 is removed in the LAN interface circuit 801, and the transport packet 40 is extracted from the LAN packet data 602 according to the network protocol described above.
- the transport packet 40 is input to the system decoder 8021, and the packet information 402 described above is extracted from the transport packet 40 and combined into the digital compressed video signal shown in FIG.
- the digital compressed video signal is subjected to expansion processing in a video expansion circuit 8031 and is input to the image processing circuit 804 as a digital video signal.
- the same processing is performed on the LAN packet 60 input from another camera.
- digital video signals are input to the image processing circuit 804 from the video expansion circuits 8032 and 8033.
- The image processing circuit 804 performs image processing on the video signals from the cameras, such as image quality correction, distortion correction, viewpoint conversion by coordinate replacement, and composition processing, as well as recognition of object shapes, distance measurement, and motion detection.
- the results obtained by performing such image processing are superimposed on the video signal as characters or graphics and output to the video compression circuit 808.
- The video compression circuit 808 and the video buffer 809 generate a digital compressed video signal, and the system encoder 810 and the packet buffer 811 generate transport packets, which are input to the LAN interface circuit 801.
- the input packet sequence is packetized into, for example, a LAN packet conforming to the IEEE 802.3 standard and output.
- the reference signal generation circuit 806 supplies a frame pulse indicating a video signal frame break to the image processing circuit 804 as a reference signal serving as a reference for processing timing of the image processing circuit 804.
- the reference signal is adjusted by the control circuit 807 controlling the reference signal generation circuit 806.
- To exchange control information with each camera, information such as instructions from the control circuit 807 is stored in the LAN packet information 602 and transmitted to each camera, and information is extracted from the LAN packet information 602 of LAN packets 60 received from each camera and passed to the control circuit 807.
- FIG. 9 is a diagram illustrating an example of an internal block configuration of the controller 10.
- Reference numeral 901 denotes a LAN interface circuit which distributes the transport packet 40 extracted from the LAN packet information 602 of the LAN packet 60 to the system decoders 9021 to 9023 according to the network protocol described above.
- the processing of the system decoders 9021 to 9023 and the video decompression circuits 9031 to 9033 is the same as the description of the system decoders 8021 to 8023 and the video decompression circuits 8031 to 8033 in FIG.
- The LAN interface circuit 901 also exchanges control information with each camera and the center server 8: information such as instructions from the control circuit 907 is stored in the LAN packet information 602 of LAN packets 60 transmitted to them, and information is extracted from the LAN packet information 602 of LAN packets 60 received from them and passed to the control circuit 907.
- Digital video signals from the video expansion circuits 9031 to 9033 are input to the image processing circuit 904.
- the image processing circuit 904 displays a plurality of video signals from each camera side by side or displays a video signal from the center server 8 so that an image processing result in the center server 8 can be observed.
- the video signal from each camera and the video signal from the center server 8 may be displayed side by side.
- The OSD circuit 905 superimposes characters and figures on the video signal from the image processing circuit 904 and outputs the result to the display 12.
- The system decoders 9021 to 9023, the video decompression circuits 9031 to 9033, and the image processing circuit 904 are described here as hardware, but these functions can also be realized in software by having the control circuit 907 load a program implementing each function into a memory (not shown) and execute it. In the following, these blocks are described as performing the processes, including the case where the control circuit 907 executes the corresponding program.
- FIG. 10 is a flowchart of the operation process of each camera in this embodiment.
- the operation clock is synchronized with the reference time server to synchronize the time (step S101).
- the phase information of the reference signal that is the frame pulse is extracted by the LAN interface circuit 107 (step S102), and the reference signal is generated based on the phase information (step S103).
- the captured video signal is compressed (step S104) and transmitted (step S105).
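- The camera-side flow of FIG. 10 can be summarized by the following sketch; it uses injected callables because the actual interfaces are not specified in the text, and all names here are illustrative only.

```python
import time


def camera_loop(sync_clock, get_phase_time, capture, compress, transmit,
                frame_period=1.0 / 30.0, frames=3):
    """Outline of FIG. 10: synchronize the clock (S101), obtain the reference
    signal phase (S102/S103), then capture, compress and transmit one frame
    per reference-signal pulse (S104/S105)."""
    sync_clock()                        # step S101
    next_edge = get_phase_time()        # steps S102-S103
    for _ in range(frames):
        while time.time() < next_edge:  # wait for the next frame pulse
            time.sleep(0.001)
        transmit(compress(capture()))   # steps S104-S105
        next_edge += frame_period


# Dry run with stand-in callables
camera_loop(lambda: None, time.time, lambda: b"frame", lambda f: f, print)
```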
- FIG. 11 is a flowchart of the operation process of the center server 8 in this embodiment.
- the operation clock is synchronized with the reference time server to synchronize the time (step S111).
- a reference signal is generated based on the operation clock (step S112).
- Reference signal phase information, which expresses the generation time of the reference signal in terms of the reference time, is transmitted to each camera and to the controller 10 (step S113). Then, the video signal from each camera is received (step S114) and decoded (step S115), a second reference signal described later is generated (step S116), and the image processing described above is performed (step S117). The resulting video signal is compressed (step S118) and transmitted to the controller 10 (step S119).
- FIG. 12 is a flowchart of the operation process of the controller 10 in this embodiment.
- The operation clock is synchronized with the reference time server (step S121), the phase information of the reference signal is extracted by the LAN interface circuit 901 (step S122), and the reference signal is generated based on the phase information (step S123).
- The video signals from each camera and the data from the center server 8 are received (step S124) and decoded (step S125), image processing by the image processing circuit 904 described above is performed (step S126), and the result is displayed (step S127).
- FIG. 13 is a diagram illustrating an example of the transmission processing timing of each camera and the reception processing timing of the center server 8 in the present embodiment.
- (1-1) to (1-4) are processing timings of the camera 1a, (2-1) to (2-4) are processing timings of the camera 4b, and (3-1) to (3-11) are processing timings of the center server 8.
- (1-1) is the reference signal a
- (1-2) is the imaging timing a when imaging processing is performed by the imaging device 101
- (1-3) is the video compression timing a at which video compression processing is performed by the video compression circuit 102, and (1-4) is the transmission timing a at which the LAN interface circuit 107 performs transmission processing.
- a video signal for one frame is processed for each reference signal.
- the camera 1a uses the reference signal a as a processing reference, starts imaging processing, for example, at the timing of the pulse of the reference signal a, and then sequentially performs video compression processing and transmission processing.
- a time d1 from the reference signal a to the start of transmission processing at the transmission timing a is a processing delay time.
- (2-1) is a reference signal b of the camera 4b
- (2-2) is an imaging timing b at which an imaging process is performed by the imaging device 101 of the camera 4b
- (2-3) is the video compression timing b at which video compression processing is performed by the video compression circuit 102 of the camera 4b.
- (2-4) is a transmission timing b at which the LAN interface circuit 107 of the camera 4b performs a transmission process.
- the camera 4b starts the imaging process at the timing of the reference signal b using the reference signal b as a processing reference, and then sequentially performs the video compression process and the transmission process.
- a time d2 from the reference signal b to the transmission timing b is a processing delay time.
- the reference signal a of the camera 1a and the reference signal b of the camera 4b are synchronized.
- (3-1) is the reference signal S of the center server 8
- (3-2) is the reception timing a when the center server 8 receives the LAN packet from the camera 1a
- (3-3) is the video expansion timing a at which video expansion processing is performed by the video expansion circuit 8031, and (3-4) is the video output timing a of one frame of video from the camera 1a obtained by expansion in the video expansion circuit 8031.
- The time d3 is the delay from the reference signal a until the video is output, and is the sum of the delay time d1 described above, the transmission delay of the packets passing through the WAN, and the reception and expansion processing delays at the center server 8.
- (3-5) is the reception timing b when the center server 8 is receiving the LAN packet from the camera 4b
- (3-6) is the video expansion timing b at which video expansion processing is performed by the video expansion circuit 8032, and (3-7) is the video output timing b of one frame of video from the camera 4b obtained by the video expansion circuit 8032.
- The time d4 is the delay from the reference signal b until the video is output, and is the sum of the delay time d2 described above, the transmission delay of the packets passing through the WAN, and the reception and expansion processing delays at the center server 8.
- (3-8) is another reference signal S2 in the center server 8
- (3-9) is an image processing timing S at which the center server 8 performs image processing by the image processing circuit 804
- (3-10) is the compression timing S at which the center server 8 generates a digital compressed video signal
- (3-11) is a transmission timing S for transmitting a packetized digital video compression signal by the center server 8.
- the center server 8 uses the reception timing a from the camera 1a as a reference for processing, and sequentially performs video expansion processing following the reception processing. Similarly, the video expansion process is performed following the reception process from the camera 4b.
- the transmission timing a of the camera 1a and the transmission timing b of the camera 4b may differ depending on, for example, a difference in video compression processing method.
- the reception timing a of the video from the camera 1a and the reception timing b of the video from the camera 4b may be different due to a difference in the WAN route. Therefore, the time d3 and the time d4 are different.
- The second reference signal S2 is generated with reference to the video output timing a, which is the later of the video output timings obtained after the received videos are expanded.
- The generation of the reference signal S2 can be realized by generating a signal delayed, with respect to the reference signal S, by Tdelay1, the delay time of the later video output timing a.
- (3-8) to (3-11) are processing timings of the center server 8, which are the same as (3-8) to (3-11) in FIG.
- a time d5 from the reference signal S2 to the start of transmission processing at the transmission timing S is a processing delay time.
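- A minimal sketch of the second reference signal generation (illustrative figures only, not taken from the patent): Tdelay1 is taken as the largest delay, relative to the reference signal S, among the video output timings of the received streams, and the edges of S2 are simply the edges of S shifted by that amount.

```python
def second_reference_offset(output_delays):
    """Tdelay1: the delay (relative to reference signal S) of the stream whose
    decompressed video is output last."""
    return max(output_delays)


# Example: camera 1a's output lags S by 45 ms, camera 4b's by 38 ms
tdelay1 = second_reference_offset([0.045, 0.038])
s2_edge = 0.0 + tdelay1  # an S edge at t = 0 maps to an S2 edge at t = Tdelay1
print(tdelay1, s2_edge)  # 0.045 0.045
```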
- (4-1) to (4-4) indicate processing timing of the controller 10.
- (4-1) is a reference signal C of the controller 10
- (4-2) is a reception timing C at which the controller 10 is receiving a LAN packet from the center server 8
- (4-3) is the video expansion timing C at which video expansion processing is performed by the video expansion circuit 9031, and (4-4) is the video output timing C of one frame obtained by the video expansion circuit 9031.
- The generation of the reference signal C can be realized by generating a signal delayed by Tdelay2, the delay time of the video output timing C, with respect to the reference signal S2, in other words, a signal delayed by Tdelay1 + Tdelay2 with respect to the reference signal a.
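- In other words, the reference signals form a cascade: S2 lags the camera reference signal a by Tdelay1, and the controller reference signal C lags S2 by Tdelay2, so C lags a by Tdelay1 + Tdelay2. With the illustrative figures used in the sketch above:

```python
tdelay1 = 0.045  # latest camera video output, measured from an edge of reference signal a
tdelay2 = 0.060  # center server output as decoded at the controller, measured from S2
c_offset = tdelay1 + tdelay2
print(c_offset)  # 0.105 s: how far reference signal C lags reference signal a
```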
- In this way, the time from imaging by each camera to video transmission by the center server can be kept to the shortest delay realizable between the connected devices. Further, when the center server 8 performs video processing on a plurality of cameras, video captured at the same shooting time can be processed, which improves image processing accuracy.
- An example has been shown in which the controller 10 receives a video signal from the center server 8, but it may also receive and display video signals from each camera together with the video received from the center server 8. Moreover, although the transmission and reception of video signals has been described, audio signals can be transmitted in the same way.
- a series of processing from image capturing by each camera to video processing at the center server 8 and video output at the controller 10 can be performed with the shortest possible delay time.
- video processing at the center server 8 can be performed between videos at the same shooting time, and high-precision image processing can be realized.
- In the first embodiment, the example in which the camera reference signal a and the reference signal b are synchronized in both period and phase has been described.
- However, the phases are not necessarily matched.
- In the present embodiment, assuming such a case, the situation in which the periods of the reference signal a and the reference signal b match but their phases do not match will be described.
- each camera, the center server 8, and the controller 10 have a mechanism for synchronizing time.
- By using this mechanism to synchronize the time periodically between the systems and adjusting the oscillation period of the reference signal in each system based on that time, the period of the reference signal can be matched between the systems.
- FIG. 15 is a diagram showing an example of processing timing in this embodiment. Since each camera adjusts the oscillation period of the reference signal based on the respective reference time, the time of each camera coincides, and the periods of the reference signal a and the reference signal b coincide. However, the phases of each other do not necessarily match.
- The imaging time is stored as imaging time information when the video signal is compressed, for example, in the packet header of the transport packet described above.
- the center server 8 receives the video signal from each camera and expands it to obtain a digital video signal as in the first embodiment.
- the center server 8 performs processing based on the reference signal S.
- The phase relationship between the reference signal of each camera and the reference signal S of the center server 8 is obtained by comparison with the imaging time information stored in the video signal transmitted from each camera; it can thus be seen that the time phase difference between the center server 8 and the camera 1a is d6 and that between the center server 8 and the camera 4b is d7. Therefore, the image processing circuit 804 generates the video a-1′ shown in (3-4′), which is delayed by the time phase difference d6 from the video a-1 shown in (3-4).
- the generation can be realized, for example, by obtaining the video a-1 ′ from the video a-0 and the video a-1 by performing video motion prediction and interpolation processing.
- Similarly, the video b-1′ at the time phase difference d7 after the time T1 is obtained from the videos b-0 and b-1.
- By performing the subsequent processing on the obtained videos a-1′ and b-1′ as videos captured simultaneously at time T3, the image processing accuracy can be maintained.
- Note that a-1′ and b-1′ may also be generated from only one frame.
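- The text describes generating a-1′ by motion prediction and interpolation; as a simplified stand-in for that (not the method claimed here), a plain linear blend of two consecutive frames weighted by the phase difference can illustrate the idea:

```python
import numpy as np


def interpolate_frame(frame_prev, frame_next, phase_diff, frame_period=1.0 / 30.0):
    """Crude temporal interpolation: blend two consecutive frames according to
    the phase difference between camera and center-server reference signals."""
    w = float(np.clip(phase_diff / frame_period, 0.0, 1.0))
    return ((1.0 - w) * frame_prev + w * frame_next).astype(frame_prev.dtype)


# Example: shift camera 1a's video by a phase difference d6 of 10 ms
a0 = np.zeros((480, 640), dtype=np.uint8)
a1 = np.full((480, 640), 90, dtype=np.uint8)
a1_prime = interpolate_frame(a0, a1, phase_diff=0.010)
print(a1_prime[0, 0])  # 27, i.e. 90 * (0.010 / (1/30))
```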
- When the image processing at the center server is expected to increase the processing load and the processing delay, it is also possible to maintain processing performance by dynamically increasing the CPU (Central Processing Unit) resources allocated to the image processing.
- According to the second embodiment, even when the phases of the reference signals are not matched, the series of processes from image capture by each camera to video processing at the center server 8 and video output at the controller 10 can be performed with the shortest possible delay time.
- video processing at the center server 8 can be performed between videos at almost the same shooting time, and image processing can be realized while maintaining high accuracy.
- As described above, the time from imaging to video output can be processed with the shortest delay realizable between the connected devices, and a system can be constructed that processes video captured at the same shooting time and improves image processing accuracy.
- For example, temperature information and humidity information detected by a plurality of temperature sensors and a plurality of humidity sensors can likewise be temporally synchronized and displayed on one display.
- 801 ... LAN interface circuit, 8021, 8022, 8023 ... system decoder, 8031, 8032, 8033 ... video expansion circuit, 804 ... image processing circuit, 806 ... reference signal generation circuit, 807 ... control circuit, 901 ... LAN interface circuit, 9021, 9022, 9023 ... system decoder, 9031, 9032, 9033 ... video expansion circuit, 904 ... image processing circuit, 905 ... OSD circuit, 906 ... reference signal generation circuit, 907 ... control circuit
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
In the prior art documents, there is a problem in that the processing for simultaneously displaying videos received from a plurality of video transmission devices is complicated on the video reception side. The present invention is characterized in that it has an interface that transmits and receives information via a network and a video processing unit that processes video information received by the interface, and in that, when a plurality of pieces of video information are received by the interface, the video processing unit processes these pieces of video information based on a delay time that includes the transmission delay time of the pieces of video information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2012/077824 WO2014068629A1 (fr) | 2012-10-29 | 2012-10-29 | Dispositif de traitement vidéo, dispositif d'émission vidéo, dispositif de réception vidéo, procédé de traitement vidéo, procédé d'émission vidéo et procédé de réception vidéo |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2012/077824 WO2014068629A1 (fr) | 2012-10-29 | 2012-10-29 | Dispositif de traitement vidéo, dispositif d'émission vidéo, dispositif de réception vidéo, procédé de traitement vidéo, procédé d'émission vidéo et procédé de réception vidéo |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014068629A1 (fr) | 2014-05-08 |
Family
ID=50626610
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/077824 WO2014068629A1 (fr) | 2012-10-29 | 2012-10-29 | Dispositif de traitement vidéo, dispositif d'émission vidéo, dispositif de réception vidéo, procédé de traitement vidéo, procédé d'émission vidéo et procédé de réception vidéo |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2014068629A1 (fr) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2005252344A (ja) * | 2004-03-01 | 2005-09-15 | Nippon Telegr & Teleph Corp <Ntt> | Quality improvement method in multipoint communication, terminal for multipoint communication, and quality improvement program for multipoint communication |
- JP2006250638A (ja) * | 2005-03-09 | 2006-09-21 | Matsushita Electric Ind Co Ltd | Video camera with clock synchronization function |
- JP2009152733A (ja) * | 2007-12-19 | 2009-07-09 | Nec Corp | Person identification system, person identification device, person identification method, and person identification program |
- JP2010197320A (ja) * | 2009-02-27 | 2010-09-09 | Sony Corp | Slave device, time synchronization method for slave device, master device, and electronic apparatus system |
- 2012-10-29: WO PCT/JP2012/077824 patent/WO2014068629A1/fr, active Application Filing
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12887752; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 12887752; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: JP |