WO2014068629A1 - Video processing device, video transmission device, video reception device, video processing method, video transmission method, and video reception method - Google Patents

Video processing device, video transmission device, video reception device, video processing method, video transmission method, and video reception method Download PDF

Info

Publication number
WO2014068629A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
reference signal
information
processing
video information
Prior art date
Application number
PCT/JP2012/077824
Other languages
French (fr)
Japanese (ja)
Inventor
佐々本 学
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所 filed Critical 株式会社日立製作所
Priority to PCT/JP2012/077824 priority Critical patent/WO2014068629A1/en
Publication of WO2014068629A1 publication Critical patent/WO2014068629A1/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/44029Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display for generating different versions

Definitions

  • the technical field relates to an apparatus for transmitting video.
  • Patent Document 1 aims to "provide a data transmission system and communication device that can effectively eliminate time mismatch" (see Patent Document 1, paragraph [0006]).
  • It discloses a data transmission system that transmits and receives data between a plurality of communication devices via a network, in which the transmitting side that transmits data adds absolute time information to the data to be transmitted and outputs the data to the network.
  • The receiving side that receives data from the transmitting apparatus outputs each received data with a predetermined delay determined according to the transmission delay time of each data.
  • A predetermined time is added to each transmission delay of the data sent from one or more transmitting sides, so that the delay times become substantially equal (see Patent Document 1, paragraph [0009]).
  • Patent Document 1 has a problem in that, when videos received from a plurality of video transmitting devices are displayed simultaneously, the processing on the video receiving side becomes complicated.
  • the present application includes a plurality of means for solving the above-described problems.
  • the present application includes an interface that transmits and receives information via a network and a video processing unit that processes video information received by the interface.
  • the video processing unit processes the plurality of pieces of video information based on a delay time that includes the transmission delay times of the plurality of pieces of video information.
  • FIG. 1 shows an example of a video transmission system including a camera which is a video transmission device.
  • 1a, 1b, and 1c are cameras.
  • Reference numeral 2 denotes a LAN (Local Area Network)
  • 3 denotes a reference time server
  • the cameras 1a, 1b, and 1c and the reference time server 3 are connected to the LAN 2.
  • Reference numerals 4a, 4b, and 4c denote cameras and 6 denotes a reference time server; these are connected to the LAN 5.
  • 8 is a center server
  • 9 is a reference time server
  • 10 is a controller
  • 11 is a reference time server
  • 12 is a display.
  • the LAN 2, LAN 5, center server 8, and controller 10 are connected via a WAN (Wide Area Network) 7.
  • the system shown in FIG. 1 collects and monitors a plurality of videos from cameras installed in remote places or connected to a LAN installed in a moving body such as a car.
  • the recorded images can be displayed side by side on the display 12.
  • as the protocol used for data transfer between devices, for example, the method defined in the IEEE 802.3 standard may be used as the data link protocol, IP (Internet Protocol) as the network protocol, and TCP (Transmission Control Protocol) or UDP (User Datagram Protocol) as the transport protocol. A higher-layer protocol such as RTP (Real-time Transport Protocol) or HTTP (Hyper Text Transfer Protocol) may also be used.
  • the LAN 2 and LAN 5 are connected to the WAN 7 via gateway devices (not shown), and two or fewer cameras, or four or more cameras, may also be connected.
  • the center server 8 receives video and audio data distributed from each camera, performs signal processing such as synthesis processing, image recognition processing, and voice recognition processing on the videos from the plurality of cameras, and transmits the result to the controller 10.
  • the controller 10 receives the data transmitted from the center server 8 and the video and audio data distributed from each camera, and outputs video and audio to the display 12 and a speaker (not shown).
  • the reference time servers 3, 6, 9, and 11 acquire accurate time using, for example, a GPS (Global Positioning System) or a radio clock, and provide the time information to other devices.
  • the reference time servers 3 and 6 are connected to LAN2 and LAN5, respectively, and provide time information via the network.
  • the reference time servers 9 and 11 supply time information by directly connecting to the center server 8 and the controller 10, for example. Similar to the reference time servers 3 and 6, time information may be supplied via a LAN.
  • since each of the reference time servers 3, 6, 9, and 11 acquires an accurate time, the time information provided to the LAN 2, the LAN 5, the center server 8, and the controller 10 can be synchronized.
  • FIG. 2 is a diagram illustrating an example of an internal block configuration of the cameras 1a, 1b, 1c, 4a, 4b, and 4c that are video transmission apparatuses.
  • Reference numeral 100 denotes a lens, 101 an image sensor, 102 a video compression circuit, 103 a video buffer, 104 a system encoder, 105 a packet buffer, 106 a reference signal generation circuit, 107 a LAN interface circuit, 108 a control circuit, and 109 a memory.
  • the video signal obtained by the image sensor 101 via the lens 100 is input to the video compression circuit 102, color tone and contrast are corrected, and stored in the video buffer 103.
  • the video compression circuit 102 reads out the video data stored in the video buffer 103 and generates compressed encoded data conforming to, for example, the ISO/IEC 13818-2 (commonly known as MPEG-2 Video) MP@ML (Main Profile @ Main Level) standard as the video compression encoding system.
  • other video compression encoding methods, such as the H.264/AVC standard or the JPEG standard, may also be used.
  • the generated compressed encoded video data is input to the system encoder 104.
  • the reference signal generation circuit 106 supplies, for example, a frame pulse indicating a break between video signal frames to the image sensor 101 and the video compression circuit 102 as a reference signal serving as a reference for their processing timing. In accordance with this reference signal, image capture by the image sensor, compression of the captured video, and transmission of the compressed video (described later) are performed.
  • the system encoder 104 and the video compression circuit 102 are described here as hardware. However, these functions can also be realized in software by having the control circuit 108 load a program having the corresponding functions into a memory (not shown) and execute it. In the following, for simplicity of description, the system encoder 104 and the video compression circuit 102 are described as executing these processes, including the case where the control circuit 108 executes the program corresponding to each function.
  • the video compression circuit 102, the video buffer 103, the system encoder 104, the packet buffer 105, the reference signal generation circuit 106, and the control circuit 108 are collectively referred to simply as a video processing unit.
  • a mechanism for synchronizing time between the cameras connected to one LAN is provided, and this reference signal is a signal synchronized between the cameras connected to the LAN.
  • as a method of synchronizing the time, for example, a method based on PTP (Precision Time Protocol), the Precision Clock Synchronization Protocol for Networked Measurement and Control Systems defined in IEEE 1588, can be used.
  • the reference time server is defined as a server for time synchronization, and each camera is defined as a client side that matches the time on the server side.
  • FIG. 3 shows an example of the exchange of message packets for time synchronization performed between the server side and the client side.
  • for time synchronization, the server side transmits the first packet for obtaining synchronization information to the client side at time T1.
  • this packet is called a Sync packet; the LAN interface circuit 107 in FIG. 2, upon receiving it, transmits the packet information to the reference signal generation circuit 106.
  • the reference signal generation circuit 106 acquires the packet transmission time (T1) on the server side described in the Sync packet and the arrival time (T2) of this packet measured by the reference signal generation circuit 106. Next, the reference signal generation circuit 106 generates a packet (DelayReq) to be transmitted from the client to the server and sends it to the LAN interface circuit 107. At this time, the time (T3) at which this packet is transmitted is stored in the DelayReq packet and transmitted to the server side.
  • the server reads the timing (T4) when the DelayReq packet arrives, describes this in the DelayResp packet, and transmits it to the client side.
  • the DelayResp packet that has arrived at the client side is transmitted from the LAN interface circuit 107 to the reference signal generation circuit 106.
  • the reference signal generation circuit 106 obtains time information of T1, T2, T3, and T4.
  • the network delay Tnet can be obtained as Tnet = ((T2 − T1) + (T4 − T3)) / 2,
  • and the clock offset Toffset as Toffset = (T2 − T1) − Tnet.
  • the reference signal generation circuit 106 calculates Toffset by the above calculation each time the T1, T2, T3, and T4 information is obtained. The transmission and reception of the Sync, DelayReq, and DelayResp packets is repeated a plurality of times, Toffset is calculated repeatedly, and the operation clock supply source of the reference signal generation circuit 106 is controlled in the direction that brings Toffset toward 0. For example, when the operation clock supply source is composed of a VCXO (Voltage-Controlled Crystal Oscillator), this control is performed by lowering the control voltage when Toffset is positive, that is, the client clock is fast, and conversely by raising the control voltage when Toffset is negative, that is, the client clock is slow. With this control, the client time comes to coincide with the server time.
  • the client side can obtain the time synchronized with the server side.
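As a rough sketch, the Tnet and Toffset calculation above can be expressed as follows; the function name and the sample timestamps are illustrative and not taken from the IEEE 1588 specification.

```python
def ptp_offset(t1, t2, t3, t4):
    """Estimate one-way network delay (Tnet) and client clock offset
    (Toffset) from one Sync/DelayReq/DelayResp exchange.

    t1: server transmit time of Sync, t2: client receive time of Sync,
    t3: client transmit time of DelayReq, t4: server receive time of
    DelayReq.  Assumes the network path is symmetric.
    """
    tnet = ((t2 - t1) + (t4 - t3)) / 2   # one-way network delay
    toffset = (t2 - t1) - tnet           # > 0 means the client clock is fast
    return tnet, toffset

# Illustrative values: client clock 5 ms fast, one-way delay 10 ms.
tnet, toffset = ptp_offset(t1=0.000, t2=0.015, t3=0.020, t4=0.025)
```

Repeating this exchange and steering Toffset toward zero, as the text describes, disciplines the client clock to the server clock.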
  • in this way, the period of the reference signal can be made to match among the cameras.
  • the phase of the reference signal is aligned, for example, by the center server 8 determining the phase information and transmitting it to each camera via the network.
  • the phase information can be determined, for example, by expressing the rising timing of the reference signal, which has a frequency of, for example, 30 Hz, as a synchronized time.
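Under the assumption that the phase information is simply the synchronized time of one rising edge, a camera could locate the next edge of its own reference signal as sketched below; the 30 Hz period and the function name are illustrative, not taken from the patent.

```python
import math

def next_rising_edge(phase_time, period, now):
    """Given one known rising-edge time of the reference signal (the
    phase information) and its period, return the first rising edge
    at or after the current synchronized time `now`."""
    n = math.ceil((now - phase_time) / period)
    return phase_time + n * period

# 30 Hz reference signal whose rising edge was reported at t = 100.0 s;
# the camera's clock currently reads 100.05 s.
edge = next_rising_edge(100.0, 1 / 30, now=100.05)
```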
  • the compressed encoded video data compressed by the video compression circuit 102 and input to the system encoder 104 is packetized as shown below.
  • FIG. 4 shows an example of digital compression processing.
  • 201 is an intra frame and 202 is an inter frame.
  • the digital compressed video signal has a predetermined number of frames, for example, 15 frames as one sequence, the head of which is an intra frame, and the remaining frames are inter frames compressed using prediction from the intra frame.
  • an intra frame may also be arranged at a position other than the head.
  • only the first frame may be an intra frame, and all subsequent frames may be inter frames, or all the frames may be intra frames.
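The sequence structure described above, a predetermined number of frames headed by an intra frame with the remainder inter frames, can be sketched as a simple labelling rule; this is an illustration, not code from the patent.

```python
def gop_frame_types(num_frames, gop_length=15):
    """Label each frame 'I' (intra) or 'P' (inter) for a stream where
    every sequence of `gop_length` frames starts with one intra frame,
    as in the 15-frame sequence described in the text."""
    return ['I' if i % gop_length == 0 else 'P' for i in range(num_frames)]

# Two 15-frame sequences: frames 0 and 15 are intra, the rest inter.
types = gop_frame_types(30)
```

Setting `gop_length=1` gives the all-intra case also mentioned in the text.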
  • FIG. 5 shows an example of the configuration of the digital compressed video signal.
  • 302 is a picture header added in units of frames
  • 301 is a sequence header added in units of sequences.
  • the sequence header 301 includes information such as a synchronization signal and a transmission rate.
  • the picture header 302 includes a synchronization signal and identification information such as an intra frame or an inter frame. Usually, the length of each data changes with the amount of information.
  • This digital video compression signal is divided into transport packets to be described later to form a packet sequence.
  • FIG. 6 is a configuration example of a transport packet of a digital video compression signal.
  • Reference numeral 40 denotes the transport packet; one packet has a fixed length of, for example, 188 bytes, and is composed of a packet header 401 and packet information 402.
  • the digital compressed video signal described with reference to FIG. 5 is divided and arranged in an area of packet information 402, and the packet header 401 includes information such as the type of packet information.
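A minimal sketch of the split into fixed 188-byte transport packets follows; the 4-byte header value and the 0xFF padding of the final packet are assumptions for illustration only, as the text specifies only that a packet header precedes the packet information.

```python
PACKET_SIZE = 188   # fixed transport packet length from the text
HEADER_SIZE = 4     # assumed header length (illustrative)

def packetize(stream: bytes, header: bytes = b'\x47\x00\x00\x00') -> list:
    """Divide a digital compressed video signal into fixed-length
    transport packets of header + payload, padding the last packet."""
    payload_size = PACKET_SIZE - HEADER_SIZE
    packets = []
    for i in range(0, len(stream), payload_size):
        chunk = stream[i:i + payload_size]
        chunk = chunk.ljust(payload_size, b'\xff')  # pad the final packet
        packets.append(header + chunk)
    return packets

# A 500-byte compressed signal fits in three 188-byte packets.
pkts = packetize(bytes(500))
```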
  • the digital video compression signal packetized by the system encoder 104 is temporarily stored in the packet buffer 105, and the packet string read from the packet buffer 105 is input to the LAN interface circuit 107.
  • the LAN interface circuit 107 in FIG. 2 packetizes the input packet sequence into, for example, a LAN packet conforming to the IEEE 802.3 standard and outputs the packet.
  • FIG. 7 is a diagram illustrating an example of converting the packet sequence generated by the system encoder 104 into a LAN packet.
  • the LAN packet 60 has a variable length of at most 1518 bytes per packet, for example, and includes a LAN packet header 601 and LAN packet information 602.
  • the transport packet 40 generated by the system encoder 104 is stored in the area of the LAN packet information 602 together with a data error detection code in accordance with the network protocol described above; a LAN packet header 601 containing address information on the LAN for identifying each camera is added, and the result is output to the LAN as a LAN packet 60.
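Grouping transport packets into variable-length LAN packets of at most 1518 bytes could look like the sketch below; the Ethernet/IP/UDP header sizes are assumptions, and the error-detection code mentioned in the text is omitted for brevity.

```python
MAX_LAN_PACKET = 1518      # maximum LAN packet size from the text
LAN_HEADER = 14 + 20 + 8   # assumed Ethernet + IP + UDP header bytes
TS_PACKET = 188            # fixed transport packet size

def group_into_lan_packets(transport_packets):
    """Pack as many whole 188-byte transport packets as fit into each
    LAN packet payload without exceeding 1518 bytes total."""
    per_frame = (MAX_LAN_PACKET - LAN_HEADER) // TS_PACKET  # 7 packets
    frames = []
    for i in range(0, len(transport_packets), per_frame):
        frames.append(b''.join(transport_packets[i:i + per_frame]))
    return frames

# Ten transport packets become one full payload of 7 plus one of 3.
frames = group_into_lan_packets([bytes(188)] * 10)
```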
  • the LAN interface circuit 107 also exchanges control information with devices connected to the LAN. This is done by storing information such as instructions from the control circuit 108 in the LAN packet information 602 of a LAN packet 60 to be transmitted, and by extracting information from the LAN packet information 602 of a LAN packet 60 received from the LAN and transmitting it to the control circuit 108. The acquisition of the phase information of the reference signal is also performed here.
  • FIG. 8 is a diagram illustrating an example of an internal block configuration of the center server 8.
  • 801 is a LAN interface circuit
  • 8021 to 8023 are system decoders
  • 8031 to 8033 are video decompression circuits
  • 804 is an image processing circuit
  • 806 is a reference signal generation circuit
  • 807 is a control circuit
  • 808 is a video compression circuit
  • 809 is a video buffer
  • Reference numeral 810 denotes a system encoder
  • 811 denotes a packet buffer.
  • the system decoders 8021 to 8023, the video expansion circuits 8031 to 8033, the image processing circuit 804, the video compression circuit 808, and the system encoder 810 are described as hardware. However, these functions can also be realized in software by having the control circuit 807 load a program having the corresponding functions into a memory (not shown) and execute it. In the following, these blocks are described as executing the processes, including the case where the control circuit 807 executes the program corresponding to each function.
  • the system decoders 8021 to 8023, the video decompression circuits 8031 to 8033, the image processing circuit 804, the reference signal generation circuit 806, the control circuit 807, the video compression circuit 808, the video buffer 809, the packet buffer 811, and the system encoder 810 are collectively referred to simply as a video processing unit.
  • the LAN packet 60 generated by each camera is input to the LAN interface circuit 801.
  • the LAN packet header 601 is removed in the LAN interface circuit 801, and the transport packet 40 is extracted from the LAN packet information 602 according to the network protocol described above.
  • the transport packet 40 is input to the system decoder 8021, and the packet information 402 described above is extracted from the transport packet 40 and combined into the digital compressed video signal shown in FIG.
  • the digital compressed video signal is subjected to expansion processing in a video expansion circuit 8031 and is input to the image processing circuit 804 as a digital video signal.
  • the same processing is performed on the LAN packet 60 input from another camera.
  • digital video signals are input to the image processing circuit 804 from the video expansion circuits 8032 and 8033.
  • the image processing circuit 804 performs image processing such as image quality correction, distortion correction, viewpoint conversion by coordinate replacement, and composition processing on the videos from each camera, as well as recognition of object shapes, distance measurement, and motion detection from the video signals of each camera.
  • the results obtained by performing such image processing are superimposed on the video signal as characters or graphics and output to the video compression circuit 808.
  • the video compression circuit 808 and the video buffer 809 generate a digital video compression signal, and the system encoder 810 and the packet buffer 811 generate transport packets, which are input to the LAN interface circuit 801.
  • the input packet sequence is packetized into, for example, a LAN packet conforming to the IEEE 802.3 standard and output.
  • the reference signal generation circuit 806 supplies a frame pulse indicating a video signal frame break to the image processing circuit 804 as a reference signal serving as a reference for processing timing of the image processing circuit 804.
  • the reference signal is adjusted by the control circuit 807 controlling the reference signal generation circuit 806.
  • in order to exchange control information with each camera, information such as instructions from the control circuit 807 is stored in the LAN packet information 602 and transmitted to each camera, or information is extracted from the LAN packet information 602 of a LAN packet 60 received from each camera and transmitted to the control circuit 807.
  • FIG. 9 is a diagram illustrating an example of an internal block configuration of the controller 10.
  • Reference numeral 901 denotes a LAN interface circuit which distributes the transport packet 40 extracted from the LAN packet information 602 of the LAN packet 60 to the system decoders 9021 to 9023 according to the network protocol described above.
  • the processing of the system decoders 9021 to 9023 and the video decompression circuits 9031 to 9033 is the same as the description of the system decoders 8021 to 8023 and the video decompression circuits 8031 to 8033 in FIG.
  • the LAN interface circuit 901 exchanges control information with each camera and the center server 8: information such as instructions from the control circuit 907 is stored in the LAN packet information 602 of LAN packets to be transmitted, and information is extracted from the LAN packet information 602 of LAN packets 60 received from each camera or the center server 8 and transmitted to the control circuit 907.
  • Digital video signals from the video expansion circuits 9031 to 9033 are input to the image processing circuit 904.
  • the image processing circuit 904 displays a plurality of video signals from each camera side by side or displays a video signal from the center server 8 so that an image processing result in the center server 8 can be observed.
  • the video signal from each camera and the video signal from the center server 8 may be displayed side by side.
  • the OSD circuit 905 superimposes characters and figures on the video signal from the image processing circuit 904 and outputs it to the display 12.
  • system decoders 9021 to 9023, the video decompression circuits 9031 to 9033, and the image processing circuit 904 are described as hardware. However, these functions can also be realized by software by causing the control circuit 907 to develop and execute a program having functions corresponding to each in a memory (not shown).
  • in the following description, the system decoders 9021 to 9023, the video decompression circuits 9031 to 9033, and the image processing circuit 904 are treated as the operating subjects, including the case where the control circuit 907 executes a program corresponding to each function.
  • FIG. 10 is a flowchart of the operation process of each camera in this embodiment.
  • the operation clock is synchronized with the reference time server to synchronize the time (step S101).
  • the phase information of the reference signal that is the frame pulse is extracted by the LAN interface circuit 107 (step S102), and the reference signal is generated based on the phase information (step S103).
  • the captured video signal is compressed (step S104) and transmitted (step S105).
  • FIG. 11 is a flowchart of the operation process of the center server 8 in this embodiment.
  • the operation clock is synchronized with the reference time server to synchronize the time (step S111).
  • a reference signal is generated based on the operation clock (step S112).
  • the reference signal phase information, indicating the generation time of the reference signal as the reference time, is transmitted to each camera and the controller 10 (step S113). Then, the video signal from each camera is received (step S114), decoded (step S115), a second reference signal to be described later is generated (step S116), and the image processing described above is performed (step S117). The resulting video signal is compressed (step S118) and transmitted to the controller 10 (step S119).
  • FIG. 12 is a flowchart of the operation process of the controller 10 in this embodiment.
  • the operation clock is synchronized with the reference time server (step S121), the phase information of the reference signal is extracted by the LAN interface circuit 901 (step S122), and the reference signal is generated based on the phase information (step S123).
  • the video signals from each camera and from the center server 8 are received (step S124), decoded (step S125), image processing by the above-described image processing circuit 904 is performed (step S126), and the result is displayed (step S127).
  • FIG. 13 is a diagram illustrating an example of the transmission processing timing of each camera and the reception processing timing of the center server 8 in the present embodiment.
  • (1-1) to (1-4) are processing timings of the camera 1a
  • (2-1) to (2-4) are processing timings of the camera 4b
  • (3-1) to (3-11) are processing timings of the center server 8.
  • (1-1) is the reference signal a
  • (1-2) is the imaging timing a when imaging processing is performed by the imaging device 101
  • (1-3) is the video compression timing a at which video compression processing is performed by the video compression circuit 102
  • (1-4) is the transmission timing a at which the LAN interface circuit 107 performs transmission processing.
  • a video signal for one frame is processed for each reference signal.
  • the camera 1a uses the reference signal a as a processing reference, starts imaging processing, for example, at the timing of the pulse of the reference signal a, and then sequentially performs video compression processing and transmission processing.
  • a time d1 from the reference signal a to the start of transmission processing at the transmission timing a is a processing delay time.
  • (2-1) is a reference signal b of the camera 4b
  • (2-2) is an imaging timing b at which an imaging process is performed by the imaging device 101 of the camera 4b
  • (2-3) is the video compression timing b at which video compression processing is performed by the video compression circuit 102 of the camera 4b
  • (2-4) is a transmission timing b at which the LAN interface circuit 107 of the camera 4b performs a transmission process.
  • the camera 4b starts the imaging process at the timing of the reference signal b using the reference signal b as a processing reference, and then sequentially performs the video compression process and the transmission process.
  • a time d2 from the reference signal b to the transmission timing b is a processing delay time.
  • the reference signal a of the camera 1a and the reference signal b of the camera 4b are synchronized.
  • (3-1) is the reference signal S of the center server 8
  • (3-2) is the reception timing a when the center server 8 receives the LAN packet from the camera 1a
  • (3-3) is the video expansion timing a at which the video expansion circuit 8031 performs video expansion processing
  • (3-4) is the video output timing a of one frame of the camera 1a video obtained by expansion in the video expansion circuit 8031.
  • the time d3 is the delay time from the reference signal a until the video is output; it is the combination of the delay time d1 described above, the transmission delay time of the packet passing through the WAN, and the reception and expansion processing delay time of the center server 8.
  • (3-5) is the reception timing b when the center server 8 is receiving the LAN packet from the camera 4b
  • (3-6) is the video expansion timing b at which the video expansion circuit 8032 performs video expansion processing
  • (3-7) is the video output timing b of one frame of the camera 4b video obtained by the video expansion circuit 8032.
  • the time d4 is the delay time from the reference signal b until the video is output; it is the combination of the delay time d2 described above, the transmission delay time of the packet via the WAN, and the reception and expansion processing delay time of the center server 8.
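The composition of d3 (and likewise d4) described above is a simple sum of three delays; the millisecond values below are illustrative, not from the patent.

```python
def total_output_delay(camera_delay, wan_delay, server_delay):
    """Delay from a camera's reference signal to the server's video
    output: the camera processing delay (d1 or d2), plus the WAN
    transmission delay, plus the server's reception and expansion
    processing delay.  All values in seconds."""
    return camera_delay + wan_delay + server_delay

# e.g. d3 for camera 1a: 50 ms camera processing, 20 ms WAN transit,
# 30 ms reception and expansion at the center server.
d3 = total_output_delay(0.050, 0.020, 0.030)
```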
  • (3-8) is another reference signal S2 in the center server 8
  • (3-9) is an image processing timing S at which the center server 8 performs image processing by the image processing circuit 804
  • (3-10) is the compression timing S at which the center server 8 generates the digital video compression signal
  • (3-11) is the transmission timing S at which the center server 8 transmits the packetized digital video compression signal.
  • the center server 8 uses the reception timing a from the camera 1a as a reference for processing, and sequentially performs video expansion processing following the reception processing. Similarly, the video expansion process is performed following the reception process from the camera 4b.
  • the transmission timing a of the camera 1a and the transmission timing b of the camera 4b may differ depending on, for example, a difference in video compression processing method.
  • the reception timing a of the video from the camera 1a and the reception timing b of the video from the camera 4b may be different due to a difference in the WAN route. Therefore, the time d3 and the time d4 are different.
  • the second reference signal S2 is generated with reference to the video output timing a, which is the later of the output timings of the expanded received videos.
  • the generation of the reference signal S2 can be realized by generating a signal delayed, with respect to the reference signal S, by Tdelay1, the delay time of the video output timing a, which is the later video output timing.
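In other words, Tdelay1 is governed by the latest-arriving camera video, so that image processing starts only once every camera's frame for the same capture time is available. A one-line sketch of that rule, with illustrative delay values:

```python
def second_reference_delay(output_delays):
    """Tdelay1, the delay of the second reference signal S2 relative to
    the reference signal S, is set by the camera whose video output
    timing is latest among the received videos."""
    return max(output_delays)

# e.g. camera 1a video is ready after d3 = 100 ms, camera 4b after d4 = 80 ms.
tdelay1 = second_reference_delay([0.100, 0.080])
```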
  • (3-8) to (3-11) are processing timings of the center server 8, which are the same as (3-8) to (3-11) in FIG.
  • a time d5 from the reference signal S2 to the start of transmission processing at the transmission timing S is a processing delay time.
  • (4-1) to (4-4) indicate processing timing of the controller 10.
  • (4-1) is a reference signal C of the controller 10
  • (4-2) is a reception timing C at which the controller 10 is receiving a LAN packet from the center server 8
  • (4-3) is the video expansion timing C at which the video expansion circuit 9031 performs video expansion processing
  • (4-4) is the video output timing C of one frame obtained by the video expansion circuit 9031.
  • the generation of the reference signal C can be realized by generating a signal delayed by Tdelay2, the delay time of the video output timing C, with respect to the reference signal S2, that is, a signal delayed by Tdelay1 + Tdelay2 with respect to the reference signal a in FIG.
  • in this way, the time from imaging at each camera to video transmission at the center server can be realized with the shortest delay time achievable between the connected devices. Further, when the center server 8 performs video processing for a plurality of cameras, videos captured at the same time can be processed, and image processing accuracy is improved.
  • the controller 10 has been shown receiving a video signal from the center server 8, but it may also receive and display video signals from each camera together with the video from the center server 8. Moreover, although transmission and reception of video signals has been described, audio signals can be transmitted in the same way.
  • a series of processing from image capturing by each camera to video processing at the center server 8 and video output at the controller 10 can be performed with the shortest possible delay time.
  • video processing at the center server 8 can be performed on videos captured at the same shooting time, and high-precision image processing can be realized.
  • so far, the example in which the reference signal a and the reference signal b of the cameras are synchronized in both period and phase has been described.
  • however, the phases are not necessarily matched.
  • in the present embodiment, assuming such a case, the case where the periods of the reference signal a and the reference signal b match but their phases do not match will be described.
  • each camera, the center server 8, and the controller 10 have a mechanism for synchronizing time.
  • by using this method to periodically synchronize the time between systems and adjusting the oscillation period of the reference signal in each system based on that time, the period of the reference signal can be matched between the systems.
  • FIG. 15 is a diagram showing an example of the processing timing in this embodiment. Since each camera adjusts the oscillation period of its reference signal based on its respective reference time, the times of the cameras coincide, and the periods of the reference signal a and the reference signal b coincide. However, their phases do not necessarily match.
  • the imaging time is stored as imaging time information when the video signal is compressed, for example in the packet header of the above-described transport packet.
  • the center server 8 receives the video signal from each camera and expands it to obtain a digital video signal as in the first embodiment.
  • the center server 8 performs processing based on the reference signal S.
  • the phase relationship between the reference signal of each camera and the reference signal S of the center server 8 is obtained by comparison with the imaging time information stored in the video signal transmitted from each camera; the time phase difference between the center server 8 and the camera 1a is found to be d6, and that between the center server 8 and the camera 4b to be d7. Therefore, the image processing circuit 804 generates the video a-1′ shown in (3-4′), which is the video a-1 shown in (3-4) delayed by the time phase difference d6.
  • this generation can be realized, for example, by obtaining the video a-1′ from the videos a-0 and a-1 through video motion prediction and interpolation processing.
  • similarly, the video b-1′ at the time phase difference d7 after the time T1 is obtained from the videos b-0 and b-1.
  • by treating the obtained videos a-1′ and b-1′ in the subsequent processing as videos captured simultaneously at time T3, the image processing accuracy can be maintained.
  • a-1′ and b-1′ may also be generated from only one frame.
  • since the image processing at the center server is expected to have an increased processing load and a larger processing delay, it is also possible to maintain the processing performance by dynamically increasing the CPU (Central Processing Unit) resources allocated to the image processing.
  • in the second embodiment, even when the phases of the reference signals are not matched, the series of processes from image capturing by each camera to video processing at the center server 8 and video output at the controller 10 can be performed with the shortest possible delay time.
  • video processing at the center server 8 can be performed on videos captured at almost the same time, and image processing can be realized while maintaining high accuracy.
  • the time from imaging to video output can be processed with the shortest delay time achievable between the connected devices, and a system can be constructed that processes videos captured at the same shooting time and improves the image processing accuracy.
  • temperature information and humidity information detected by a plurality of temperature sensors and a plurality of humidity sensors can also be temporally synchronized and displayed on one display.
  • 801: LAN interface circuit; 8021, 8022, 8023: system decoder; 8031, 8032, 8033: video expansion circuit; 804: image processing circuit; 806: reference signal generation circuit; 807: control circuit; 901: LAN interface circuit; 9021, 9022, 9023: system decoder; 9031, 9032, 9033: video expansion circuit; 904: image processing circuit; 905: OSD circuit; 906: reference signal generation circuit; 907: control circuit
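The phase-compensation step described in the bullets above (generating a-1′ from a-0 and a-1, and b-1′ from b-0 and b-1) can be sketched as follows. This is a minimal illustration, not the patent's implementation: a crude linear extrapolation over two frames stands in for full motion prediction and interpolation, and the function name, frame data, and numbers are all hypothetical.

```python
def extrapolate_frame(frame_prev, frame_curr, phase_diff, frame_period):
    """Approximate the frame at (time of frame_curr + phase_diff) from two
    consecutive frames, assuming locally linear change in pixel values.
    0 <= phase_diff < frame_period; frames are 2-D lists of pixel values."""
    w = phase_diff / frame_period  # fraction of a frame period to advance
    return [
        [c + w * (c - p) for p, c in zip(row_p, row_c)]
        for row_p, row_c in zip(frame_prev, frame_curr)
    ]

# Hypothetical 2x2 grayscale frames a-0 and a-1, and phase difference d6:
a0 = [[0.0, 10.0], [20.0, 30.0]]
a1 = [[8.0, 18.0], [28.0, 38.0]]
d6, frame_period = 10.0, 33.3  # milliseconds, illustrative values
a1_prime = extrapolate_frame(a0, a1, d6, frame_period)
```

Each pixel of a-1′ is shifted forward along the straight line through its values in a-0 and a-1, which approximates the frame delayed by d6 when motion is locally smooth.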

Abstract

In the prior art, there is a problem in that the processing on the video reception side for simultaneously displaying videos received from multiple video transmission devices is complex. The present invention is characterized by having an interface that transmits and receives information via a network, and a video processing unit that processes video information received by the interface; when multiple pieces of video information are received by the interface, the video processing unit processes the multiple pieces of video information on the basis of a delay time that includes the transmission delay time of the multiple pieces of video information.

Description

Video processing device, video transmission device, video reception device, video processing method, video transmission method, and video reception method
The technical field relates to an apparatus for transmitting video.
With regard to the above technical field, for example, Patent Document 1 aims at "presenting a data transmission system and communication device that can effectively eliminate temporal mismatch" (see Patent Document 1, paragraph [0006]), and as the solution describes "a data transmission system that transmits and receives data between a plurality of communication devices via a network, in which the transmitting side adds absolute time information to the data to be transmitted and outputs the data to the network; the receiving side, which receives data from one or more transmitting devices, outputs each received piece of data after a predetermined delay determined according to its transmission delay time; the predetermined time determined according to the transmission delay time of each piece of data is added to the respective transmission delays of the one or more pieces of data sent from the one or more transmitting sides so as to make the delay times substantially equal" (see Patent Document 1, paragraph [0009]).
Patent Document 1: JP-A-9-51515
However, the technique described in Patent Document 1 has a problem in that the processing on the video receiving side for simultaneously displaying the videos received from a plurality of video transmitting devices is complicated.
In order to solve the above problems, for example, the configuration described in the claims is adopted.
The present application includes a plurality of means for solving the above problems. As one example, it includes an interface that transmits and receives information via a network and a video processing unit that processes video information received by the interface, and is characterized in that, when a plurality of pieces of video information are received by the interface, the video processing unit processes the plurality of pieces of video information based on a delay time including the transmission delay time of the plurality of pieces of video information.
According to the present invention, it is possible to provide a video transmission system that takes the delay time into consideration.
A diagram showing an example of a video transmission system including a video transmission device and a video reception device.
A diagram showing an example of the internal block configuration of the video transmission device.
A diagram showing an example of the packet transmission and reception performed for time synchronization.
A diagram showing an example of the digital compression processing of the video transmission device.
A diagram showing an example of the digitally compressed video signal of the video transmission device.
A diagram showing an example of a packet of the digitally compressed video signal of the video transmission device.
A diagram showing an example of a LAN packet of the video transmission device.
A diagram showing an example of the internal block configuration of the video reception device.
A diagram showing another example of the internal block configuration of the video reception device.
A diagram showing an example of a flowchart of the transmission processing of the video transmission device.
A diagram showing an example of a flowchart of the transmission and reception processing of the video processing device.
A diagram showing an example of a flowchart of the reception processing of the video reception device.
A diagram showing an example of the transmission processing timing of the video transmission device and the processing timing of the video processing device.
A diagram showing an example of the processing timing of the video processing device and the reception processing timing of the video reception device.
A diagram showing another example of the transmission processing timing of the video transmission device and the processing timing of the video processing device.
FIG. 1 shows an example of a video transmission system including cameras as video transmission devices. In FIG. 1, 1a, 1b, and 1c are cameras, 2 is a LAN (Local Area Network), and 3 is a reference time server; the cameras 1a, 1b, and 1c and the reference time server 3 are connected to the LAN 2. Likewise, 4a, 4b, and 4c are cameras and 6 is a reference time server, all connected to the LAN 5. 8 is a center server, 9 is a reference time server, 10 is a controller, 11 is a reference time server, and 12 is a display. The LAN 2, the LAN 5, the center server 8, and the controller 10 are connected via the WAN 7, a WAN (Wide Area Network). The system of FIG. 1 collects and monitors the videos of a plurality of cameras, for example cameras installed at remote sites or LAN-connected cameras installed in a moving body such as a car, and the videos captured by the cameras can be displayed side by side on the display 12.
As the protocol used for data transfer between devices, for example, the method defined in the IEEE 802.3 standard, which is a data link protocol, may be used. Furthermore, IP (Internet Protocol) may be used as the network protocol, with TCP (Transmission Control Protocol) and UDP (User Datagram Protocol) as the higher-level transport protocols.
For the transmission of video and audio, still higher-level application protocols such as RTP (Real-time Transport Protocol) and HTTP (Hyper Text Transfer Protocol) are used. Other protocol methods defined in the IEEE 802.3 standard may also be used.
As for the configuration of the LAN 2 and the LAN 5, each is connected to the WAN 7 via a gateway device (not shown), and two or fewer, or four or more, cameras can also be connected.
The center server 8 receives the video and audio data distributed from each camera, performs signal processing such as composition of multiple camera videos, image recognition, and audio recognition, and transmits the results to the controller 10.
The controller 10 receives the data transmitted from the center server 8 and the video and audio data distributed from each camera, and outputs the video and audio to the display 12 and to a speaker (not shown), respectively.
The reference time servers 3, 6, 9, and 11 acquire an accurate time using, for example, GPS (Global Positioning System) or a radio clock, and provide that time information to other devices. The reference time servers 3 and 6 are connected to the LAN 2 and the LAN 5, respectively, and provide the time information via the network. The reference time servers 9 and 11 supply the time information, for example, by being directly connected to the center server 8 and the controller 10; as with the reference time servers 3 and 6, they may instead supply it via a LAN. In this embodiment, since each of the reference time servers 3, 6, 9, and 11 acquires an accurate time, the time information provided to each of the LAN 2, the LAN 5, the center server 8, and the controller 10 can be synchronized.
FIG. 2 is a diagram showing an example of the internal block configuration of the cameras 1a, 1b, 1c, 4a, 4b, and 4c, which are video transmission devices. 100 is a lens, 101 is an image sensor, 102 is a video compression circuit, 103 is a video buffer, 104 is a system encoder, 105 is a packet buffer, 106 is a reference signal generation circuit, 107 is a LAN interface circuit, 108 is a control circuit, and 109 is a memory.
The video signal obtained by the image sensor 101 through the lens 100 is input to the video compression circuit 102, subjected to color tone and contrast correction, and stored in the video buffer 103. Next, the video compression circuit 102 reads out the video data stored in the video buffer 103 and generates compressed encoded data conforming, for example, to the ISO/IEC 13818-2 (commonly known as MPEG-2 Video) MP@ML (Main Profile @ Main Level) standard as the video compression encoding method. Alternatively, the H.264/AVC standard or the JPEG standard may be used as the video compression encoding method.
Cameras using different video compression encoding methods may also be mixed, or a single camera may select and switch among video compression encoding methods. The generated compressed encoded video data is input to the system encoder 104. The reference signal generation circuit 106 supplies, for example, a frame pulse indicating the boundary of a video signal frame to the image sensor 101 and the video compression circuit 102 as a reference signal that serves as the reference for their processing timing. In accordance with this reference signal, imaging by the image sensor, compression of the captured video, and transmission of the compressed video (described later) are performed.
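As a rough sketch of how the reference signal paces the camera pipeline described above (imaging, compression, transmission), the loop below models one frame pulse per iteration. All names and the string stand-ins for image data are hypothetical; in a real device the hardware frame pulse, not a generator, would drive each stage.

```python
def frame_pulses(n_frames):
    """Stand-in for the reference signal generation circuit 106:
    one pulse per video frame boundary."""
    for frame_no in range(n_frames):
        yield frame_no

def run_camera_pipeline(n_frames):
    """Each pulse triggers capture, compression, and packetization in order."""
    sent = []
    for frame_no in frame_pulses(n_frames):
        raw = f"frame{frame_no}"            # image sensor 101 output
        compressed = f"enc({raw})"          # video compression circuit 102
        packet = f"pkt({compressed})"       # system encoder 104
        sent.append(packet)                 # handed to LAN interface circuit 107
    return sent

packets = run_camera_pipeline(3)
```

Because every stage runs off the same pulse, each frame moves through capture, compression, and packetization with a fixed, known timing relationship to the reference signal.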
In the description of FIG. 2, the system encoder 104 and the video compression circuit 102 are described as hardware. However, each of these functions can also be realized in software by the control circuit 108 loading a program having the corresponding function into a memory (not shown) and executing it. In the following, for simplicity of description, the system encoder 104 and the video compression circuit 102 are described as the subjects executing each process, including the case where the control circuit 108 executes the corresponding programs.
The video compression circuit 102, the video buffer 103, the system encoder 104, the packet buffer 105, the reference signal generation circuit 106, and the control circuit 108 are also collectively referred to simply as a video processing unit.
In this embodiment, a mechanism is provided for synchronizing the time between the cameras connected within one LAN, and this reference signal is synchronized between the cameras connected within that LAN. As the method of synchronizing the time, for example, PTP (Precision Time Protocol), described in the Precision Clock Synchronization Protocol for Networked Measurement and Control Systems defined in IEEE 1588, can be used.
In the LAN 2 described above, the accurate time information obtained by the reference time server 3 from GPS or a radio clock is provided to the cameras 1a, 1b, and 1c via the LAN 2, the time is periodically synchronized between the cameras, and that time is used to adjust the oscillation period of the operation clock of the reference signal generation circuit 106 using, for example, a PLL (Phase-Locked Loop). In this way, the period of the reference signal can be matched between the cameras. This time synchronization method is described below.
In this embodiment, the reference time server is defined as the server for time synchronization, and each camera is defined as a client that adjusts its time to that of the server.
FIG. 3 shows the packet transmission and reception performed by the server side and the client side to achieve time synchronization, as an example of the exchange of message packets between them.
In FIG. 3, to establish time synchronization, the server side transmits the first packet for obtaining synchronization information to the client side at time T1. This packet is called a Sync packet, and the LAN interface circuit 107 of FIG. 2 that receives it passes the packet information to the reference signal generation circuit 106.
The reference signal generation circuit 106 obtains the server-side packet transmission time (T1) described in the Sync packet and the arrival time (T2) of this packet as measured by the reference signal generation circuit 106. Next, the reference signal generation circuit 106 generates a packet (DelayReq) to be transmitted from the client to the server and sends it to the LAN interface circuit 107. At this time, the transmission time (T3) of this packet is stored in the DelayReq packet and transmitted to the server side.
The server reads the timing (T4) at which the DelayReq packet arrived, describes it in a DelayResp packet, and transmits it to the client side. The DelayResp packet arriving at the client side is passed from the LAN interface circuit 107 to the reference signal generation circuit 106. Through the above process, the reference signal generation circuit 106 obtains the time information T1, T2, T3, and T4.
Considering the network transmission delay Tnet and the difference Toffset between the reference times of the two devices (client time minus server time), the time differences during packet transmission and reception between the server and the client are T2 - T1 = Tnet + Toffset and T4 - T3 = Tnet - Toffset (assuming that the network transmission delay between the server and the client is the same in both directions). Therefore, Tnet = (T2 - T1 + T4 - T3) / 2 and Toffset = T2 - T1 - Tnet.
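The Tnet/Toffset computation above can be written down directly. The sketch below uses illustrative timestamps and assumes only the symmetric-delay condition stated in the text:

```python
def ptp_estimate(t1, t2, t3, t4):
    """Return (Tnet, Toffset) from the four PTP timestamps, assuming the
    upstream and downstream network transmission delays are equal."""
    tnet = (t2 - t1 + t4 - t3) / 2.0   # Tnet = (T2 - T1 + T4 - T3) / 2
    toffset = (t2 - t1) - tnet         # Toffset = T2 - T1 - Tnet
    return tnet, toffset

# Illustrative times: Sync sent at T1, received at T2; DelayReq sent at T3,
# received at T4. Here the client clock runs 0.2 s ahead of the server.
tnet, toffset = ptp_estimate(100.0, 100.7, 101.0, 101.3)
# tnet is approximately 0.5 s one-way delay, toffset approximately 0.2 s
```

Note that a single asymmetric queueing delay would bias the estimate, which is one reason the exchange is repeated several times as described next.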
When the information T1, T2, T3, and T4 has been obtained, the reference signal generation circuit 106 calculates Toffset by the above calculation. The transmission and reception of the Sync, DelayReq, and DelayResp packets is repeated a plurality of times, Toffset is calculated several times, and the operation clock supply source of the reference signal generation circuit 106 is controlled in the direction that brings Toffset toward 0. For example, when the operation clock supply source is a VCXO (Voltage-Controlled Crystal Oscillator), this control lowers the control voltage when Toffset is positive and the clock needs to be slowed, and conversely raises the control voltage when Toffset is negative and the clock needs to be sped up. Through this control, the client time comes to coincide with the server time.
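The control direction described above can be sketched as a simple proportional update of an assumed VCXO control voltage; the gain and the voltage values are made-up illustrative numbers, not from the patent:

```python
def vcxo_control_step(control_voltage, toffset, gain=0.05):
    """Positive Toffset (client ahead) -> lower the voltage to slow the clock;
    negative Toffset (client behind) -> raise the voltage to speed it up."""
    return control_voltage - gain * toffset

v = 1.65  # assumed nominal control voltage in volts
measurements = [0.2, 0.1, 0.04, -0.01]  # successive Toffset estimates
for toffset in measurements:
    v = vcxo_control_step(v, toffset)
```

As the repeated measurements shrink toward 0, the correction applied to the voltage shrinks with them, so the client clock frequency settles onto the server's.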
By providing feedback control according to the absolute value of Toffset in this way, the frequency can be made to converge to one synchronized with the server side, and the client side can obtain a time synchronized with the server side. The period of the reference signal can thus be matched among the plurality of cameras. Furthermore, by determining the phase of the reference signal with respect to the synchronized time and having each of the cameras generate its reference signal according to that phase, the period and phase of the reference signals of the cameras can both be matched. The phase of the reference signal is determined, for example, by the center server 8, which transmits the phase information to each camera via the network. The phase information can be specified, for example, by expressing the rising edge timing of the reference signal, with its period of 30 cycles, in terms of the synchronized time.
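How a camera could turn such phase information into its local reference signal timing can be sketched as follows. This assumes the phase information is conveyed as the synchronized-time instant of one rising edge plus the signal period, which is consistent with, but more specific than, the text:

```python
import math

def next_rising_edge(now, edge_origin, period):
    """Return the first reference-signal rising edge at or after `now`,
    given one known edge time (edge_origin) and the period, all expressed
    in the common synchronized timebase."""
    cycles = math.ceil((now - edge_origin) / period)
    return edge_origin + cycles * period

# 30 cycles per second -> period 1/30 s; illustrative synchronized times
edge = next_rising_edge(now=1000.05, edge_origin=1000.0, period=1.0 / 30.0)
```

Since every camera evaluates this against the same synchronized timebase, each computes edges on the same grid, so their reference signals agree in both period and phase.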
Next, the processing of the video data will be described. The compressed encoded video data compressed by the video compression circuit 102 and input to the system encoder 104 is packetized as shown below.
FIG. 4 shows an example of the digital compression processing: the relationship between intra frame data, compressed frame by frame, and inter frame data, in which only difference information is compressed using prediction based on the data of preceding and following frames. 201 is an intra frame and 202 is an inter frame. The digitally compressed video signal takes a predetermined number of frames, for example 15 frames, as one sequence; the head of the sequence is an intra frame, and the remaining frames are inter frames compressed using prediction from the intra frame. Of course, intra frames may also be placed at positions other than the head. Alternatively, only the first frame may be an intra frame and all subsequent frames inter frames, or all frames may be intra frames.
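The sequence structure just described (a 15-frame sequence headed by an intra frame, the rest inter frames) can be illustrated as follows; the function name is hypothetical, and the labels 'I' and 'P' are shorthand for intra and inter frames:

```python
def frame_types(n_frames, sequence_length=15):
    """Label each frame: the first frame of every sequence is an intra
    frame ('I'); the remaining frames of the sequence are inter frames ('P')."""
    return ["I" if i % sequence_length == 0 else "P" for i in range(n_frames)]

types = frame_types(16)
# the 1st and 16th frames are intra frames; the rest are inter frames
```

Setting `sequence_length=1` reproduces the all-intra variant mentioned above, while a very large `sequence_length` models the case of a single leading intra frame.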
FIG. 5 shows an example of the structure of the digitally compressed video signal. 302 is a picture header added for each frame, and 301 is a sequence header added for each sequence. The sequence header 301 consists of information such as a synchronization signal and the transmission rate. The picture header 302 consists of a synchronization signal, identification information indicating whether the frame is an intra frame or an inter frame, and the like. Usually, the length of each data portion varies with the amount of information. This digitally compressed video signal is divided into the transport packets described below to form a packet sequence.
FIG. 6 shows a configuration example of a transport packet of the digitally compressed video signal. 40 is the transport packet; one packet has a fixed length of, for example, 188 bytes and consists of a packet header 401 and packet information 402. The digitally compressed video signal described with reference to FIG. 5 is divided and placed in the packet information 402 areas, and the packet header 401 consists of information such as the type of the packet information.
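Splitting the compressed stream into fixed 188-byte transport packets can be sketched as below. The 4-byte header with a 0x47 sync byte follows common MPEG-2 transport stream practice, but the header contents and the padding byte here are simplified placeholders, not the patent's format:

```python
def to_transport_packets(stream: bytes, packet_size=188, header_size=4):
    """Divide a compressed video byte stream into fixed-length packets of
    packet_size bytes: header (packet header 401) + payload chunk (packet
    information 402), padding the final chunk to keep every packet fixed."""
    payload_size = packet_size - header_size
    packets = []
    for n, i in enumerate(range(0, len(stream), payload_size)):
        chunk = stream[i:i + payload_size].ljust(payload_size, b"\xff")
        header = bytes([0x47, 0x00, 0x00, n % 256])  # simplified header
        packets.append(header + chunk)
    return packets

packets = to_transport_packets(b"\x01" * 400)  # 400 bytes -> 3 packets
```

The fixed packet length is what lets the receiver resynchronize cheaply: every 188th byte is a packet boundary, regardless of how the variable-length frames fell across packets.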
The digitally compressed video signal packetized by the system encoder 104 is temporarily stored in the packet buffer 105, and the packet sequence read out from the packet buffer 105 is input to the LAN interface circuit 107.
The LAN interface circuit 107 of FIG. 2 packetizes the input packet sequence into LAN packets conforming, for example, to the IEEE 802.3 standard and outputs them.
FIG. 7 is a diagram showing an example of packetizing the packet sequence generated by the system encoder 104 into LAN packets. A LAN packet 60 has a variable length of, for example, at most 1518 bytes per packet and consists of a LAN packet header 601 and LAN packet information 602. The transport packets 40 generated by the system encoder 104 are stored, together with a data error detection code and the like, in the LAN packet information 602 area in accordance with the network protocol described above; a LAN packet header 601, which stores information such as the address information on the LAN for identifying each camera, is added, and the result is output to the LAN as the LAN packet 60.
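Grouping transport packets into LAN packets of at most 1518 bytes can be sketched as below. The 14-byte header size matches a basic Ethernet header, but the zero-filled header and the omission of the IP/UDP layers and the error detection code are simplifications, not the patent's format:

```python
def to_lan_packets(transport_packets, max_lan_packet=1518, header_size=14):
    """Pack whole 188-byte transport packets into LAN packets (LAN packet
    header 601 + LAN packet information 602) no larger than max_lan_packet."""
    per_lan_packet = (max_lan_packet - header_size) // 188  # 8 packets here
    lan_packets = []
    for i in range(0, len(transport_packets), per_lan_packet):
        payload = b"".join(transport_packets[i:i + per_lan_packet])
        header = b"\x00" * header_size  # stand-in for LAN packet header 601
        lan_packets.append(header + payload)
    return lan_packets

tp = [bytes([0x47]) + b"\x00" * 187] * 9   # nine 188-byte transport packets
lan = to_lan_packets(tp)                    # packed into two LAN packets
```

Only whole transport packets are placed in a LAN packet, so the receiver can recover 188-byte boundaries without extra framing inside the payload.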
The LAN interface circuit 107 also exchanges information for control with the devices connected to the LAN. This is done by storing information such as instructions from the control circuit 108 in the LAN packet information 602 and transmitting it onto the LAN, or by extracting information from the LAN packet information 602 of a LAN packet 60 received from the LAN and passing it to the control circuit 108. The acquisition of the phase information of the reference signal described above is also performed here.
FIG. 8 is a diagram showing an example of the internal block configuration of the center server 8. 801 is a LAN interface circuit, 8021 to 8023 are system decoders, 8031 to 8033 are video expansion circuits, 804 is an image processing circuit, 806 is a reference signal generation circuit, 807 is a control circuit, 808 is a video compression circuit, 809 is a video buffer, 810 is a system encoder, and 811 is a packet buffer.
In the description of FIG. 8, the system decoders 8021 to 8023, the video expansion circuits 8031 to 8033, the image processing circuit 804, the video compression circuit 808, and the system encoder 810 are described as hardware. However, each of these functions can also be realized in software by the control circuit 807 loading a program having the corresponding function into a memory (not shown) and executing it. In the following, for simplicity of description, the system decoders 8021 to 8023, the video expansion circuits 8031 to 8033, the image processing circuit 804, the video compression circuit 808, and the system encoder 810 are described as the subjects executing each process, including the case where the control circuit 807 executes the corresponding programs. The system decoders 8021 to 8023, the video expansion circuits 8031 to 8033, the image processing circuit 804, the reference signal generation circuit 806, the control circuit 807, the video compression circuit 808, the video buffer 809, the packet buffer 811, and the system encoder 810 are also collectively referred to simply as a video processing unit.
 The LAN packets 60 generated by each camera are input to the LAN interface circuit 801. In the LAN interface circuit 801, the LAN packet header 601 is removed from each LAN packet 60 received from a camera, and the transport packets 40 are extracted from the LAN packet data 602 in accordance with the network protocol described above. The transport packets 40 are input to the system decoder 8021, where the packet information 402 described above is extracted and concatenated into the digital compressed video signal shown in FIG. 5.
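As an illustration, the depacketization path just described (LAN packet 60, transport packets 40, packet information 402) can be sketched as follows. This is a minimal sketch; the dictionary field names ("data", "payload") are assumptions made for this sketch, not identifiers from the embodiment.

```python
# Hypothetical sketch of the depacketization performed by the LAN interface
# circuit 801 and the system decoder 8021. Field names are illustrative
# assumptions.

def extract_compressed_stream(lan_packets):
    """Strip the LAN packet header 601, take the transport packets 40 from
    the LAN packet data 602, and concatenate their packet information 402
    into the digital compressed video signal."""
    stream = b""
    for lan_packet in lan_packets:
        transport_packets = lan_packet["data"]  # LAN packet data 602
        for tp in transport_packets:
            stream += tp["payload"]             # packet information 402
    return stream
```

The same routine is applied per camera, with one system decoder instance (8021 to 8023) handling each stream.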
 This digital compressed video signal is decompressed in the video decompression circuit 8031 and input to the image processing circuit 804 as a digital video signal. The same processing is performed on the LAN packets 60 input from the other cameras; for example, digital video signals are input to the image processing circuit 804 from the video decompression circuits 8032 and 8033. The image processing circuit 804 applies image quality correction, distortion correction, viewpoint conversion by coordinate replacement, composition processing, and the like to the video signals from the cameras, or performs image processing such as object shape recognition, distance measurement, and motion detection using the video signals from the cameras.
 Further, for example, the results obtained by such image processing are superimposed on the video signal as characters or graphics and output to the video compression circuit 808. As in the description of FIG. 2, the video compression circuit 808 and the video buffer 809 generate a digital compressed video signal, and the system encoder 810 and the packet buffer 811 generate transport packets, which are input to the LAN interface circuit 801. The LAN interface circuit 801 packetizes the input packet sequence into LAN packets conforming to, for example, the IEEE 802.3 standard and outputs them.
 The reference signal generation circuit 806 supplies the image processing circuit 804 with a frame pulse indicating the boundary between video signal frames as a reference signal that serves as the basis for the processing timing of the image processing circuit 804. This reference signal is adjusted by the control circuit 807 controlling the reference signal generation circuit 806.
 Further, in order to exchange control information with each camera, the LAN interface circuit 801 stores information such as instructions from the control circuit 807 in the LAN packet information 602 and transmits it to each camera, or extracts information from the LAN packet information 602 of the LAN packets 60 received from each camera and conveys it to the control circuit 807.
 FIG. 9 is a diagram illustrating an example of the internal block configuration of the controller 10. Reference numeral 901 denotes a LAN interface circuit, which distributes the transport packets 40 extracted from the LAN packet information 602 of the LAN packets 60 to the system decoders 9021 to 9023 in accordance with the network protocol described above. The processing of the system decoders 9021 to 9023 and the video decompression circuits 9031 to 9033 is the same as that described for the system decoders 8021 to 8023 and the video decompression circuits 8031 to 8033 in FIG. 8.
 Further, in order to exchange control information with each camera and the center server 8, the LAN interface circuit 901 stores information such as instructions from the control circuit 907 in the LAN packet information 602 and transmits it to each camera and the center server 8, or extracts information from the LAN packet information 602 of the LAN packets 60 received from them and conveys it to the control circuit 907. The digital video signals from the video decompression circuits 9031 to 9033 are input to the image processing circuit 904. The image processing circuit 904 makes it possible to display a plurality of video signals from the cameras side by side at the same time, or to display the video signal from the center server 8 so that the result of the image processing in the center server 8 can be observed. The video signals from the cameras and the video signal from the center server 8 may also be displayed side by side. The OSD circuit 905 superimposes characters and graphics on the video signal from the image processing circuit 904 and outputs the result to the display 6.
 In the description of FIG. 9, the system decoders 9021 to 9023, the video decompression circuits 9031 to 9033, and the image processing circuit 904 are described as hardware. However, each of these functions can also be realized in software by having the control circuit 907 load a program implementing the corresponding function into a memory (not shown) and execute it. In the following, for simplicity of explanation, the system decoders 9021 to 9023, the video decompression circuits 9031 to 9033, and the image processing circuit 904 are described as the agents that execute each process, including the case where the control circuit 907 executes the corresponding programs.
 FIG. 10 is a flowchart of the operation processing of each camera in this embodiment. As described above, each camera synchronizes its operation clock, and hence its time, with the reference time server (step S101). Next, the phase information of the reference signal, i.e. the frame pulse described above, is extracted by the LAN interface circuit 107 (step S102), and the reference signal is generated based on that phase information (step S103). Then, in accordance with the reference signal, the captured video signal is compressed (step S104) and transmitted (step S105).
 FIG. 11 is a flowchart of the operation processing of the center server 8 in this embodiment. First, the center server 8 synchronizes its operation clock, and hence its time, with the reference time server (step S111). Next, a reference signal is generated based on the operation clock (step S112). Further, phase information of the reference signal, for example the generation time of the reference signal expressed in the reference time, is transmitted to each camera and the controller 10 (step S113). Then, the video signals from the cameras are received (step S114) and decoded (step S115), a second reference signal described later is generated (step S116), the image processing described above is applied (step S117), and the resulting video signal is compressed (step S118) and transmitted to the controller 10 (step S119).
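The camera-side reproduction of the reference signal from the distributed phase information (steps S102 to S103) can be sketched as follows. This is a minimal sketch under stated assumptions: the fixed 30 fps frame-pulse period and the function names are illustrative, not part of the embodiment.

```python
FRAME_PERIOD = 1.0 / 30.0  # assumed frame-pulse period (30 fps), an illustration


def next_reference_pulse(phase_info, now, period=FRAME_PERIOD):
    """Step S103 (sketch): reproduce the reference signal from the received
    phase information, i.e. the synchronized-clock time of one reference
    pulse, by returning the time of the next pulse at or after `now`."""
    if now <= phase_info:
        return phase_info
    elapsed_pulses = int((now - phase_info) // period) + 1
    return phase_info + elapsed_pulses * period
```

For example, given phase information stating that a pulse occurred at synchronized time 10.0 s with a 0.1 s period, a camera whose synchronized clock reads 10.25 s would align its next imaging cycle to 10.3 s.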
 FIG. 12 is a flowchart of the operation processing of the controller 10 in this embodiment.
 Similarly, the controller 10 synchronizes its operation clock with the reference time server (step S121), extracts the phase information of the reference signal with the LAN interface circuit 901 (step S122), and generates the reference signal based on that phase information (step S123). Then, the video signals from the cameras and the center server 8 are received (step S124) and decoded (step S125), the image processing by the image processing circuit 904 described above is performed (step S126), and the result is displayed (step S127).
 FIG. 13 is a diagram illustrating an example of the transmission processing timing of each camera and the reception processing timing of the center server 8 in this embodiment. In the figure, (1-1) to (1-4) indicate the processing timing of the camera 1a, (2-1) to (2-4) indicate the processing timing of the camera 4b, and (3-1) to (3-11) indicate the processing timing of the center server 8.
 (1-1) is the reference signal a; (1-2) is the imaging timing a at which the imaging device 101 performs imaging processing; (1-3) is the video compression timing a at which the video compression circuit 102 performs video compression processing; and (1-4) is the transmission timing a at which the LAN interface circuit 107 performs transmission processing. Here, one frame of the video signal is processed for each reference signal pulse. Using the reference signal a as its processing reference, the camera 1a starts imaging processing, for example, at the timing of a pulse of the reference signal a, and then performs video compression processing and transmission processing in order. In the camera 1a, the time d1 from the reference signal a to the start of transmission processing at the transmission timing a is the processing delay time.
 Similarly, (2-1) is the reference signal b of the camera 4b; (2-2) is the imaging timing b at which the imaging device 101 of the camera 4b performs imaging processing; (2-3) is the video compression timing b at which the video compression circuit 102 performs video compression processing; and (2-4) is the transmission timing b at which the LAN interface circuit 107 of the camera 4b performs transmission processing. Using the reference signal b as its processing reference, the camera 4b starts imaging processing at the timing of the reference signal b, and then performs video compression processing and transmission processing in order. In the camera 4b, the time d2 from the reference signal b to the transmission timing b is the processing delay time. As described above, the reference signal a of the camera 1a and the reference signal b of the camera 4b are synchronized.
 Next, (3-1) is the reference signal S of the center server 8; (3-2) is the reception timing a at which the center server 8 performs reception processing of the LAN packets from the camera 1a; (3-3) is the video decompression timing a at which the video decompression circuit 8031 performs video decompression processing; and (3-4) is the video output timing a of one frame of the camera 1a obtained by decompression in the video decompression circuit 8031. The time d3 is the delay time from the reference signal a until the video is output, and is the sum of the delay time d1 described above, the transmission delay time of the packets passing through the WAN, and the reception and decompression processing delay times of the center server 8.
 Similarly, (3-5) is the reception timing b at which the center server 8 performs reception processing of the LAN packets from the camera 4b; (3-6) is the video decompression timing b at which the video decompression circuit 8032 performs video decompression processing; and (3-7) is the video output timing b of one frame of the camera 4b obtained by decompression in the video decompression circuit 8032. Like d3, the time d4 is the delay time from the reference signal b until the video is output, and is the sum of the delay time d2 described above, the transmission delay time of the packets passing through the WAN, and the reception and decompression processing delay times of the center server 8.
 Further, (3-8) is another reference signal S2 in the center server 8; (3-9) is the image processing timing S at which the center server 8 performs image processing in the image processing circuit 804; (3-10) is the compression timing S at which the center server 8 generates the digital compressed video signal; and (3-11) is the transmission timing S at which the center server 8 transmits the packetized digital compressed video signal.
 The center server 8 uses the reception timing a from the camera 1a as its processing reference and performs video decompression processing following the reception processing. Similarly, it performs video decompression processing following the reception processing from the camera 4b. Here, the transmission timing a of the camera 1a and the transmission timing b of the camera 4b may differ, for example, due to differences in the video compression processing method. In addition, the reception timing a of the video from the camera 1a and the reception timing b of the video from the camera 4b may differ due to differences in the WAN route and the like. Consequently, the time d3 and the time d4 differ.
 Therefore, in this embodiment, the second reference signal S2 is generated with reference to the later video output timing at which the received video is decompressed and output, namely the video output timing a. The reference signal S2 can be generated as a signal delayed from the reference signal S by Tdelay1, the delay time of the later video output timing a. By performing the image processing, compression processing, and transmission processing with this reference signal S2 as the reference, the time from imaging at each camera to video transmission at the center server can be realized with the shortest delay time achievable between the connected devices.
 FIG. 14 is a diagram illustrating an example of the transmission processing timing of the center server 8 and the reception processing timing of the controller 10 in this embodiment. In the figure, (3-8) to (3-11) indicate the processing timing of the center server 8 and are the same as (3-8) to (3-11) in FIG. 13. The time d5 from the reference signal S2 to the start of transmission processing at the transmission timing S is the processing delay time.
 (4-1) to (4-4) indicate the processing timing of the controller 10. (4-1) is the reference signal C of the controller 10; (4-2) is the reception timing C at which the controller 10 performs reception processing of the LAN packets from the center server 8; (4-3) is the video decompression timing C at which the video decompression circuit 9031 performs video decompression processing; and (4-4) is the video output timing C of one frame obtained by decompression in the video decompression circuit 9031. Here, the reference signal C can be generated as a signal delayed from the reference signal S2 by Tdelay2, the delay time of the video output timing C, that is, a signal delayed by Tdelay1 + Tdelay2 with respect to the reference signal a in FIG. 13. By performing the display processing with this reference signal C as the reference, the time from imaging at each camera to video output at the controller can be realized with the shortest delay time achievable between the connected devices.
 Further, when the center server 8 performs video processing on a plurality of cameras, it can process video captured at the same shooting time, which improves the image processing accuracy.
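The timing chain of FIGS. 13 and 14 can be expressed numerically as follows. This is a sketch under stated assumptions: the concrete delay values in the usage note are illustrative, while in the embodiment Tdelay1 follows from the later of the measured camera output delays (d3, d4) and Tdelay2 from the controller-side output delay.

```python
def tdelay1(camera_output_delays):
    """Delay of the later video output timing relative to the reference
    signal S, e.g. the larger of d3 and d4 in FIG. 13."""
    return max(camera_output_delays)


def reference_s2(pulse_a, camera_output_delays):
    """S2 pulse time: the reference signal a (in phase with S) shifted
    by Tdelay1."""
    return pulse_a + tdelay1(camera_output_delays)


def reference_c(pulse_a, camera_output_delays, tdelay2):
    """Controller reference C: delayed by Tdelay1 + Tdelay2 relative to
    the reference signal a of FIG. 13."""
    return reference_s2(pulse_a, camera_output_delays) + tdelay2
```

With assumed delays d3 = 40 ms and d4 = 55 ms and an assumed Tdelay2 = 20 ms, a camera reference pulse at time t would align the image processing at the center server to t + 55 ms and the display processing at the controller to t + 75 ms, the shortest offsets at which all inputs are available.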
 In the above description, an example was shown in which the controller 10 receives the video signal from the center server 8; however, the controller 10 may also receive and display the video signals from the cameras together with the video received from the center server 8. Also, although the transmission and reception of video signals has been described, audio signals can be transmitted in the same manner.
 According to this embodiment, the series of processing from imaging at each camera through the video processing at the center server 8 to the video output at the controller 10 can be performed with the shortest achievable delay time. In addition, the video processing at the center server 8 can be applied to videos captured at the same shooting time, so that highly accurate image processing can be realized.
 Next, another example of the form of a video transmission system including cameras as video transmission devices will be described. Description of the parts that are the same as in the first embodiment is omitted.
 In the first embodiment, as illustrated in FIG. 13, an example was described in which the reference signal a and the reference signal b of the cameras are synchronized in both period and phase. In reality, however, there are cases where only the period (or frequency) of the reference signals matches between systems while the phases do not necessarily match. In this embodiment, assuming such a case, a description will be given of the case where the periods of the reference signal a and the reference signal b match but their phases do not.
 In this embodiment as well, as described above, each camera, the center server 8, and the controller 10 are provided with a mechanism for synchronizing time. By using this mechanism to synchronize the time periodically between the systems and adjusting the oscillation period of the reference signal within each system using that time, the period of the reference signal can be made to match between the systems.
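One way to adjust the oscillation period from the synchronized time, as described above, is sketched below. The proportional correction is an assumption of this sketch; the embodiment only requires that the resulting periods match between the systems, not this particular control law.

```python
def disciplined_period(nominal_period, local_elapsed, synced_elapsed):
    """Rescale the local oscillation period so that, measured against the
    synchronized time, reference pulses are emitted every `nominal_period`
    seconds.

    `local_elapsed` and `synced_elapsed` are the durations of the same
    measurement interval as read by the local clock and by the synchronized
    time. A fast local clock (local_elapsed > synced_elapsed) is given a
    proportionally longer local period, and vice versa.
    """
    if synced_elapsed <= 0 or local_elapsed <= 0:
        return nominal_period  # no usable measurement; keep the nominal value
    return nominal_period * (local_elapsed / synced_elapsed)
```

Repeating this adjustment at each periodic time synchronization keeps the frame-pulse periods of all devices matched even though their phases may still differ.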
 FIG. 15 is a diagram illustrating an example of the processing timing in this embodiment. Since each camera adjusts the oscillation period of its reference signal based on the reference time, the times of the cameras match and the periods of the reference signal a and the reference signal b match. However, their phases do not necessarily match.
 Here, when a frame pulse of the reference signal a of the camera 1a occurs at time T0 and a frame pulse of the reference signal b of the camera 4b occurs at time T1, these indicate the imaging times of the respective cameras. In this embodiment, this imaging time is stored as imaging time information at each imaging, for example in the packet header of the transport packet described above, when the video signal is compressed.
 Next, the center server 8 receives and decompresses the video signals from the cameras to obtain digital video signals, as in the first embodiment. Here, the center server 8 performs its processing based on the reference signal S. The phase relationship between the reference signal of each camera and the reference signal S of the center server 8 is obtained by comparison with the imaging time information stored in the video signal transmitted from each camera; it is thus found that the time phase difference between the center server 8 and the camera 1a is d6 and the time phase difference between the center server 8 and the camera 4b is d7. Therefore, the image processing circuit 804 generates a-1' of (3-4'), which is the video at a time delayed by the time phase difference d6 from the video a-1 shown in (3-4). This can be realized, for example, by obtaining the video a-1' from the video a-0 and the video a-1 through motion prediction and interpolation processing of the video.
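As a simplified stand-in for the motion prediction and interpolation described above, a per-pixel linear blend between two frames can be written as follows. This is only an illustrative assumption: the embodiment may use motion-compensated prediction rather than a weighted average, and representing a frame as a flat list of pixel values is a simplification of this sketch.

```python
def interpolate_frame(frame_t0, frame_t1, t0, t1, target_t):
    """Generate a frame at `target_t` (t0 <= target_t <= t1) from frames
    captured at t0 and t1, e.g. a frame shifted by the time phase
    difference d6 relative to the camera's own frame pulses."""
    w = (target_t - t0) / (t1 - t0)  # blend weight toward the later frame
    return [(1.0 - w) * p0 + w * p1 for p0, p1 in zip(frame_t0, frame_t1)]
```

The same routine, applied with the phase difference d7, would produce the corresponding intermediate frame for the camera 4b.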
 Similarly, for the video of the camera 4b, the video b-1' at the time when the time phase difference d7 has elapsed from the time T1 is obtained from the videos b-0 and b-1. By performing the subsequent processing on the obtained videos a-1' and b-1' as videos captured simultaneously at time T3, the image processing accuracy can be maintained.
 In this embodiment, an example has been described in which a-1' is obtained (generated) from a-0 and a-1, and b-1' from b-0 and b-1; however, a-1' and b-1' may also be generated using three or more frames, or from only one frame.
 In addition, when the image processing at the center server is expected to incur a larger processing load and an increased processing delay, it is also possible to maintain the processing performance by dynamically increasing the number of CPUs (Central Processing Units) allocated to the image processing.
 According to the second embodiment, even when the phases of the reference signals do not match, the series of processing from imaging at each camera through the video processing at the center server 8 to the video output at the controller 10 can be performed with the shortest achievable delay time. In addition, the video processing at the center server 8 can be applied to videos captured at almost the same shooting time, so that image processing can be realized while maintaining high accuracy.
 As described above, according to these embodiments, by following this series of control procedures, it is possible to construct a system in which the time from imaging to video output is processed with the shortest delay time achievable between the connected devices, and in which video captured at the same shooting time can be processed, improving the image processing accuracy.
 Although these embodiments describe processing for synchronizing video captured by cameras, they are also applicable to devices other than cameras. For example, temperature information and humidity information sensed by a plurality of temperature sensors and a plurality of humidity sensors can be temporally synchronized and displayed on one display.
 1a, 1b, 1c, 4a, 4b, 4c: camera; 2, 5: LAN; 3, 6, 9, 10: reference time server; 7: WAN; 8: center server; 10: controller; 12: display; 100: lens; 101: imaging device; 102: video compression circuit; 103: video buffer; 104: system encoder; 105: packet buffer; 106: reference signal generation circuit; 107: LAN interface circuit; 108: control circuit; 201: intra frame; 202: inter frame; 301: sequence header; 302: picture header; 40: transport packet; 401: packet header; 402: packet information; 801: LAN interface circuit; 8021, 8022, 8023: system decoder; 8031, 8032, 8033: video decompression circuit; 804: image processing circuit; 806: reference signal generation circuit; 807: control circuit; 901: LAN interface circuit; 9021, 9022, 9023: system decoder; 9031, 9032, 9033: video decompression circuit; 904: image processing circuit; 905: OSD circuit; 906: reference signal generation circuit; 907: control circuit

Claims (16)

  1.  A video processing device comprising:
     an interface that transmits and receives information via a network; and
     a video processing unit that processes video information received by the interface,
     wherein, when a plurality of pieces of video information are received by the interface, the video processing unit processes the plurality of pieces of video information based on delay times including transmission delay times of the plurality of pieces of video information.
  2.  The video processing device according to claim 1,
     wherein the video processing unit processes the plurality of pieces of video information based on the longest delay time among the delay times of the plurality of pieces of video information.
  3.  The video processing device according to claim 1 or 2,
     wherein the video processing unit includes a reference signal generation circuit that generates a reference signal used for processing the video information, and
     the reference signal generation circuit controls the phase of the reference signal based on the delay time.
  4.  The video processing device according to any one of claims 1 to 3,
     wherein the delay time is calculated using information on the generation time of the video information added to the video information.
  5.  A video transmission device comprising:
     a reference signal generation circuit that generates a reference signal based on time information;
     an imaging unit that captures video information based on the reference signal generated by the reference signal generation circuit;
     a video processing unit that processes the video information captured by the imaging unit; and
     an interface that transmits and receives information via a network,
     wherein the reference signal generation circuit controls the phase of the reference signal to be generated based on phase information received by the interface.
  6.  A video processing device comprising:
     a reference signal generation circuit that generates a reference signal based on time information;
     an interface that transmits and receives information via a network; and
     a video processing unit that processes video information received by the interface,
     wherein phase information of the reference signal with respect to the time information is transmitted from the interface to one or a plurality of devices connected to the network.
  7.  A video reception device comprising:
     a reference signal generation circuit that generates a reference signal based on time information;
     an interface that transmits and receives information via a network;
     a video processing unit that processes video information received by the interface; and
     a display processing unit that performs processing for displaying the video information processed by the video processing unit,
     wherein the phase of the reference signal generated by the reference signal generation circuit is controlled based on a delay time including a transmission delay time of the video information received by the interface.
  8.  The video transmission device according to claim 5, wherein the generation time of the reference signal generated by the reference signal generation circuit, determined from the time information, is acquired, added to the video information processed by the video processing unit, and transmitted from the interface.
  9.  A video processing device comprising:
     an interface that transmits and receives information via a network; and
     a video processing unit that processes video information received by the interface,
     wherein the video processing unit includes a reference signal generation circuit that generates a reference signal used for processing the video information, and
     the video processing unit generates new video information from the video information received by the interface, based on the phase difference between the reference signal of the device that generated the received video information and the reference signal generated by the reference signal generation circuit.
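One way to picture claim 9's generation of new video information is frame resampling: the receiver synthesizes a frame aligned to its own reference signal from the two received frames that straddle it. The linear blend below, and all function and variable names, are illustrative assumptions for this sketch, not the method the patent specifies.

```python
# Illustrative sketch: build a new frame aligned to the local reference
# signal from two consecutive received frames, using the phase difference
# between the sender's and receiver's reference signals.

def resample_frame(prev_frame, next_frame, phase_fraction):
    """Blend two consecutive received frames; phase_fraction in [0, 1) is
    where the local reference signal falls between them."""
    return [(1.0 - phase_fraction) * p + phase_fraction * n
            for p, n in zip(prev_frame, next_frame)]

# Local reference fires a quarter of a frame period after the sender's.
new_frame = resample_frame([0.0, 100.0], [100.0, 200.0], 0.25)
```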
  10.  A video processing method comprising:
     a reception step of receiving video information via a network; and
     a video processing step of processing the received video information,
     wherein, when a plurality of pieces of video information are received in the reception step, the video processing step processes the plurality of pieces of video information based on delay times that include their transmission delay times.
  11.  The video processing method according to claim 10, wherein the video processing step processes the plurality of pieces of video information based on the longest of their delay times.
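A minimal sketch of claim 11's idea: when several streams arrive with different delays, align them all to the slowest one by giving each faster stream extra buffering. The extra-buffering formulation and the names here are assumptions made for illustration.

```python
# Equalize several video streams to the longest measured delay so that
# frames captured at the same instant are processed together.

def extra_buffering(delays_ms):
    """Map each stream to the additional delay (ms) that equalizes it
    with the slowest stream."""
    longest = max(delays_ms.values())
    return {name: longest - d for name, d in delays_ms.items()}

buffers = extra_buffering({"cam_a": 40.0, "cam_b": 65.0, "cam_c": 52.5})
# cam_b is the slowest stream, so it needs no extra buffering
```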
  12.  The video processing method according to claim 10 or 11, further comprising a reference signal generation step of generating a reference signal whose phase is controlled based on the delay time,
     wherein the video processing step processes video information based on the reference signal generated in the reference signal generation step.
  13.  The video processing method according to any one of claims 10 to 12, wherein the delay time is calculated using generation-time information added to the video information.
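Claim 13's delay calculation can be sketched as follows: the sender attaches the frame's generation time, read from a clock shared with the receiver (e.g. one synchronized over the network), and the receiver subtracts it from the arrival time. Integer microseconds are used here to avoid floating-point drift; the function and parameter names are assumptions.

```python
# Compute a frame's delay from the generation timestamp carried with it.
# Both timestamps are assumed to come from a common, synchronized clock.

def delay_us(generation_time_us, arrival_time_us):
    """Transmission (plus processing) delay of one frame in microseconds."""
    return arrival_time_us - generation_time_us

# frame generated at t = 1.000000 s, received at t = 1.045000 s
d = delay_us(1_000_000, 1_045_000)
```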
  14.  A video transmission method comprising:
     a reception step of receiving phase information;
     a reference signal generation step of generating a reference signal based on time information;
     an imaging step of capturing video information based on the reference signal generated in the reference signal generation step;
     a video processing step of processing the video information captured in the imaging step; and
     a transmission step of transmitting the video information processed in the video processing step via a network,
     wherein the reference signal generation step controls the phase of the reference signal to be generated based on the phase information received in the reception step.
  15.  A video processing method comprising:
     a reference signal generation step of generating a reference signal based on time information;
     a reception step of receiving video information via a network; and
     a video processing step of processing the video information received in the reception step,
     wherein phase information of the reference signal with respect to the time information is transmitted to another device via the network.
  16.  A video reception method comprising:
     a reference signal generation step of generating a reference signal based on time information;
     a reception step of receiving video information;
     a video processing step of processing the video information received in the reception step; and
     a display processing step of performing processing for displaying the video information processed in the video processing step,
     wherein the phase of the reference signal generated in the reference signal generation step is controlled based on a delay time that includes the transmission delay time of the video information received in the reception step.
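The phase control of claim 16 can be sketched as the receiver retarding its reference signal by the measured delay, modulo the frame period, so that received frames arrive just in time for display. The formulation and names below are illustrative assumptions, not the patent's exact control law.

```python
# Shift the receiver's reference-signal phase by the measured delay,
# wrapped to one frame period.

def reference_phase_us(nominal_phase_us, delay_us, frame_period_us):
    """Phase (microseconds into the frame period) at which the receiver's
    reference signal should fire."""
    return (nominal_phase_us + delay_us) % frame_period_us

# 50 Hz reference signal (20 ms period), 45 ms measured delay
phase = reference_phase_us(0, 45_000, 20_000)
```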
PCT/JP2012/077824 2012-10-29 2012-10-29 Video processing device, video transmission device, video reception device, video processing method, video transmission method, and video reception method WO2014068629A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/077824 WO2014068629A1 (en) 2012-10-29 2012-10-29 Video processing device, video transmission device, video reception device, video processing method, video transmission method, and video reception method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/077824 WO2014068629A1 (en) 2012-10-29 2012-10-29 Video processing device, video transmission device, video reception device, video processing method, video transmission method, and video reception method

Publications (1)

Publication Number Publication Date
WO2014068629A1 true WO2014068629A1 (en) 2014-05-08

Family

ID=50626610

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/077824 WO2014068629A1 (en) 2012-10-29 2012-10-29 Video processing device, video transmission device, video reception device, video processing method, video transmission method, and video reception method

Country Status (1)

Country Link
WO (1) WO2014068629A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005252344A (en) * 2004-03-01 2005-09-15 Nippon Telegr & Teleph Corp <Ntt> Quality improving method in multipoint communication and multipoint communication terminal, and quality improving program in inter-multipoint communication
JP2006250638A (en) * 2005-03-09 2006-09-21 Matsushita Electric Ind Co Ltd Video camera provided with clock synchronization function
JP2009152733A (en) * 2007-12-19 2009-07-09 Nec Corp Person specifying system, person specifying device, person specifying method, and person specifying program
JP2010197320A (en) * 2009-02-27 2010-09-09 Sony Corp Slave device, time synchronization method of the same, master device, and electronic equipment system


Similar Documents

Publication Publication Date Title
JP5697743B2 (en) Video transmission device, video transmission method, video reception device, and video reception method
US9723193B2 (en) Transmitting device, receiving system, communication system, transmission method, reception method, and program
US8745432B2 (en) Delay controller, control method, and communication system
US10038827B2 (en) Semiconductor device, electronic device module and network system
JP2013134119A (en) Transmitter, transmission method, receiver, reception method, synchronous transmission system, synchronous transmission method, and program
US20160366431A1 (en) Video decoding device and video decoding method
JP2011234341A (en) Receiving apparatus and camera system
US20150163003A1 (en) Communication apparatus, communication system, communication controlling method, and program
US9807282B2 (en) Synchronous camera
JP2013058986A (en) Communication system, transmission device, reception device, transmission method, reception method, and program
WO2020017499A1 (en) Video/audio transmission system, transmission method, transmission device, and reception device
WO2014068629A1 (en) Video processing device, video transmission device, video reception device, video processing method, video transmission method, and video reception method
JP2020005063A (en) Processing device and control method thereof, output device, synchronization control system, and program
JP5697494B2 (en) Video transmission device, video transmission method, video reception device, and video reception method
US20220360845A1 (en) Reception apparatus, reception method, and transmission and reception system
JP2015149761A (en) Encoded signal transmission device
US9912427B2 (en) Reception apparatus and system
JP2012195794A (en) Encoded signal reception device
JPWO2016017266A1 (en) Video information reproducing apparatus and reproducing method
GB2595879A (en) Method for controlling an image capture device
JP2012195796A (en) Encoded signal transmission device
JP2012195795A (en) Network camera system
JP6335775B2 (en) Media receiver
JP2004289544A (en) Network connection machine and time stamp processing method using therefor
JP2012191284A (en) Image transmitter, image transmission method, image receiver, and image reception method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12887752

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12887752

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP