WO2022176109A1 - Video synthesizing device and video synthesizing method - Google Patents
Video synthesizing device and video synthesizing method
- Publication number
- WO2022176109A1 (PCT/JP2021/006139)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- frame rate
- synthesizing
- timing
- frame
- Prior art date
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2624—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of whole input images, e.g. splitscreen
Definitions
- the present invention relates to a video synthesizing device and video synthesizing method for synthesizing videos from a plurality of cameras into one screen.
- A video device transmits one screen in a time equal to the reciprocal of the frame rate. For example, in the case of a video signal of 60 frames per second (hereinafter, 60 fps (frames per second)), 1/60 s, that is, about 16.7 ms, is taken to transmit one screen of video.
- FIG. 1 shows a video signal for one frame.
- 51 is a video signal for one frame
- 52 is blanking
- 53 is a scanning line
- 54 is a display screen.
- the screen is scanned horizontally line by line like a scanning line 53, and the scanning line 53 advances downward sequentially.
- This scanning covers not only the display screen 54 but also the blanking 52, which carries overhead information and signals.
- the blanking 52 may include information other than video information, such as control information and audio information (see Non-Patent Document 1, for example).
- FIG. 2 shows a mode in which images from a plurality of cameras are displayed on fewer monitors than the number of cameras.
- 200 is a video synthesizing device
- 20 is a camera
- 22 is a monitor. Images from the four cameras 20 are synthesized into one screen by the image synthesizing device 200 and displayed on the monitor 22.
- FIG. 3 is a timing chart of video synthesis in which four video images at different timings are input, synthesized into one screen, and output.
- Tf the frame time
- Tp the synthesis processing time
- the maximum delay time from the first video input to the video output is 2Tf+Tp.
- the combined video will include a delay of two frame times or more, that is, about 33.3 ms or more at 60 fps.
- For timing-sensitive applications, for example a performance at 120 BPM (Beats Per Minute), the delay associated with this video synthesis greatly impairs feasibility.
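As a numerical check of the delay bound above, the worst-case figure 2Tf + Tp can be reproduced with a short sketch (the 1 ms processing time Tp is an illustrative assumption, not a value from the disclosure):

```python
# Worst-case delay from first video input to synthesized output when the
# inputs are not synchronized (the Fig. 3 scenario): 2*Tf + Tp.

def max_delay_unsynced(frame_time_s: float, processing_time_s: float) -> float:
    """Maximum delay: two frame times plus the synthesis processing time."""
    return 2 * frame_time_s + processing_time_s

tf = 1 / 60    # frame time of a 60 fps signal, ~16.7 ms
tp = 0.001     # assumed synthesis processing time (1 ms, illustrative)

delay = max_delay_unsynced(tf, tp)
print(f"max delay: {delay * 1000:.1f} ms")  # prints "max delay: 34.3 ms"
```

Two frame times alone already amount to about 33.3 ms, which is where the feasibility concern for latency-sensitive uses comes from.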
- a trigger is given from an image synthesizing device or an external device to each camera to instruct appropriate imaging timing so that the frame timings of the images from each camera are aligned (for example, see Non-Patent Document 2).
- Using the trigger mode of the GenICam standard, it is possible to make a camera capture an image at a desired timing by supplying an electrical trigger as a rectangular wave to a GigE camera or the like.
- FIG. 4 is a diagram for explaining a form in which images from a plurality of cameras are displayed on fewer monitors than the number of cameras.
- 210 is a video synthesizing device
- 20 is a camera
- 22 is a monitor.
- Each camera 20 takes an image in accordance with an imaging trigger from the video synthesizing device 210 .
- the video from each camera 20 is synthesized into one screen by the video synthesizing device 210 and output to the monitor 22 .
- FIG. 5 shows a timing chart of video synthesis. Assuming that the frame time is Tf and the processing time is Tp, the maximum delay time from the first video input to the video output is Tf+Tp.
- FIG. 6 shows a form in which the method of Non-Patent Document 2 is applied to a video conference system that connects remote locations.
- FIG. 6 is a diagram for explaining a form in which images from a plurality of cameras are displayed on fewer monitors than the number of cameras.
- 210 is a video synthesizing device
- 20 is a camera
- 21 is a communication network
- 22 is a monitor.
- a communication network 21 for transmitting signals is interposed between the cameras 20 and the image synthesizing device 210.
- the trigger signal timing is disturbed by transmission delay fluctuations in the communication network.
- FIG. 7 shows a timing chart of video synthesis. Assuming that the frame time is Tf, the processing time is Tp, and the additional delay is 2 ⁇ t, the maximum delay time from the first image input to the image output is Tf+Tp+2 ⁇ t.
- Even with the method of Non-Patent Document 2, a large delay from the input of the multiple videos to the output of the synthesized video cannot be avoided when synthesizing videos from multiple sites via a communication network. There is therefore a need to reduce the time delay from the input of the multiple videos to the output of the composite video.
- the invention of the present disclosure aims to reduce the time delay from the input of multiple videos to the output of the composite video.
- the frequency control of the frame rate of each camera is performed so that the timings of the videos from a plurality of cameras match.
- the video synthesizer of the present disclosure, when synthesizing videos from a plurality of cameras on one screen, detects a time lag between the frame timing of each video and a predetermined timing, instructs each camera to capture at a frame rate such that the time lag decreases, and synthesizes the videos from the plurality of cameras on one screen and outputs the result.
- the video synthesizer of the present disclosure includes:
- the instructed frame rate is characterized in that it is a value offset by a fixed amount from the video synthesis frame rate used to synthesize the videos.
- the video synthesizer of the present disclosure includes:
- the instructed frame rate is characterized in that it is a value offset from the video synthesis frame rate by an amount corresponding to the time lag.
- the video synthesizer of the present disclosure includes: The frame rate is instructed periodically.
- the video synthesizer of the present disclosure includes: The instructed frame rate is fixed if the time lag is equal to or less than a predetermined value.
- the video synthesizer of the present disclosure includes:
- the predetermined timing is characterized in that it is a synthesis processing start timing.
- the video synthesizer of the present disclosure includes:
- the synthesis processing start timing is characterized by being an average value of the frame end timings of the images from the plurality of cameras.
- the frequency control of the frame rate of each camera is performed so that the timings of the videos from the multiple cameras match.
- the video synthesis method of the present disclosure, when synthesizing videos from a plurality of cameras on one screen, detects a time lag between the frame timing of each video and a predetermined timing, instructs each camera to capture at a frame rate such that the time lag decreases, and synthesizes the videos from the plurality of cameras on one screen and outputs the result.
- With the video synthesizing device or video synthesizing method of the present disclosure, it is possible to reduce the delay from the input of multiple videos to the output of the synthesized video.
- FIG. 8 is a diagram illustrating an aspect of the present disclosure in which images from multiple cameras are displayed on fewer monitors than the number of cameras. FIGS. 9 and 10 are timing charts of video synthesis. FIG. 11 is a diagram explaining the structure of a video synthesizing device of the present disclosure.
- the image synthesizing device of the present disclosure, when synthesizing images from a plurality of cameras into one screen, performs frequency control of the frame rate of each camera so as to reduce the time lag between the timings of the plurality of videos and a predetermined timing.
- the frequency control of the frame rate captured by the camera can be executed using a control interface such as GenICam, for example, for cameras compatible with GigE Vision and USB3 Vision.
- For a camera compatible with HDMI, the usable frequencies are limited to a discrete set, so the method can be applied by selecting the resolution and frame rate accordingly.
- GigE Vision is a standard formulated by the AIA (Automated Imaging Association) for transmitting camera control and captured video signals to a personal computer or the like over Ethernet.
- USB3 Vision is a standard formulated by the AIA for transferring video data from a camera to a user buffer.
- GenICam is a software interface standard formulated by EMVA (European Machine Vision Association) for end-to-end setting of a wide range of standard interfaces regardless of camera type or video transmission format.
- HDMI (High Definition Multimedia Interface) is a transmission standard formulated by seven companies for AV equipment.
- FIG. 8 shows a form of the present disclosure in which images from a plurality of cameras are displayed on fewer monitors than the number of cameras.
- 100 is a video synthesizer of the present disclosure
- 20 is a camera
- 21 is a communication network
- 22 is a monitor.
- the image synthesizing device 100 synthesizes the images input from the plurality of cameras 20 via the communication network 21 into one screen, and outputs the synthesized image to the monitor 22 .
- the video synthesizer 100 has four input channels, but any number of inputs may be used.
- FIG. 9 shows a timing chart for synthesizing images by the image synthesizing device 100 of the present disclosure.
- FIG. 9 illustrates four input channels, but is not limited to these numbers.
- "i, k frame" represents the k-th frame of the video input to the i-th input channel; the same notation applies hereafter. If the timing of the k-th frame of the video input to the i-th input channel does not match a predetermined timing (for example, the timing at which the synthesis processing is to be started in FIG. 9), the time lag is detected ((1) in FIG. 9), and the camera 20 connected to the i-th input channel is instructed to capture at a frame rate slightly different from the frame rate output by the video synthesizer 100 ((2) in FIG. 9). As a result, the time lag gradually decreases from "i, k+1 frame" to "i, k+2 frame".
- the video synthesizing device 100 has a video synthesis frame rate for synthesizing video according to the standard frame rate. For example, the video synthesizing device 100 sets the video synthesis frame rate to 120 fps for a group of cameras whose nominal standard frame rate is 120 fps. In FIG. 9, when the video synthesizing device 100 detects that the timing of the k-th frame of the video input to the i-th input channel lags behind the desired synthesis processing start timing, it instructs the camera 20 connected to the i-th input channel to capture at a frame rate of (120+Δf) fps, which is offset by a fixed value from the video synthesis frame rate.
- Conversely, when the video synthesizing device 100 detects that the timing of the k-th frame of the video input to the i-th input channel is ahead of the desired synthesis processing start timing, it instructs the camera 20 connected to the i-th input channel to capture at a frame rate of (120−Δf) fps, again offset by a fixed value from the video synthesis frame rate.
- Here, it is assumed that the delay or advance of the frame timing is less than 1/2 of the frame length.
- In the above, the instructed frame rate is offset from the video synthesis frame rate by a constant value, but it may instead be offset by a variable value. For example, a frame rate may be instructed whose offset from the video synthesis frame rate depends on the time lag between the timing of the k-th frame of the video input to the i-th input channel and the timing at which the synthesis processing is to be started. Alternatively, if the time lag is greater than a predetermined value, a frame rate offset by Δf from the video synthesis frame rate may be instructed, and if the time lag is smaller than the predetermined value, a frame rate offset by (1/2)Δf may be instructed.
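The instruction rule described above can be sketched as follows; the synthesis frame rate of 120 fps, the offset Δf = 1 fps, and the threshold value are illustrative, and the function name is not from the disclosure:

```python
# Sketch of the frame-rate instruction rule: instruct f0 + df when a frame
# arrives late, f0 - df when it arrives early, and fix the rate at f0 once
# the lag is within the threshold.

F0 = 120.0       # video synthesis frame rate (fps), illustrative
DELTA_F = 1.0    # fixed offset from the synthesis frame rate (fps)

def instructed_frame_rate(time_lag_s: float, threshold_s: float = 0.0035) -> float:
    """time_lag_s > 0 means the frame timing lags the synthesis start timing."""
    if abs(time_lag_s) <= threshold_s:
        return F0                      # lag small enough: fix the rate
    return F0 + DELTA_F if time_lag_s > 0 else F0 - DELTA_F

print(instructed_frame_rate(0.004))    # late by 4 ms  -> 121.0
print(instructed_frame_rate(-0.004))   # early by 4 ms -> 119.0
print(instructed_frame_rate(0.001))    # within band   -> 120.0
```

A variable-offset variant, as the text notes, would scale DELTA_F with the magnitude of the lag instead of using a fixed value.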
- FIG. 10 shows a timing chart for synthesizing images by the image synthesizing device 100 of the present disclosure.
- the video synthesizer 100 may detect the time lag between the timing of the k-th frame of the video input to the i-th input channel and the predetermined timing for every frame (constantly executing (1) in FIG. 10), or may detect it periodically at a constant cycle (executing (1) in FIG. 10 every several frames).
- Likewise, the video synthesizer 100 may instruct the frame rate to the camera 20 connected to the i-th input channel for every frame (constantly executing (2) in FIG. 10), or periodically (executing (2) in FIG. 10 every several frames).
- If the time lag is still larger than a certain value after the frame rate has been instructed, the video synthesizer 100 may instruct the camera 20 with a new frame rate (upper part of (3) in FIG. 10). If the time lag between the timing of the (k+n+m)-th frame of the video input to the i-th input channel and the predetermined timing is equal to or less than the certain value, the instructed frame rate may be fixed to the synthesis frame rate (lower part of (3) in FIG. 10), or no new frame rate may be instructed.
- Once the instructed frame rate is fixed, or no new frame rate is instructed, that is, after the adjustment of the frame rate for imaging by the camera 20 is completed, no further control is needed as long as the settings of the related devices and the characteristics of the communication network do not change, so the amount of control traffic generated can be minimized.
- the video synthesizing device and video synthesizing method of the present disclosure control only the frame rate, without controlling the imaging timing of the camera. Even if the instruction of the frame rate to the camera is delayed, this only prolongs the time required until the frame rate is fixed. If the frame rate continues to be instructed after the time lag has been reduced, the control eventually becomes excessive and the time lag begins to grow in the opposite direction, so it suffices to issue instructions before that happens. For example, if the timing lag is to be kept at 3.5 ms or less and an imaging frame rate of 121 fps is instructed against a synthesis processing frame rate of 120 fps, the lag that can be compensated per frame is 1/120 − 1/121 s, or about 0.07 ms.
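The per-frame compensation in this example can be checked numerically (a minimal sketch; 121 fps against 120 fps and the 3.5 ms target are taken from the example above):

```python
import math

# Instructing 121 fps against a 120 fps synthesis rate shortens each frame
# period by 1/120 - 1/121 s, which is the timing lag recovered per frame.

def compensation_per_frame_s(f_instructed: float, f0: float) -> float:
    return abs(1 / f0 - 1 / f_instructed)

step = compensation_per_frame_s(121.0, 120.0)
print(f"compensation per frame: {step * 1000:.3f} ms")  # prints "0.069 ms"

# Frames needed to pull a 3.5 ms lag back within bounds at that rate:
frames = math.ceil(0.0035 / step)
print(frames)  # 51
```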
- FIG. 11 shows the configuration of the video synthesizing device of the present disclosure.
- 100 is a video synthesizer
- 101 is a time lag detection circuit
- 102 is a frame rate calculation circuit
- 103 is a crossbar switch
- 104 is an up/down converter
- 105 is a buffer
- 106 is a pixel synthesis circuit
- 20 is a camera
- 21 is a communication network
- 22 is a monitor.
- Although the video synthesizer 100 has four inputs in FIG. 11, any number of inputs may be used.
- the time lag detection circuit 101 detects the time lag between the timing of the video frame from the camera 20 and a predetermined timing.
- the frame rate calculation circuit 102 calculates the frame rate at which the camera 20 captures images so that the time lag detected by the time lag detection circuit 101 is reduced, and instructs the camera 20 of the calculated frame rate.
- the crossbar switch 103 rearranges video inputs in arbitrary order and outputs them.
- the time lag detection circuit 101 may have a function of instructing rearrangement.
- the up/down converter 104 scales the number of pixels of the image to an arbitrary size.
- the crossbar switch 103 and the up/down converter 104 may be connected to the inputs in the order opposite to that in FIG.
- a buffer 105 buffers the input video.
- the buffer 105 may have a function of arbitrarily changing the order of the video to be output.
- the pixel synthesizing circuit 106 reads the video from the buffer 105 and outputs it.
- the pixel synthesizing circuit 106 may have a function of adding an arbitrary control signal to the blanking portion of the screen.
- the predetermined timing that serves as the base point for the time lag detected by the time lag detection circuit 101 may be the synthesis processing start timing of the video synthesizing device 100 .
- Control when the end timing of the k-th frame of the video imaged by the i-th camera is time-shifted from the synthesis processing start timing of the video synthesizing device 100 will be described with reference to FIG. 12. In FIG. 12, the time lag detection circuit 101 records the synthesis processing start timing t1, records the end timing t2 of the k-th frame of the video imaged by the i-th camera, and detects the time shift from the synthesis processing start timing t1.
- Alternatively, the average value of the frame end timings of the images from the plurality of cameras may be used.
- A GigE camera records a time stamp of the image capture timing in the video. If the fluctuation of the time-stamp differences recorded in the videos acquired from the multiple cameras is large, the average value of the frame end timings of the videos from the multiple cameras may be derived according to the following equation.
- Average value of frame end timing = (1/N) * Σ (t2(k) − t1(k))
- where t1(n) = t1(0) + n * (1/f0)
- N is the number of videos used to derive the average of the frame end timings
- t1(k) is the synthesis processing start timing for the k-th frame
- t2(k) is the end timing of the k-th frame
- f0 is the video synthesis frame rate
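The averaging above might be computed as in the following sketch, assuming t1(k) advances by one synthesis frame period 1/f0 per frame; the jitter values are illustrative:

```python
# Average of (t2(k) - t1(k)) over the frames, with t1(k) = t1(0) + k/f0.

def average_frame_end_lag(t2: list, t1_0: float, f0: float) -> float:
    """Mean offset of frame end timings from the synthesis start timings."""
    return sum(t2_k - (t1_0 + k / f0) for k, t2_k in enumerate(t2)) / len(t2)

f0 = 120.0   # video synthesis frame rate (fps)
t1_0 = 0.0   # first synthesis processing start timing (s)

# Frame end timings jittering around 2 ms after each synthesis start:
t2 = [t1_0 + k / f0 + lag for k, lag in enumerate([0.0019, 0.0021, 0.0020])]
print(f"{average_frame_end_lag(t2, t1_0, f0) * 1000:.2f} ms")  # prints "2.00 ms"
```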
- the frame rate calculation circuit 102 calculates the frame rate f so as to reduce the time lag.
- FIG. 13 shows an example of the calculated frame rate control function.
- the video synthesizing device and video synthesizing method of the present disclosure can minimize the time lag (t2 − t1).
- When the time lag (t2 − t1) is still large, the frame rate calculation circuit 102 recalculates and determines the frame rate f. As shown in FIG. 15, when the time lag (t2 − t1) has become small, the frame rate calculation circuit 102 may fix the frame rate f or may recalculate it.
- When the instructed frame rate f is set to a value offset from the synthesis processing frame rate f0 by a constant value, the difference (f − f0) and the time lag (t2 − t1) may be used to calculate the expected time T at which the time lag becomes minimal, T = (t2 − t1) * (1/f0) / |1/f − 1/f0|.
- the frame rate calculation circuit 102 may instruct a constant frame rate until the expected time T elapses, and then recalculate the frame rate after the expected time T elapses.
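Using the relation for T given in the description, T = (t2 − t1) * (1/f0) / |1/f − 1/f0|, a small sketch (the lag and frame-rate values are illustrative):

```python
# Expected time until the time lag is minimized: the number of frames needed,
# (t2 - t1) / |1/f - 1/f0|, times the synthesis frame period 1/f0.

def expected_time_s(lag_s: float, f: float, f0: float) -> float:
    return lag_s * (1 / f0) / abs(1 / f - 1 / f0)

# 3.5 ms lag, instructing 121 fps against a 120 fps synthesis rate:
T = expected_time_s(0.0035, 121.0, 120.0)
print(f"T = {T:.3f} s")
```

With these values T works out to about 0.42 s, i.e. roughly 51 frames at 120 fps.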
- the frame rate calculation circuit 102 instructs each camera 20 of the determined frame rate. Each camera 20 takes an image at the instructed frame rate.
- the video synthesizing device and video synthesizing method of the present disclosure can reduce the time delay from the input of a plurality of videos to the output of the synthesized video.
- According to the present disclosure, the time delay from the input of the plurality of videos to the output of the composite video can be reduced.
- This disclosure can be applied to the information and communications industry.
- 100: Video synthesizer, 101: Time lag detection circuit, 102: Frame rate calculation circuit, 103: Crossbar switch, 104: Up/down converter, 105: Buffer, 106: Pixel synthesis circuit, 200: Video synthesizer, 210: Video synthesizer, 20: Camera, 21: Communication network, 22: Monitor, 51: Video signal for one frame, 52: Blanking, 53: Scanning line, 54: Display screen
Abstract
Description
When synthesizing videos from a plurality of cameras on one screen, a time lag between the frame timing of each video and a predetermined timing is detected,
the cameras are instructed to capture at a frame rate such that the time lag decreases, and
the videos from the plurality of cameras are synthesized on one screen and output.
The instructed frame rate is characterized in that it is a value offset by a fixed amount from the video synthesis frame rate used to synthesize the videos.
The instructed frame rate is characterized in that it is a value offset from the video synthesis frame rate by an amount corresponding to the time lag.
The frame rate is instructed periodically.
If the time lag is equal to or less than a fixed value, the instructed frame rate is fixed.
The predetermined timing is characterized in that it is a synthesis processing start timing.
The synthesis processing start timing is characterized in that it is an average value of the frame end timings of the videos from the plurality of cameras.
When synthesizing videos from a plurality of cameras on one screen, a time lag between the frame timing of each video and a predetermined timing is detected,
the cameras are instructed to capture at a frame rate such that the time lag decreases, and
the videos from the plurality of cameras are synthesized on one screen and output.
USBVision is a standard formulated by the AIA for transferring video data from a camera to a user buffer.
GenICam is a software interface standard formulated by the EMVA (European Machine Vision Association) for performing end-to-end configuration of a wide range of standard interfaces, regardless of camera type or video transmission format.
HDMI (High Definition Multimedia Interface) is a transmission standard formulated by seven companies for AV equipment.
Average value of frame end timing = (1/N)*Σ(t2(k)−t1(k))
where t1(n) = t1(0) + n*(1/f0)
N is the number of videos used to derive the average of the frame end timings
t1(k) is the synthesis processing start timing for the k-th frame
t2(k) is the end timing of the k-th frame
f0 is the video synthesis frame rate
T=(t2-t1)*(1/f0)/|1/f-1/f0|
The frame rate calculation circuit 102 may instruct a constant frame rate until the expected time T elapses, and may recalculate the frame rate after the expected time T has elapsed.
101: Time lag detection circuit
102: Frame rate calculation circuit
103: Crossbar switch
104: Up/down converter
105: Buffer
106: Pixel synthesis circuit
200: Video synthesizing device
210: Video synthesizing device
20: Camera
21: Communication network
22: Monitor
51: Video signal for one frame
52: Blanking
53: Scanning line
54: Display screen
Claims (8)
- A video synthesizing device that, when synthesizing videos from a plurality of cameras on one screen, detects a time lag between the frame timing of each video and a predetermined timing,
instructs the cameras to capture at a frame rate such that the time lag decreases, and
synthesizes the videos from the plurality of cameras on one screen and outputs them. - The video synthesizing device according to claim 1, wherein the instructed frame rate is a value offset by a fixed amount from the video synthesis frame rate used to synthesize the videos.
- The video synthesizing device according to claim 1, wherein the instructed frame rate is a value offset from the video synthesis frame rate by an amount corresponding to the time lag.
- The video synthesizing device according to any one of claims 1 to 3, wherein the frame rate is instructed periodically.
- The video synthesizing device according to any one of claims 1 to 4, wherein the instructed frame rate is fixed if the time lag is equal to or less than a fixed value.
- The video synthesizing device according to any one of claims 1 to 5, wherein the predetermined timing is a synthesis processing start timing.
- The video synthesizing device according to claim 6, wherein the synthesis processing start timing is an average value of the frame end timings of the videos from the plurality of cameras.
- A video synthesizing method that, when synthesizing videos from a plurality of cameras on one screen, detects a time lag between the frame timing of each video and a predetermined timing,
instructs the cameras to capture at a frame rate such that the time lag decreases, and
synthesizes the videos from the plurality of cameras on one screen and outputs them.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023500223A JP7480908B2 (ja) | 2021-02-18 | 2021-02-18 | 映像合成装置及び映像合成方法 |
PCT/JP2021/006139 WO2022176109A1 (ja) | 2021-02-18 | 2021-02-18 | 映像合成装置及び映像合成方法 |
US18/276,225 US20240121505A1 (en) | 2021-02-18 | 2021-02-18 | Image synthesis apparatus and image synthesis method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/006139 WO2022176109A1 (ja) | 2021-02-18 | 2021-02-18 | 映像合成装置及び映像合成方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022176109A1 true WO2022176109A1 (ja) | 2022-08-25 |
Family
ID=82930323
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/006139 WO2022176109A1 (ja) | 2021-02-18 | 2021-02-18 | 映像合成装置及び映像合成方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240121505A1 (ja) |
JP (1) | JP7480908B2 (ja) |
WO (1) | WO2022176109A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120092443A1 (en) * | 2010-10-14 | 2012-04-19 | Cisco Technology, Inc. | Network Synchronization Video for Composite Video Streams |
JP2018207279A (ja) * | 2017-06-02 | 2018-12-27 | 株式会社日立ビルシステム | 映像監視システム及び映像監視装置 |
JP2020198510A (ja) * | 2019-05-31 | 2020-12-10 | 日本電信電話株式会社 | 同期制御装置、同期制御方法及び同期制御プログラム |
-
2021
- 2021-02-18 JP JP2023500223A patent/JP7480908B2/ja active Active
- 2021-02-18 US US18/276,225 patent/US20240121505A1/en active Pending
- 2021-02-18 WO PCT/JP2021/006139 patent/WO2022176109A1/ja active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120092443A1 (en) * | 2010-10-14 | 2012-04-19 | Cisco Technology, Inc. | Network Synchronization Video for Composite Video Streams |
JP2018207279A (ja) * | 2017-06-02 | 2018-12-27 | 株式会社日立ビルシステム | 映像監視システム及び映像監視装置 |
JP2020198510A (ja) * | 2019-05-31 | 2020-12-10 | 日本電信電話株式会社 | 同期制御装置、同期制御方法及び同期制御プログラム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022176109A1 (ja) | 2022-08-25 |
JP7480908B2 (ja) | 2024-05-10 |
US20240121505A1 (en) | 2024-04-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4991129B2 (ja) | 映像音声再生装置および映像音声再生方法 | |
US5497199A (en) | Apparatus for processing progressive scanning video signal comprising progressive to interlaced signal converter and interlaced to progressive signal converter | |
WO2005009031A1 (ja) | 撮像装置と同期信号発生装置 | |
JP4475225B2 (ja) | 映像信号伝送システム、撮像装置、信号処理装置および映像信号伝送方法 | |
US20070188645A1 (en) | Image output apparatus, method and program thereof, and imaging apparatus | |
JP2012169727A (ja) | 映像信号処理装置および映像信号処理方法 | |
WO2022176109A1 (ja) | 映像合成装置及び映像合成方法 | |
US7187417B2 (en) | Video signal processing apparatus that performs frame rate conversion of a video signal | |
WO2019004783A1 (ko) | 다채널 영상을 위한 전송 시스템 및 이의 제어 방법, 다채널 영상 재생 방법 및 장치 | |
CN100418352C (zh) | 用于减小运动画面屏幕的失真的无线终端和方法 | |
WO2022137325A1 (ja) | 映像信号を合成する装置、方法及びプログラム | |
JP2005333520A (ja) | 画像伝送装置、画像伝送方法、伝送システム、及び映像監視システム | |
WO2022137324A1 (ja) | 映像信号を合成する装置、方法及びプログラム | |
JP2010278983A (ja) | 映像伝送装置及び方法 | |
US8848102B2 (en) | Method for processing digital video images | |
WO2023013072A1 (ja) | 映像信号を合成する装置、方法及びプログラム | |
WO2023017578A1 (ja) | 映像信号を合成する装置、方法及びプログラム | |
JP4738251B2 (ja) | 同期自動調整装置 | |
WO2022137326A1 (ja) | 映像音響合成装置、方法及びプログラム | |
JP2011146930A (ja) | 情報処理装置、情報処理方法、およびプログラム | |
KR100475343B1 (ko) | 다채널 입력을 위한 단일 아날로그/디지털 컨버터 소자를이용한 멀티플렉싱 장치 및 방법 | |
JP7346124B2 (ja) | 映像処理装置及び映像処理プログラム | |
WO2022269723A1 (ja) | 同期制御を行う通信システム、その同期制御方法、受信サーバ及び同期制御プログラム | |
JP3059151B1 (ja) | デ―タストリ―ム切替装置及びデ―タストリ―ム切替方法 | |
JP2003309759A (ja) | 撮影システム、テレビジョンカメラ、および撮影システムに用いる同期調整装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21926544 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2023500223 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18276225 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21926544 Country of ref document: EP Kind code of ref document: A1 |