WO2008050621A1 - Camera server system, data processing method, and camera server - Google Patents

Camera server system, data processing method, and camera server

Info

Publication number
WO2008050621A1
WO2008050621A1 (PCT/JP2007/070008)
Authority
WO
WIPO (PCT)
Prior art keywords
data
image data
camera server
camera
sound
Prior art date
Application number
PCT/JP2007/070008
Other languages
English (en)
Japanese (ja)
Inventor
Katsura Obikawa
Jong-Sun Hur
Original Assignee
Opt Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Opt Corporation filed Critical Opt Corporation
Publication of WO2008050621A1 publication Critical patent/WO2008050621A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4223Cameras

Definitions

  • Camera server system, data processing method, and camera server
  • the present invention relates to a camera server system, a data processing method, and a camera server.
  • a camera server for distributing image data of an image captured by a camera device including an image sensor to a client via a network is known (for example, see Patent Document 1).
  • Patent Document 1 Japanese Patent Application Laid-Open No. 2005-318412
  • an object of the present invention is to provide a camera server system, a data processing method, and a camera server that increase the delivery speed of image data to a client.
  • a camera server system including a camera device having an image sensor and a camera server that distributes image data output from the camera device to a client, in which an encoder that converts the image data output from the image sensor into data-compressed image data is provided between the input unit of the camera server, to which the image data sent from the camera device is input, and the control means of the camera server.
  • the processing load on the control means of the camera server is reduced, and the processing speed of the camera server is increased. Therefore, the processing speed for distributing image data to the client can be increased.
  • the encoder is a JPEG encoder and is provided in the camera device.
  • the image data is converted into the JPEG format in the camera device, and the image data in the JPEG format is sent to the camera server. Therefore, the volume of data transmitted from the camera device to the camera server can be reduced, and the transmission speed of image data in the camera server system can be improved. As a result, the speed at which image data is transmitted to the client can be increased.
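As a rough illustration of why compressing before the USB hop matters, the sketch below compares raw and compressed byte counts for one frame. It is not from the patent: `zlib` stands in for the JPEG encoder unit, and the 640×480, 2-bytes-per-pixel frame geometry is an assumption.

```python
import zlib

# Assumed frame geometry -- the patent does not specify a resolution.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 640, 480, 2

# A flat test frame; real sensor output would compress less dramatically.
raw_frame = bytes(WIDTH * HEIGHT * BYTES_PER_PIXEL)

# zlib stands in here for the JPEG encoder unit 32 inside the camera device.
compressed = zlib.compress(raw_frame)

raw_size = len(raw_frame)            # bytes that would cross the USB cable uncompressed
compressed_size = len(compressed)    # bytes that actually cross it after encoding
```

Whatever the exact codec, the point is the same: the camera server only ever handles the smaller, already-encoded stream.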
  • in a data processing method in which image data of an image captured by a camera device including an image sensor is distributed to a client via a network, the image data is converted into data-compressed image data, and this compressed image data is then distributed to the client in accordance with a request from the client.
  • the processing load on the control means of the camera server is reduced, and the processing speed of the camera server is increased. Therefore, the processing speed for distributing image data to the client can be increased.
  • the data compression is JPEG compression
  • the JPEG compression is performed in the camera device.
  • the image data is converted into the JPEG format in the camera device, and the image data in the JPEG format is sent to the camera server. Therefore, the volume of data transmitted from the camera device to the camera server can be reduced, and the transmission speed of image data in the camera server system can be improved. As a result, the speed at which image data is transmitted to the client can be increased.
  • image data of an image captured by a camera device including an image sensor is converted into compressed image data that is data-compressed in a camera server that distributes the image data to a client via a network.
  • This encoder is provided between an input unit for inputting image data sent from the camera device to the camera server and the control means of the camera server.
  • in another aspect of the invention, the encoder is a JPEG encoder and is provided in the camera device.
  • the image data is converted into the JPEG format in the camera device, and the image data in the JPEG format is sent to the camera server. Therefore, the volume of data transmitted from the camera device to the camera server can be reduced, and the transmission speed of image data in the camera server system can be improved. As a result, the speed at which image data is transmitted to the client can be increased.
  • FIG. 1 is a diagram showing a configuration of a camera server system according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing an electric circuit of a camera device in the camera server system shown in FIG.
  • FIG. 3 is a block diagram showing an electric circuit of a camera server in the camera server system shown in FIG.
  • 16 … USB connector (input unit)
  • FIG. 1 is a system configuration diagram showing a camera server system 1 according to an embodiment of the present invention.
  • the camera server system 1 includes a camera device 2 and a camera server 3.
  • the camera device 2 and the camera server 3 are connected by a USB (Universal Serial Bus) cable 4.
  • Camera server system 1 and client 5 are connected via network 6.
  • the camera server system 1 receives a command from the personal computer 7 on the client 5 side.
  • the camera server 3 distributes image data of an image captured by the camera device 2 to the personal computer 7 (client 5) based on a command received from the personal computer 7.
  • one or a plurality of clients 5 exist, and the camera server system 1 distributes the image data to each client 5 in accordance with a request from that client 5.
  • the camera device 2 has a cubic-shaped housing 8.
  • the housing 8 is provided with a fisheye lens 9, a USB connector 10, an audio/video output connector 11, a sound output connector 12, and the like.
  • the fisheye lens 9 is disposed on the upper surface of the housing 8.
  • a ventilation hole 14 for the microphone 13 is formed on the upper surface of the housing 8.
  • the USB connector 10, the audio / video output connector 11 and the sound output connector 12 are arranged on the side surface of the housing 8.
  • the camera server 3 has a rectangular parallelepiped housing 15.
  • the housing 15 is provided with a USB connector 16 as an input unit, which is an input terminal for inputting data from the camera device 2, a microphone connector 18 to which an expansion microphone 17 is connected as needed, a general-purpose input/output port (GPIO) 19 for connecting external equipment such as an infrared sensor, an audio output connector 20 for outputting audio signals, a video output connector 21 for outputting video signals, a speaker connector 22 to which a speaker is connected, a communication interface 23 connected to the network, a power switch 24 also serving as a reset switch, and the like.
  • FIG. 2 is a block diagram showing an electric circuit of the camera device 2.
  • the camera device 2 includes the fisheye lens 9, the USB connector 10, the audio/video output connector 11, the sound output connector 12, the microphone 13, an image sensor 25, an FPGA (Field Programmable Gate Array) comprising a color conversion processing unit 26, a sound processing unit 27 composed of a sound IC (Integrated Circuit), an audio/video encoder 28, and a streaming generation unit 29, and a CPU (Central Processing Unit) 30 that controls the operation of each of the above components.
  • the microphone 13 collects voice and other sounds around the camera device 2.
  • the microphone 13 generates a sound signal that is an electrical signal corresponding to the collected sound and outputs the sound signal to the sound processing unit 27.
  • the sound processing unit 27 generates digital sound data based on the electrical signal from the microphone 13.
  • the sound processing unit 27 outputs an electrical signal corresponding to the sound from the microphone 13 to the sound output connector 12 without converting it into sound data (digital format). Therefore, the sound collected by the microphone 13 can be heard by connecting, for example, a speaker, headphones or the like to the sound output connector 12.
  • the image sensor 25 for example, a CMOS (Complementary Metal-Oxide Semiconductor) or the like is used.
  • An image by the fisheye lens 9 is formed on the image sensor 25.
  • the image sensor 25 outputs luminance distribution data, which is an imaging signal, to the color conversion processing unit 26 as digital image data.
  • the image sensor 25 performs imaging at, for example, 15 frames per second; one frame of image data is therefore output from the image sensor 25 every 1/15 second.
  • the color conversion processing unit 26 replaces the data of each pixel of the luminance distribution data using a color conversion table (not shown) to generate digital image data.
  • the generated image data is added with a header for each frame by a header adding unit 31 formed by the CPU 30.
  • a header number M (M = 1, 2, 3, …), which is a serial number, is assigned to each header in imaging order.
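A minimal sketch of this header-numbering step — hypothetical names throughout, since the patent describes the header adding unit 31 only functionally:

```python
from dataclasses import dataclass
from itertools import count

@dataclass
class Frame:
    header_no: int   # serial number M = 1, 2, 3, ... in imaging order
    payload: bytes

def header_adding_unit(payloads):
    """Attach a one-based serial header number to each frame, in imaging order."""
    numbers = count(1)
    return [Frame(header_no=next(numbers), payload=p) for p in payloads]

numbered = header_adding_unit([b"frame-a", b"frame-b", b"frame-c"])
```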
  • the CPU 30 includes a JPEG encoder unit 32 as an encoder that converts image data into JPEG (Joint Photographic Experts Group) format compressed image data.
  • in some cases, noise data other than image data is output from the image sensor 25, or noise data is mixed in between the image sensor 25 and the JPEG encoder unit 32.
  • when the data input to the JPEG encoder unit 32 is image data, the JPEG encoder unit 32 converts it into JPEG format image data and writes into its header information indicating that it is image data.
  • when the data to be converted is not image data, the JPEG encoder unit 32 does not convert it into JPEG format image data. That is, of the data input to the JPEG encoder unit 32 as image data output from the image sensor 25, noise data is not image data and is not converted into JPEG image data.
  • the JPEG encoder unit 32 may also be configured not to convert the image data into JPEG format data if the image data includes noise data exceeding a predetermined level.
  • a header is also added to the sound data by the header adding unit 31 for each predetermined amount of data. For example, in the present embodiment, a header is added to the sound data for each 1/30-second data length.
  • image data amounts to 15 frames per second, while one unit of sound data has a length of 1/30 second. Therefore, one frame of image data corresponds to two units (1/30 second × 2) of sound data.
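The 15 fps video / 1/30-second sound relationship implies a fixed mapping from frame number M to its two sound units, which can be stated directly. This is a sketch of the correspondence, not code from the patent:

```python
def sound_units_for_frame(m):
    """Image frame M (1/15 s long) spans sound units 2M-1 and 2M (1/30 s each)."""
    return (2 * m - 1, 2 * m)

# Frame 1 carries sound units 1 and 2; frame 5 carries units 9 and 10.
pairs = [sound_units_for_frame(m) for m in (1, 2, 5)]
```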
  • the image data and the sound data are output from the CPU 30 to the streaming generation unit 29 with the imaging timing and the sound collection timing synchronized with each other on the basis of the header numbers.
  • for example, the header numbers M of the image data run 1, 2, 3, 4, 5, …, and the header numbers N of the sound data run 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, ….
  • the streaming generation unit 29 reads the image data and the sound data output from the CPU 30 and generates streaming data containing them.
  • the streaming generation unit 29 supplies the generated digital format data signal to the communication processing unit 33, and transmits it from the USB connector 10 to the camera server 3 via the USB cable 4.
  • the audio/video encoder 28 reads the image data output from the color conversion processing unit 26, without converting it into JPEG format, and converts it into a video signal of the NTSC (National Television System Committee) or PAL (Phase Alternating Line) system.
  • the audio / video encoder 28 converts the sound data in the digital format into a sound signal that is an electrical signal.
  • the audio/video output connector 11 outputs the video signal and the sound signal together.
  • FIG. 3 is a block diagram showing an electric circuit of the camera server 3.
  • the camera server 3 includes the USB connector 16, the microphone connector 18 to which the extension microphone 17 is connected, the general-purpose input/output port 19 to which an external device such as an infrared sensor is connected, the audio output connector 20, the video output connector 21, the speaker connector 22, the communication interface 23, a memory 36, and an MCU (Micro Controller Unit) 37.
  • the MCU 37 includes a data selection unit 38 serving as data selection means, a data synchronization unit 39 serving as data synchronization means, a data distribution unit 40 serving as data distribution means, and the like.
  • the data selection unit 38, the data synchronization unit 39, and the data distribution unit 40 are functionally realized by the MCU 37 reading the control program stored in the memory 36.
  • the data selection unit 38 determines the content of the image data transmitted from the camera device 2 based on the header information of the image data of each frame.
  • the header information indicates whether the data under the header is JPEG-converted image data or noise data.
  • the data selection unit 38 determines whether the content of the data for each header is predetermined image data or data other than the predetermined image data based on the header information.
  • the data selection unit 38 selects the data to be output to the data synchronization unit 39 based on the determination result. That is, when the data selection unit 38 determines that the content of the data is the predetermined image data, the image data is transmitted from the communication interface 23 via the data synchronization unit 39 and the data distribution unit 40 and distributed to the personal computer 7 of the client 5 through the network 6. On the other hand, data that is not the predetermined image data is not output to the data synchronization unit 39 side.
  • by providing the data selection unit 38 and not distributing data that is not the predetermined image data to the client 5 side, noise data is not included in the image data distributed to the client 5 side.
  • the video displayed on the monitor of the personal computer 7 of the client 5 is therefore clean video with little noise. Since noise data is not transmitted, the amount of data transmitted from the camera server 3 to the client 5 is reduced, and the transmission speed of image data from the camera server 3 to the client 5 can be increased.
  • sound data, unlike image data, is not subjected to selection; all of the sound data is transmitted from the communication interface 23 via the data synchronization unit 39 and the data distribution unit 40 and distributed to the personal computer 7 of the client 5 through the network 6. On the client 5 side, sound such as voice collected by the microphone 13 can be heard from a speaker built into the personal computer 7 or the like.
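The asymmetry between the two streams can be sketched as follows, with hypothetical dict-based frames. The patent's data selection unit 38 works from header information; the `is_image` flag models that:

```python
def data_selection_unit(frames):
    """Forward only frames whose header marks them as valid image data;
    anything else (e.g. noise data) is dropped and never reaches the client."""
    return [f for f in frames if f["is_image"]]

def pass_through_sound(units):
    """Sound data is not filtered: every unit goes to the client unchanged."""
    return list(units)

incoming_frames = [
    {"header_no": 1, "is_image": True},
    {"header_no": 2, "is_image": False},   # noise data flagged upstream
    {"header_no": 3, "is_image": True},
]
selected = data_selection_unit(incoming_frames)
sound_out = pass_through_sound([1, 2, 3, 4, 5, 6])
```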
  • the header number N of the sound data input from the data selection unit 38 to the data distribution unit 40 is a continuous number without any loss.
  • since the data selection unit 38 selects the data to be output to the data distribution unit 40, header numbers M may be missing from the image data input from the data selection unit 38 to the data distribution unit 40.
  • if, for example, the image data were packed to close the gaps where header numbers M are missing while the sound data were distributed to the client 5 side without packing, the video based on the image data would run ahead of the corresponding sound, and video and sound would not be synchronized; conversely, skipping sound content to keep them aligned would lose part of the sound.
  • therefore, the data synchronization unit 39 synchronizes the sound data and the image data output from the data selection unit 38 so that their sound collection timing and imaging timing match without any sound content being skipped.
  • for example, suppose the header numbers of the image data output from the data selection unit 38 are 1, 2, (3 missing), 4, 5, … and the header numbers of the sound data are 1, 2, 3, 4, 5, 6, …. The sound data with header numbers 1 and 2 are synchronized with the image data with header number 1, and the sound data with header numbers 3 and 4 with the image data with header number 2. Although the image data with header number 3 is missing, the sound data with header numbers 5 and 6 are synchronized with this portion, and the sound data with header numbers 7 and 8 with the image data with header number 4. With the sound data and the image data synchronized in this way, the data are distributed from the data distribution unit 40 to the network 6 through the communication interface 23.
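The pairing in that example can be sketched as below. The function name is hypothetical; `None` marks a gap where the viewer keeps showing the previous frame while the sound continues uninterrupted:

```python
def synchronize(image_headers, sound_headers):
    """Pair each sound unit N with image frame M = ceil(N / 2) when that frame
    survived selection, or with None when it was dropped -- sound is never skipped."""
    present = set(image_headers)
    schedule = []
    for n in sound_headers:
        m = (n + 1) // 2                   # the frame this sound unit belongs to
        schedule.append((m if m in present else None, n))
    return schedule

# Image frame 3 is missing, as in the example above.
sched = synchronize([1, 2, 4], [1, 2, 3, 4, 5, 6, 7, 8])
```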
  • the viewer software of the personal computer 7 of the client 5 is configured to keep displaying the video based on the previous image data on the monitor until new image data is distributed, so that when image data is missing the monitor of the personal computer 7 does not go blank and nothing appears unnatural.
  • with the viewer software configured in this way, after the image data with header number 2 is distributed and until the image data with header number 4 is distributed, the video based on the image data with header number 2 continues to be displayed, and during this period the sound based on the sound data with header numbers 5 and 6 is reproduced.
  • the data distribution unit 40 simultaneously distributes data signals to these clients 5 when a plurality of clients 5 are accessing the camera server 3.
  • the USB connector 16 is supplied with a data signal having image data in digital format and sound data in digital format from the camera device 2 via the USB cable 4 and is input to the MCU 37.
  • the digital data signal input to the MCU 37 is configured to be output from the communication interface 23 to the network 6 as a digital data signal.
  • since the camera server 3 outputs the digital data signal from the camera device 2 to the network side while it remains digital, there is no signal degradation. Further, it is not necessary to provide the camera server 3 with an AD conversion function for converting an analog signal into a digital signal.
  • image data is transmitted by the TCP or UDP protocol in half-duplex communication, and sound data is transmitted by the TCP or UDP protocol in full-duplex communication.
  • image data is converted into compressed image data in JPEG format in the camera device 2 and then transmitted to the camera server 3; the image data is delivered from the data distribution unit 40 to the client 5 through the communication interface 23.
  • since the image data is converted into the JPEG format in the camera device 2 and the image data in the JPEG format is sent to the camera server 3, the volume of data transmitted from the camera device 2 to the camera server 3 can be reduced. For this reason, the transmission speed of image data in the camera server system 1 can be improved. As a result, it is possible to increase the speed at which image data is sent to the client 5.
  • by contrast, if the conversion of the image data into the JPEG format were performed in the MCU 37 of the camera server 3, with the image data JPEG-converted each time the client 5 requests transmission, the speed of distributing the image data from the camera server 3 to the client 5 would become slow.
  • the MCU 37 performs many processes related to controlling the camera device 2 and communicating with the client 5. If an encoding function for JPEG conversion were added on top of these processes, the load on the MCU 37 would increase, the processing speed of the MCU 37 would decrease, and the encoding speed of the JPEG conversion would decrease with it. As a result, the transmission speed of the image data from the camera server 3 to the client 5 would become slow.
  • the CPU 30 of the camera device 2 has fewer processing tasks than the MCU 37 on the camera server 3 side, so performing the JPEG conversion with the CPU 30 on the camera device 2 side increases the JPEG conversion speed.
  • in addition, since the processing speed of the MCU 37 does not decrease, the processing speed of the data selection unit 38, the data distribution unit 40, and the like can be increased, and thus the speed at which image data is distributed to the client 5 can be increased.
  • as a result, a client 5 does not have to wait for the distribution to other clients 5 to finish before receiving the image data, and the delivery speed can be increased.
  • instead of providing the JPEG encoder unit 32 in the CPU 30 of the camera device 2, a JPEG encoder 32A may be provided on the camera server 3 side separately from the MCU 37, between the USB connector 16 and the MCU 37, as indicated by the dotted line in Fig. 3, and the image data JPEG-converted by the JPEG encoder 32A may be input to the MCU 37.
  • since the JPEG encoder 32A is provided as a processing device separate from the MCU 37, which contains the data selection unit 38, the data synchronization unit 39, the data distribution unit 40, and the like, the processing speed of the MCU 37 is not reduced.
  • a dedicated JPEG encoder may also be provided on the camera device 2 side separately from the CPU 30, and the JPEG conversion may be performed by that JPEG encoder.
  • the data compression method and the encoder are not limited to the JPEG compression method and the JPEG encoder unit 32 (32A) described above; another compression method such as MPEG (Moving Picture Experts Group) may be adopted, or a lossless compression method may be used.
  • the audio encoder 35 reads from the MCU 37 the digital sound data input to the camera server 3 from the communication interface 23, converts it into a sound signal, which is an electrical signal, and outputs it to the speaker connector 22. Therefore, by connecting a speaker or the like to the speaker connector 22, the sound data input from the communication interface 23 can be reproduced.
  • a camera device control signal for controlling the camera device 2 is transmitted from the camera server 3 to the camera device 2 via the USB cable 4.
  • this camera device control signal is generated by a device control unit 41 configured in the MCU 37 and, for example, controls the start and stop of the operation of the camera device 2 or requests the camera device 2 to transmit image data.
  • when a command is transmitted to the camera server 3 from the personal computer 7 on the client 5 side connected to the network, the device control unit 41 interprets the command and transmits a camera device control signal to the camera device 2.
  • the device control unit 41 controls the external device in addition to the camera device 2 when an external device is connected to the GPIO.
  • the microphone 13 built into the camera device 2 is limited in usable size by the image sensor 25 arranged in the housing 8 and by the circuit configuration, and may not provide sufficient sound collecting capability. In such a case, the extension microphone 17 is connected to the microphone connector 18.
  • when the MCU 37 confirms that the extension microphone 17 is connected to the microphone connector 18, a camera device control signal for stopping the sound collection by the microphone 13 is transmitted to the camera device 2.
  • upon receiving this signal, the CPU 30 of the camera device 2 controls the sound processing unit 27 to stop generating sound data. The camera device 2 therefore outputs data that contains no sound data.
  • a sound signal is read from the microphone connector 18, and digital sound data is generated by the sound processing unit 42 configured in the MCU 37 of the camera server 3.
  • the sound data and the image data transmitted from the camera device 2 are supplied from the data distribution unit 40 to the communication interface 23 and distributed to the network 6.
  • since the extension microphone 17 collects sound in place of the microphone 13 built into the camera device 2, sound data suited to the environment, such as the target sound and the surrounding noise, can be obtained and carried on the data for distribution.
  • the application of the camera server 3 sets the sound data of the microphone 13 sent over the USB cable 4 as the default sound input. Thereafter, the application of the camera server 3 checks whether the extension microphone 17 is connected to the microphone connector 18.
  • when the connection of the extension microphone 17 is recognized, the device control unit 41 transmits to the camera device 2 a camera device control signal for stopping the sound collection by the microphone 13.
  • if the application of the camera server 3 does not recognize the connection of the extension microphone 17, the sound data of the microphone 13 sent over the USB cable 4 continues to be used as it is.
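The fallback logic of these bullets reduces to a small decision, sketched here with hypothetical names:

```python
def select_sound_source(extension_mic_connected):
    """Use the camera's built-in microphone (over USB) by default; switch to the
    server-side extension microphone when one is detected, which in the patent
    also triggers the control signal stopping sound collection in the camera."""
    if extension_mic_connected:
        return "extension_mic_on_server"
    return "builtin_mic_over_usb"

default_source = select_sound_source(False)   # no extension mic found
switched_source = select_sound_source(True)   # extension mic recognized
```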
  • the determination by the data selection unit 38 may be performed on the basis of information other than the header information added to the image data. For example, the image data may be inspected for each header, and when noise data exceeding a predetermined level is detected, the image data of that header may not be transmitted to the network 6 side.
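One plausible per-header inspection of this kind is sketched below. The patent does not give a concrete metric; the mean neighbouring-pixel difference used here is purely illustrative, as is the threshold value:

```python
def exceeds_noise_level(samples, threshold=0.2):
    """Crude noise check for one header's worth of 8-bit data: the mean absolute
    difference between neighbouring values, normalised to 0..1."""
    diffs = [abs(a - b) for a, b in zip(samples, samples[1:])]
    return (sum(diffs) / len(diffs)) / 255.0 > threshold

smooth_row = [100, 101, 102, 103, 104]   # plausible image data: small local changes
noisy_row = [0, 255, 0, 255, 0]          # salt-and-pepper-like noise
```

A header failing the check would simply not be forwarded to the network 6 side.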

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Signal Processing For Recording (AREA)
  • Studio Devices (AREA)

Abstract

The invention improves the speed of distributing data to a client. A camera server (3) distributes, via a network (6), image data of an image captured by a camera device (2) equipped with an imaging element (image sensor (25)) to a client (5). In the camera server, a JPEG encoder (32), which converts the image data into JPEG-format image data, is arranged between an input unit (16), at which the image data transmitted from the camera device (2) enters the camera server (3), and a control means (MCU (37)) of the camera server (3).
PCT/JP2007/070008 2006-10-25 2007-10-12 Système de serveur de caméra, procédé de traitement de données et serveur de caméra WO2008050621A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006289893A JP2008109364A (ja) 2006-10-25 2006-10-25 カメラサーバシステム、データの処理方法、およびカメラサーバ
JP2006-289893 2006-10-25

Publications (1)

Publication Number Publication Date
WO2008050621A1 true WO2008050621A1 (fr) 2008-05-02

Family

ID=39324421

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/070008 WO2008050621A1 (fr) 2006-10-25 2007-10-12 Système de serveur de caméra, procédé de traitement de données et serveur de caméra

Country Status (2)

Country Link
JP (1) JP2008109364A (fr)
WO (1) WO2008050621A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8992102B1 (en) * 2014-01-06 2015-03-31 Gopro, Inc. Camera housing for a square-profile camera
EP4365476A2 (fr) 2018-08-07 2024-05-08 GoPro, Inc. Caméra et support de caméra
USD894256S1 (en) 2018-08-31 2020-08-25 Gopro, Inc. Camera mount
USD905786S1 (en) 2018-08-31 2020-12-22 Gopro, Inc. Camera mount
USD920419S1 (en) 2019-09-17 2021-05-25 Gopro, Inc. Camera
USD946074S1 (en) 2020-08-14 2022-03-15 Gopro, Inc. Camera

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003018525A (ja) * 2001-07-04 2003-01-17 Fujitsu Ltd ネットワーク蓄積型ビデオカメラシステム
JP2004336484A (ja) * 2003-05-08 2004-11-25 Fuji Photo Film Co Ltd 動画伝送方法および装置、並びに動画伝送システム

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003018525A (ja) * 2001-07-04 2003-01-17 Fujitsu Ltd ネットワーク蓄積型ビデオカメラシステム
JP2004336484A (ja) * 2003-05-08 2004-11-25 Fuji Photo Film Co Ltd 動画伝送方法および装置、並びに動画伝送システム

Also Published As

Publication number Publication date
JP2008109364A (ja) 2008-05-08

Similar Documents

Publication Publication Date Title
WO2013132828A1 (fr) Système de communication et appareil de relais
US8160129B2 (en) Image pickup apparatus and image distributing method
WO2008050621A1 (fr) Système de serveur de caméra, procédé de traitement de données et serveur de caméra
US7593580B2 (en) Video encoding using parallel processors
KR20110052933A (ko) 촬영장치 및 촬영영상 제공방법
TWI655865B (zh) 用於組態自數位視訊攝影機輸出之視訊串流之方法
JP2006054921A (ja) 映像信号送信方法及び映像信号受信方法並びに映像信号送受信システム
JP2003299067A (ja) 映像信号送信方法及び映像信号受信方法並びに映像信号送受信システム
JP2008131264A (ja) 監視カメラ、画像記録表示装置及び監視カメラシステム
JP2005323379A (ja) データ転送に応じるシステムおよび方法
WO2008050620A1 (fr) Serveur de caméra
CN112788198A (zh) 摄影装置、传送系统及方法、记录介质和计算机装置
JP5024331B2 (ja) ビデオカメラ及び情報送信方法
KR100220632B1 (ko) 유니버설 시리얼 버스를 이용한 카메라
KR100223590B1 (ko) 다기능 텔레비전
WO2008018351A1 (fr) système de caméra, dispositif de caméra et serveur de caméra
JP2006171524A (ja) 画像処理装置
WO2007043227A1 (fr) Caméra, vidéo-enregistreur et système de caméra
JPH10294939A (ja) 画像伝送システム及び画像伝送装置
KR101676400B1 (ko) 촬영장치 및 촬영영상 제공방법
JP2002094934A (ja) 信号処理装置およびカメラシステム
JPH0759070A (ja) 表示制御装置
JP5249265B2 (ja) テレビインターホン装置
JP4167730B2 (ja) 動画圧縮伸長装置およびコンピュータシステム
JP2000324375A (ja) デジタルスチルカメラ

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07829744

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07829744

Country of ref document: EP

Kind code of ref document: A1