USRE44924E1 - Digital observation system

Digital observation system

Info

Publication number
USRE44924E1
USRE44924E1 (Application US 13/669,969)
Authority
US
United States
Prior art keywords
digital
data
circuit
image sensor
rgb data
Legal status
Expired - Fee Related
Application number
US13/669,969
Inventor
Alan Neal Cooper
Christopher Michael Fritz
James Walter Exner
Current Assignee
Star Co
Original Assignee
Star Co
Application filed by Star Co
Priority to US 13/669,969
Assigned to Star Co (assignor: Immersive Media of Texas)
Application granted
Publication of USRE44924E1


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/183: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source

Definitions

  • the digital camera PLD 28 is adapted to control timing signals to the CCD image sensor 20 to transmit analog RGB data to a CDS circuit 22 where the analog RGB data is converted to digital RGB data, and delay one line of the digital RGB data transmission (by a two video line delay memory) when a transmission error correction occurs, wherein the digital RGB data transmission comprises the reduced operational overhead and the increased operational functionality.
  • the digital camera PLD 28 is adapted to receive digital RGB data from the CDS circuit 22 , convert the received digital RGB data into digital YUV data, and transmit the digital YUV data to the CDS circuit 22 , wherein the digital camera PLD interfaces to the CDS circuit.
  • the base unit 14 comprises a camera interface module (or base unit physical interface) 100 which interfaces to the digital camera 12 via RJ45 connectors 102 .
  • the camera interface module 100 is operably coupled to a base unit FPGA/PLD 104 which is further operably coupled to a display 106 which is adapted to display the received video data.
  • the base unit 14 further comprises a microprocessor including a USB interface 108 that receives (or is adapted to receive) video data from a first source, such as, for example, a personal computer (not shown) via a physical USB interface or data port 110 , and a video decoder 112 that receives other video data from a second source, such as another video source, via a connection plug 114 and/or an RJ11 connector 116 .
  • the microprocessor 108 manages and controls the operational functions of the base unit 14 including managing the display 106 and also controls a user interface.
  • the microprocessor 108 further controls the USB interface 110 and the data associated with it, including data flow management, data transfer, and reception.
  • the video decoder 112 is used to digitize the incoming analog video signal, and the output of the decoder 112, for example, may be YUV digital video data with 4:2:2 sampling (intensity:reddishness:blueishness). There are a plurality of bits of data for each pixel, and horizontal and vertical synchronization signals are output from the decoder 112 in addition to a data valid signal.
  • the microprocessor 108 can transmit video data processed by the PLD 104 to the first source via the USB port 110, which may further receive audio data from the first source.
  • An audio circuit 118 takes audio data and amplifies it for output to a speaker 120 and/or to the RJ11 connector. Audio information can also be received via the RJ11 connector 116 and/or the microphone 122, and can also be output via the USB port 110, provided a D/A converter is present in the audio circuit.
  • a real-time clock 124 transmits and receives time and date information between the video decoder 112 , the microprocessor 108 and a second memory 126 and further stores configuration registers and timer functions.
  • the second memory 126 which is operably coupled to the microprocessor 108 and the video decoder 112 , maintains operation code of the microprocessor.
  • the base unit 14 preferably operates from an external 24V (or alternatively a 12V) DC wall mount power supply 128 that supplies all the power necessary for the display 106 to operate.
  • a power supply 130 is designed to protect the base unit 14 from excess voltage inputs and to filter any noise from entering or exiting the display unit.
  • the power supply 130 further creates multiple DC voltages (such as 1.8V, 3.3V, and 5V) to supply various portions of the base unit 14 .
  • the display 106 is a video display monitor utilizing an LCD active matrix display with a VGA resolution of 640 pixels by 480 lines (although the resolution could be higher or lower).
  • the interface to the display 106 comprises a plurality of logic level clock signals that are used for clocking, synchronization, and data transfer.
  • the power supply module 130 which receives power from an external adapter (not shown), creates a plurality of voltages to supply the display 106 and a backlight 132 .
  • the backlight 132 applies a voltage to tubes (not shown) that illuminate the display 106 , where the tubes are operably coupled to the monitor.
  • the base unit 14 will have enough memory, such as the memories 134 and 136 (which are preferably synchronous dynamic random access memories), to store a number of images so that the rate for switching display images is not affected by the transfer time of the data sent by the camera 12 and/or a first source over the USB connection 110.
  • the base unit 14 may further be controlled by a keyboard 138 which is operably coupled to the PLD 104.
  • the PLD 104 is the primary controller for the functions of the various portions of the base unit 14 .
  • One of these functions includes managing (which includes reading, writing, and refreshing) the memories 134 , 136 that are utilized to store the images that are to be displayed on the display 106 .
  • the data input to the memories 134, 136 is received from the camera interface module 100, the video decoder 112, and/or the USB microprocessor 108.
  • the size of the memories 134, 136 is dependent on the screen resolution of the display 106; the memories can contain multiple images for display as well as buffer memory that will be utilized as a receiving buffer for new images.
  • the output of the memory data is sent to a scaler (not shown), located in the PLD 104, to convert the data to the appropriate data size for the display 106.
  • the PLD 104 controls the data flow from the USB data port 110, the video decoder 112, the memories 134, 136, and the display 106. Its functions include interfacing to the display, developing all the necessary signals for a time-base of the display, direct memory access control of the data from the microprocessor 108 to the memories 134, 136, managing the user interfaces, transmitting the data to the microprocessor, generating an "on screen display" thereby enabling a user to program and adjust display parameters, buffering video data as it transfers between circuit areas that operate at different data rates, scaling the video data to be displayed to the appropriate resolution for the display (sketched below), and controlling various first-in-first-out (FIFO) controllers (such as spot memory controllers and VCR memory controllers). Further functions include video processing, such as enhancing the video by controlling the contrast, brightness, color saturation, sharpness, and color space conversion of the video data that is received.
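One of the functions listed above, scaling stored image data to the display resolution, can be illustrated with a nearest-neighbor resample. The patent does not describe how the scaler inside the PLD 104 works, so the function below is only a minimal software sketch; the name scale_nearest and the CIF/VGA example sizes are assumptions for illustration.

```python
def scale_nearest(frame: list[list[int]], out_w: int, out_h: int) -> list[list[int]]:
    """Nearest-neighbor scale of a frame to the display resolution.

    A software stand-in for the scaler in the PLD 104; the patent only
    states that memory output is converted to the appropriate size for
    the display 106.
    """
    in_h, in_w = len(frame), len(frame[0])
    return [
        [frame[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# Example: scale a CIF-sized image (352x288) up to the VGA display (640x480).
cif_frame = [[0] * 352 for _ in range(288)]
vga_frame = scale_nearest(cif_frame, 640, 480)
```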
  • FIFO first-in-first-out
  • the base unit 14 comprises the base unit physical interface 100 adapted to receive the digital RGB data with the reduced operational overhead and the increased operational functionality from the digital camera physical interface 24 , the base unit PLD operably coupled to the base unit physical interface, and the display 106 adapted to display the digital RGB data with reduced operational overhead and increased operational functionality, wherein the display is operably coupled to the PLD.
  • the base unit PLD is further adapted to control timing signals to display the digital RGB data with the reduced operational overhead and the increased operational functionality.
  • the base unit 14 comprises a first circuit (for example, the base unit interface 100 ) adapted to receive transmitted data in a second representation of a first format (for example, digital RGB data), a second circuit (for example, the display), and a third circuit (for example, the base unit PLD) adapted to control timing signals to the first circuit to transmit the data in the second representation of the first format for display via the second circuit, wherein the third circuit interfaces to the first circuit and to the second circuit.
  • the base unit 14 comprises a physical interface adapted to receive digital YUV data with reduced operational overhead and increased operational functionality from a digital camera physical interface, a PLD operably coupled to the base unit physical interface, and a display adapted to display the digital YUV data with reduced operational overhead and increased operational functionality, wherein the display is operably coupled to the PLD.
  • the base unit PLD 104 comprises a plurality of base unit FIFO modules 150 which manipulate the digital camera data received over, for example, an Ethernet transmission into a form that can be used by the base unit 14 .
  • the FIFO modules 150 synchronize data from an asynchronous clock (which may be generated by the physical interface 24) to a system clock (such as a master time base 174, described further below), perform a CRC check and signal valid or invalid data reception, calculate a CRC and transmit control data to the digital camera 12, convert Bayer CFA data from 8-bit to 24-bit RGB output, and output CIF resolution data (for example, 352×288) and VGA resolution data (for example, 640×480).
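The Bayer CFA to 24-bit RGB conversion mentioned above can be pictured with the simplest possible demosaic: for every pixel, keep the sample that exists at that site and average neighboring samples for the two missing channels. The RGGB layout, the averaging rule, and the function names are assumptions; the patent only states that 8-bit CFA data is converted to 24-bit RGB output.

```python
import numpy as np

def demosaic_rggb(cfa: np.ndarray) -> np.ndarray:
    """Small bilinear-style demosaic of an 8-bit RGGB Bayer image.

    Returns an (H, W, 3) uint8 array, i.e. 24-bit RGB as produced by the
    base unit FIFO modules 150. Borders are handled by edge padding.
    """
    h, w = cfa.shape
    padded = np.pad(cfa.astype(np.float32), 1, mode="edge")
    rgb = np.zeros((h, w, 3), dtype=np.float32)
    for y in range(h):
        for x in range(w):
            win = padded[y:y + 3, x:x + 3]
            center = win[1, 1]
            cross = (win[0, 1] + win[2, 1] + win[1, 0] + win[1, 2]) / 4
            diag = (win[0, 0] + win[0, 2] + win[2, 0] + win[2, 2]) / 4
            horiz = (win[1, 0] + win[1, 2]) / 2
            vert = (win[0, 1] + win[2, 1]) / 2
            if y % 2 == 0 and x % 2 == 0:        # red site
                r, g, b = center, cross, diag
            elif y % 2 == 1 and x % 2 == 1:      # blue site
                r, g, b = diag, cross, center
            elif y % 2 == 0:                     # green site on a red/green row
                r, g, b = horiz, center, vert
            else:                                # green site on a green/blue row
                r, g, b = vert, center, horiz
            rgb[y, x] = (r, g, b)
    return rgb.astype(np.uint8)
```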
  • a pre-video processor multiplexer 152 receives the 24-bit RGB data from the FIFO modules 150 and creates a multiplexed high-speed data stream for processing to reduce the logic required for processing a plurality of video inputs.
  • the 24-bit RGB data is preferably multiplexed into one 24-bit stream at 62.5 MHz (for both VGA and CIF resolutions), and a color space conversion to YCrCb 4:4:4 (which is a color space similar to YUV) is performed.
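The color space conversion to YCrCb is a fixed 3x3 matrix with offsets on the chroma channels. The BT.601-style, full-range coefficients below are a common choice and are assumed here, since the text only names the target format (YCrCb 4:4:4).

```python
def rgb_to_ycrcb(r: int, g: int, b: int) -> tuple[int, int, int]:
    """Convert one 8-bit RGB pixel to YCrCb 4:4:4 (full-range, BT.601-style)."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cr =  0.5 * r - 0.4187 * g - 0.0813 * b + 128
    cb = -0.1687 * r - 0.3313 * g + 0.5 * b + 128
    clamp = lambda v: max(0, min(255, int(round(v))))
    return clamp(y), clamp(cr), clamp(cb)

# Example: a pure white pixel maps to maximum luma and neutral chroma.
# rgb_to_ycrcb(255, 255, 255) -> (255, 128, 128)
```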
  • a decoder interface 154 receives data from the video decoder 112 and synchronizes the asynchronous decoder data to the system clock, converts the 4:2:2 data to 4:4:4 data for processing, scales the VGA resolution to CIF resolution, and outputs VGA resolution data.
  • a video processing module 156 receives the YCrCb 4:4:4 multiplexed 24-bit stream at 62.5 MHz from the pre-video processor multiplexer 152 and also receives the YCrCb 4:4:4 data as well as the VGA resolution data from the decoder interface 154 .
  • This received data is processed by multiplexing the data from the decoder interface 154 into a high-speed serial digital camera stream, performing a 4:4:4 to 4:2:2 conversion, parsing the data stream into three potential outputs (to the display 106, the USB interface 110, and/or the NTSC or video decoder 112), and stacking the display and USB bits into 32-bit wide words.
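The 4:4:4 to 4:2:2 conversion halves the chroma horizontally, so one Cr/Cb pair is shared by two luma samples, and the resulting bytes can then be stacked into 32-bit words for the display and USB paths. The chroma averaging and the Y0, Cb, Y1, Cr byte order below are illustrative assumptions, not details given in the patent.

```python
def ycrcb444_to_422(pixels: list[tuple[int, int, int]]) -> list[int]:
    """Convert a line of (Y, Cr, Cb) 4:4:4 pixels to a packed 4:2:2 byte stream.

    Chroma for each pair of pixels is averaged; the (Y0, Cb, Y1, Cr) output
    order is one common convention, assumed here.
    """
    out = []
    for i in range(0, len(pixels) - 1, 2):
        (y0, cr0, cb0), (y1, cr1, cb1) = pixels[i], pixels[i + 1]
        out += [y0, (cb0 + cb1) // 2, y1, (cr0 + cr1) // 2]
    return out

def stack_32bit_words(byte_stream: list[int]) -> list[int]:
    """Pack the byte stream into 32-bit wide words, as done for the display/USB FIFOs."""
    return [
        (byte_stream[i] << 24) | (byte_stream[i + 1] << 16)
        | (byte_stream[i + 2] << 8) | byte_stream[i + 3]
        for i in range(0, len(byte_stream) - 3, 4)
    ]
```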
  • the video processing module 156 outputs reformatted data to an LCD top module 158 and to a VCR top module 160.
  • the LCD top module 158 performs SDRAM memory control functions and formats the output data to the required output format for display via the display 106 .
  • the VCR top module 160 performs SDRAM memory control functions and formats the output data to the required output format to the microprocessor 108 (for USB data) or to the LCD 106 for display.
  • the LCD top module 158 outputs a 16-bit YUV data stream to an LCD processing module 162 and to a microprocessor interface 164 .
  • the LCD processing module 162 interfaces to the display 106 , formats the data to the display and provides the following functions: YCrCb 4:2:2 to 4:4:4 conversion, color, brightness, contrast, and sharpness adjustment, YCrCb to RGB conversion, RGB 18-bit formatting, and on screen display (such as a menu) insertion.
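The display-side steps named above can be illustrated together: expand 4:2:2 back to 4:4:4 by replicating chroma, apply brightness and contrast to luma, convert YCrCb to RGB, and truncate to the 18-bit (6 bits per channel) panel format. The adjustment model, the chroma replication, and the inverse-matrix coefficients (the BT.601-style counterpart of the earlier sketch) are assumptions for illustration.

```python
def chroma_422_to_444(packed: list[int]) -> list[tuple[int, int, int]]:
    """Expand a (Y0, Cb, Y1, Cr) 4:2:2 byte stream back to per-pixel (Y, Cr, Cb)."""
    pixels = []
    for i in range(0, len(packed) - 3, 4):
        y0, cb, y1, cr = packed[i:i + 4]
        pixels += [(y0, cr, cb), (y1, cr, cb)]    # replicate chroma for both pixels
    return pixels

def ycrcb_to_rgb18(y: int, cr: int, cb: int,
                   brightness: int = 0, contrast: float = 1.0) -> tuple[int, int, int]:
    """Adjust, convert, and format one pixel for an 18-bit RGB panel.

    Loosely mirrors the LCD processing module 162: brightness/contrast on
    luma, YCrCb-to-RGB conversion, then truncation to 6 bits per channel.
    """
    y = contrast * (y - 128) + 128 + brightness   # assumed adjustment model
    r = y + 1.402 * (cr - 128)
    g = y - 0.714136 * (cr - 128) - 0.344136 * (cb - 128)
    b = y + 1.772 * (cb - 128)
    clamp8 = lambda v: max(0, min(255, int(round(v))))
    return clamp8(r) >> 2, clamp8(g) >> 2, clamp8(b) >> 2  # keep the top 6 bits
```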
  • the microprocessor interface 164 stores the system wide control registers 166 and interfaces to the microprocessor 108 .
  • the VCR top module 160 outputs a 16-bit YUV data stream to the plurality of encoders 168-172 that format the data to the necessary configuration for the NTSC encoder interfaces, such as the connection plugs 116, 144.
  • the master time base 174 generates the timing for the entire system 10 . It has multiple counters on different clocks to control the different system inputs and outputs.
  • the time bases controlled by the master time base 174 include: vertical synchronization to the camera 12 (or to a plurality of cameras), NTSC encoder output, LCD master timing, LCD SDRAM frame synchronization, and NTSC SDRAM frame synchronization.
  • a clock generator 176 generates various clock frequencies for the master time base 174 to operate, while a VCR alarm module 178 instructs a VCR to record under alarm conditions.
  • the VCR alarm module 178 may be connected with the keyboard 138 .
  • the digital observation system 10 comprises a digital camera and a base unit.
  • the digital camera includes a CCD image sensor memory adapted to store video data, a CCD image sensor adapted to transmit the stored video data in analog RGB color space format native to the CCD image sensor (analog RGB data), wherein the CCD image sensor comprises the CCD image sensor memory, a CDS circuit adapted to receive the analog RGB data from the CCD image sensor and convert the analog RGB data into digital RGB data, wherein the CDS circuit is operably coupled to the CCD image sensor, a PLD adapted to: receive the transmitted digital RGB data from the CDS circuit, convert the received digital RGB data into YUV color space format (digital YUV data), and transmit the digital YUV data to the CDS circuit, wherein the digital camera PLD interfaces to the CDS circuit, and a physical interface adapted to receive the digital YUV data and transmit the digital YUV data with reduced operational overhead and increased operational functionality, wherein the physical interface is operably coupled to the CDS circuit.
  • the base unit includes a base unit physical interface adapted to receive the digital YUV data with the reduced operational overhead and the increased operational functionality from the digital camera physical interface, a base unit PLD operably coupled to the base unit physical interface, and a display adapted to display the digital YUV data with reduced operational overhead and increased operational functionality, wherein the display is operably coupled to the PLD.
  • the digital camera PLD is further adapted to control timing signals to the CCD image sensor to transmit the digital YUV data to the CDS circuit, and to delay one line of the digital YUV data transmission (with the reduced operational overhead and the increased operational functionality) when a transmission error correction occurs.
  • the digital YUV data with the reduced operational overhead and the increased operational functionality is further transmitted from the digital camera physical interface and received by the base unit physical interface via a physical layer device.
  • a digital observation system comprises a first module (for example, a digital camera) and a second module (for example, a base unit).
  • the first module includes a memory adapted to store video data, a first circuit (for example, a CCD image sensor) adapted to transmit the stored video data in a first representation of a first format (for example, analog RGB data), wherein the first circuit comprises the memory, a second circuit (for example, a CDS) adapted to receive the data in the first representation of the first format from the first circuit and convert the data in the first representation of the first format into a second representation of the first format (for example, digital RGB data), wherein the first circuit is operably coupled to the second circuit, a third circuit (for example, a PLD) adapted to control timing signals to the first circuit to transmit the data in the first representation of the first format to the second circuit, wherein the third circuit interfaces to the first circuit and to the second circuit, and a fourth circuit (for example, an interface) adapted to receive the data in the second representation of the first format from the second circuit and transmit the data in the second representation of the first format, wherein the fourth circuit is operably coupled to the second circuit.
  • the second module includes a first circuit (for example, a base unit interface) adapted to receive the transmitted data in the second representation of the first format, a second circuit (for example, a display), and a third circuit (for example, a base unit PLD) adapted to control timing signals to the first circuit of the second module to transmit the data in the second representation of the first format for display via the second circuit, wherein the third circuit interfaces to the first circuit of the second module and interfaces to the second circuit of the second module.
  • the method begins at steps 180 and 182 , respectively, with receiving digital RGB data at a first module (for example, a first multiplexer of a two video line delay memory), and storing the data in a second module (for example, a first line delay).
  • receiving the data at a third module (for example, a second multiplexer) from the second module occurs (the second module is operably coupled to the first module and the third module).
  • the method proceeds to steps 186 and 188 , respectively, where reducing a number of bits per pixel of the data in a fourth module (for example, a data reduction module), and increasing a number of bits per pixel of a first set of the data in a fifth module (for example, a gamma correct module) occur. Determining a cyclic redundancy check (CRC) for a line of the data in a sixth module (for example, a CRC generator) occurs in step 190 .
  • the method may further comprise adding the CRC, by the sixth module, to the end of the data line, and transmitting, by a seventh module (for example, a physical interface) to a destination (for example, a base unit): header data and the data line comprising the CRC.
  • the method begins at steps 200 and 202 , respectively, with a receiving of data (such as image sensor data) by a first module (such as one or more of the FIFO modules 150 ) from an origination (such as the digital camera 12 ), and converting, by the first module, the received data to RGB data (such as 24-bit RGB output data).
  • multiplexing the RGB data into one bit stream (such as a 24-bit stream) by a second module occurs, and, at step 206 , performing a color space conversion from the RGB data to YCrCb data (such as YCrCb 4:4:4 data) by the second module occurs.
  • performing a color space conversion from the YCrCb data to another YCrCb data format (such as a conversion from YCrCb 4:4:4 to YCrCb 4:2:2) by a third module (such as the video processing module 156 ) occurs.
  • the method proceeds with formatting the other YCrCb data to a required output format by a fourth module (such as the LCD top 158 ) at step 210 , and converting the formatted data to RGB data (such as RGB 4:4:4 data) by a fifth module (such as the LCD processing module 162 ) at step 212 .
  • the fifth module may additionally perform color, sharpness, contrast, and brightness alterations.
  • the method may further include steps for multiplexing locally received data (such as data received from the decoder 154 ) and remotely received data (such as the YCrCb 4:4:4 data) into a high speed digital stream by the third module, and formatting the output data to various memory locations (such as memories 134 ).
  • a plurality of cameras 12 and base units 14 may be utilized with the present invention.
  • the camera physical transceiver 24 and the base unit physical transceiver 26 may be operably coupled to each other via other connections, including copper, fiber and wireless, if the transceivers were modified to accommodate such other connections.
  • the connection 25 may comprise an entity with dynamic characteristics thus altering maximum travel time, data transmission travel time, acknowledgement time, line time, and processing time.
  • system 10 may operate at a varying distance which will alter the travel time and time for the system to start and process any transmissions. Further, various data transmission protocols comprising different information and/or sizes of information may be used with the system 10 . Additionally, a lesser or greater number of modules may comprise the system 10 , the digital camera 12 , and/or the base unit 14 .

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

A digital observation system and method for processing and transmitting video data between a video camera, or video cameras, and a base unit. The video data is transmitted, for example, by a communication protocol that is compliant with Ethernet physical drivers for transmitting and receiving data at around 100 Mbps. Video is captured at a sensor in the video camera, digitally processed and transmitted, thus overcoming limitations associated with analog processing and allowing unique features to be added. Images and other data may be transmitted efficiently in their native format, with reduced overhead, and in a non-compressed format due to the data transmission rate at or below 100 Mbps.

Description

RELATED APPLICATIONS
The present application is an application for reissue of U.S. Pat. No. 7,312,816, and is a continuation of U.S. application Ser. No. 12/642,698, filed Dec. 18, 2009, which is also an application for reissue of U.S. Pat. No. 7,312,816, now U.S. Pat. No. Re. 43,786.
The present invention is related to patent application Ser. No. 10/202,968 titled DIGITAL TRANSMISSION SYSTEM, to patent application Ser. No. 10/202,668 titled DIGITAL CAMERA SYNCHRONIZATION, and to patent application Ser. No. 10/202,257 titled UNIVERSAL SERIAL BUS DISPLAY UNIT. These applications are commonly assigned, commonly filed, and are incorporated by reference herein.
1. FIELD OF THE INVENTION
The present invention relates to observation systems and, more particularly, to a digital observation system comprising a digital camera and a base unit.
2. BACKGROUND OF THE INVENTION
A conventional observation system is based on standard analog cameras attached, via a specialized cable, to a Cathode Ray Tube monitor for display. All video processing is performed in the analog domain. This type of conventional system has several limitations including noisy video, low interlaced resolution, slow frame rate or image roll during switching, limited display capabilities and limited storage and processing options. Therefore, it is desirable for the present invention to overcome the conventional limitations associated with processing video in the analog domain.
SUMMARY OF THE INVENTION
The present invention achieves technical advantages as a digital observation system and method for processing and transmitting video data between a video camera (or video cameras) and a base unit via, for example, a communication protocol that is compliant with Ethernet physical drivers for transmitting and receiving data at around 100 Mbps. With such a digital observation system, video is captured at a sensor in the video camera, digitally processed and transmitted, thus overcoming the aforementioned limitations and allowing unique features to be added. Images and other data may be transmitted efficiently (in their native format), with reduced overhead (header information can be minimized), and in a non-compressed format (since transmission occurs at or below 100 Mbps).
In one embodiment, a digital observation system comprises a digital camera and a base unit. The digital camera includes a charge coupled device (CCD) image sensor memory, a CCD image sensor, a correlated double sampling (CDS) circuit, and a physical interface. The CCD image sensor memory is adapted to store video data, while the CCD image sensor is adapted to transmit the stored video data in analog RGB color space format native to the CCD image sensor (analog RGB data), where the CCD image sensor comprises the CCD image sensor memory. The CDS circuit is adapted to receive the analog RGB data from the CCD image sensor and convert the analog RGB data into digital RGB data and the physical interface is adapted to receive the digital RGB data and transmit the digital RGB data with reduced operational overhead and increased operational functionality. The base unit includes a base unit physical interface, a base unit programmable logic device (PLD), and a display. The base unit physical interface is adapted to receive the digital RGB data with the reduced operational overhead and the increased operational functionality from the digital camera physical interface and display it via the display. The base unit PLD is coupled to the base unit physical interface and the display.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a block diagram of a digital observation system in accordance with an exemplary embodiment of the present invention.
FIG. 2 illustrates a block diagram of a digital camera in accordance with an exemplary embodiment of the present invention.
FIG. 3 illustrates a block diagram of a programmable logic device of the digital camera in accordance with an exemplary embodiment of the present invention.
FIG. 4 illustrates a block diagram of a base unit in accordance with an exemplary embodiment of the present invention.
FIG. 5 illustrates a block diagram of a programmable logic device of the base unit in accordance with an exemplary embodiment of the present invention.
FIG. 6 illustrates a flow chart for data processing in accordance with an exemplary embodiment of the present invention.
FIG. 7 illustrates an alternate flow chart for data processing in accordance with an exemplary embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Referring now to FIG. 1, a digital observation system 10 of the present invention is presented. The system 10 includes at least one digital camera 12 coupled to at least one base unit 14 via, for example, an Ethernet connection.
Referring now to FIG. 2, the digital camera 12 is presented. The digital camera 12 includes a charge coupled device (CCD) image sensor 20 which includes a CCD memory (not shown), a correlated double sampling (CDS) circuit 22, and a physical interface 24. Although the image sensor preferably used by the present invention is a CCD image sensor, a complementary metal oxide semiconductor (CMOS) image sensor may also be used. The CCD image sensor is a collection of tiny light-sensitive diodes, called photosites, which convert photons (light) into electrons (electrical charge). Each photosite is sensitive to light, wherein the brighter the light that hits a single photosite, the greater the electrical charge that will accumulate at that site. The value (accumulated charge) of each cell in an image is read and an analog-to-digital converter (ADC—not shown) turns each pixel's value into a digital value. In order to get a full color image, the sensor uses filtering to “look at” the light in its three primary colors (Red Green Blue or RGB) optically or electrically, filtered or unfiltered. Once all three colors have been recorded, they can be added together to create a full spectrum of colors.
The CCD memory is adapted to store video data and the CCD image sensor 20 is adapted to transmit the stored video data in analog RGB color space format native to the CCD image sensor (such data will hereinafter be referred to as “analog RGB data”). The CDS circuit 22 is adapted to receive the analog RGB data from the CCD image sensor 20 and convert the analog RGB data into digital RGB data, wherein the CDS circuit is operably coupled to the CCD image sensor. The physical interface 24, such as an Ethernet physical interface for example, is adapted to receive the digital RGB data and transmit the digital RGB data with reduced operational overhead and increased operational functionality via an RJ45 (or similar) connector 26, wherein the physical interface is operably coupled to the CDS circuit 22. The operational overhead includes at least one of: a multi-drop operation, error detection, source addressing, destination addressing, and multi-speed operation, while the operational functionality includes at least one of: header data, secondary data, and error correction.
The digital camera 12 further includes a Field-Programmable Gate Array (FPGA) or Programmable Logic Device (PLD) 28 that interfaces to the CDS circuit 22, controls timing signals to the CCD image sensor 20 to transmit the analog RGB data to the CDS circuit, and is adapted to delay one line of the digital RGB data transmission (with the reduced operational overhead and the increased operational functionality) when a transmission error correction occurs. The PLD 28 is a circuit that can be programmed to perform complex functions. The digital camera PLD 28 is more fully discussed in the description of FIG. 3 below. An erasable programmable read-only memory 30, which is a type of memory that retains its contents until it is exposed to ultraviolet light (the ultraviolet light clears its contents, making it possible to reprogram the memory), is coupled to the PLD 28. Additionally, a horizontal driver 32, a vertical driver 34, and an RJ11 (or similar) connector 40 are coupled to the PLD 28. The horizontal driver 32 and the vertical driver 34, further coupled to the CCD image sensor 20, convert logic level signals to voltage levels that the CCD image sensor can utilize. A “remote control” message may be sent from the PLD 28 to an external module (such as a voice or data platform), via the RJ11 connector 40, to indicate data to be transferred between the digital camera 12 and the external module (not shown) via commands from the base unit 14 (attached to the digital camera 12 via the RJ45 connector 26). A “call input” message may be sent to the PLD 28 from the external module.
The digital camera 12 also includes a microphone 36 coupled to an audio amplifier 38, for supporting full duplexed audio, which is further coupled to the RJ11 connector 40. The digital camera 12 preferably utilizes a single input voltage of 14V to 40V and generates the multiple voltages that are needed by the logic devices and the CCD image sensor 20. These multiple voltages include a linear regulated output voltage 42 for the RJ11 device interface which provides 12 V (25 mA max), a main system power supply 44 which generates all of the multiple voltages other than the linear regulated output voltage 42 and provides 3.3 V (1.65 A), a core voltage 46 for the PLD 28 which provides 2.5 V (mA), a CCD amplifier voltage 48 which provides 15 V (mA), and a CCD substrate bias which provides −5.5 V (mA). The digital camera 12 additionally utilizes various clocks including clocks for CCD timing and clock generation, for outputting to the physical interface 24, and a main video processing clock.
In a preferred embodiment, the digital camera 12 comprises a memory adapted to store video data, a first circuit (for example, the CCD image sensor 20) adapted to transmit the stored video data in a first representation of a first format (for example, analog RGB data), wherein the first circuit comprises the memory, a second circuit (for example, the CDS circuit 22) adapted to receive the data in the first representation of the first format from the first circuit and convert the data in the first representation of the first format into a second representation of the first format (for example, digital RGB data), wherein the first circuit is operably coupled to the second circuit, a third circuit (for example, the digital camera PLD) adapted to control timing signals to the first circuit to transmit the data in the first representation of the first format to the second circuit, wherein the third circuit interfaces to the first circuit and to the second circuit, and a fourth circuit (for example, the interface) adapted to receive the data in the second representation of the first format from the second circuit and transmit the data in the second representation of the first format, wherein the fourth circuit is operably coupled to the second circuit.
In an alternate embodiment, the digital camera 12 comprises a memory adapted to store video data, a first circuit adapted to transmit the stored video data in a first format, wherein the first circuit comprises the memory, a second circuit adapted to receive the data in the first representation of the first format from the first circuit, wherein the first circuit is operably coupled to the second circuit, a third circuit adapted to control timing signals to the first circuit to transmit the data in the first format to the second circuit and convert the data in the first format into a second format (for example, digital RGB data), wherein the third circuit interfaces to the first circuit and to the second circuit, and a fourth circuit adapted to receive the data in the second format from the second circuit and transmit the data in the second format, wherein the fourth circuit is operably coupled to the second circuit.
Referring now to FIG. 3, the digital camera PLD 28 is presented. The digital camera PLD 28, which contains logic that is responsible for timing, video processing, data reception and data transmission, includes a two video line delay memory 52 adapted to receive the digital RGB data from the CDS circuit 22. The digital RGB data is received at a first multiplexer 54 which transfers the data into a first line delay 56 or into a second line delay 58 depending on which line delay is used for storing new data and which line delay is used for outputting to a destination such as the base unit 14. For example, the first line delay 56 may be used to read in a new line of data from the CCD image sensor 20 via the CDS circuit 22 while the second line delay 58 may output the data to be processed. If a line of data has to be repeated then the CCD image sensor 20 does not read in a new line into the first line delay 56 and the last line readout of the second line delay 58 is repeated. A second multiplexer 60 transfers the line of data to be output to video processing modules (such as a data reduction circuit 72 and a gamma correct circuit 76) and transmission modules (such as a cyclic redundancy check (CRC) generator 78 and an Ethernet physical data transmission module 80). Such modules are described further below. The two line delays 56, 58 collectively hold the last two lines read out from the CCD image sensor 20. One line contains red and green data while the other line contains green and blue data. This is the format of the data in the CCD image sensor 20.
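The alternating use of the two line delays can be pictured as a ping-pong pair of buffers: one fills with the incoming line from the CDS circuit while the other drains to the processing and transmission path, and repeating a line simply means not swapping their roles. The class below is only a software sketch of that behavior (the patent describes dedicated PLD memory, not code), and all names are illustrative.

```python
class TwoLineDelay:
    """Ping-pong pair of line buffers, modeling the two video line delay memory 52.

    Together the two buffers hold the last two Bayer lines (one red/green,
    one green/blue), which is why the Y conversion circuit reads from both.
    """

    def __init__(self, line_length: int):
        self.buffers = [[0] * line_length, [0] * line_length]
        self.write_idx = 0   # buffer currently filled from the CDS circuit
        self.read_idx = 1    # buffer currently output for processing/transmission

    def store_new_line(self, pixels: list[int]) -> None:
        self.buffers[self.write_idx][:] = pixels

    def output_line(self) -> list[int]:
        return list(self.buffers[self.read_idx])

    def advance(self, repeat_last: bool = False) -> None:
        # If the last line must be repeated (for example after a transmission
        # error), the roles are not swapped, so the same line is output again
        # and no new line is clocked in from the CCD image sensor.
        if not repeat_last:
            self.write_idx, self.read_idx = self.read_idx, self.write_idx
```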
A Y conversion circuit 64 requires red, green, and blue data, so data from both line delays 56, 58 are read into the Y conversion circuit each time a conversion from RGB to YUV is to be calculated. YUV is a format that represents the signal as luminance and chrominance information and is a widely used video format for the transmission of digital video data. The Y conversion circuit 64 sends the YUV data to a Y average circuit 66 which progressively calculates the average Y value for the whole image as all lines of the frame of data are read out. The Y average circuit 66 basically finds the average brightness of the whole image. Once the Y value for the whole image has been calculated, the data is sent to an exposure control circuit 68 which calculates the correct timing signals to be sent to the CCD image sensor 20 to get the proper exposure control for the CCD image sensor. The exposure control circuit 68 also sends an amplifier gain control to the CDS circuit 22 which includes a variable gain amplifier (not shown) to adjust the signal level from the CCD image sensor 20 before the analog-to-digital conversion.
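The exposure loop described above amounts to comparing the frame's average luma with a target and nudging the exposure and amplifier gain accordingly. The luma weights, target value, and step size below are illustrative assumptions; the actual exposure control circuit 68 adjusts CCD timing signals and the CDS variable-gain amplifier rather than a software value.

```python
def luma(r: int, g: int, b: int) -> float:
    # BT.601 luma weights; the patent does not specify the Y conversion coefficients.
    return 0.299 * r + 0.587 * g + 0.114 * b

def frame_average_luma(pixels: list[tuple[int, int, int]]) -> float:
    """Average brightness of the whole image, as found by the Y average circuit 66."""
    return sum(luma(r, g, b) for r, g, b in pixels) / len(pixels)

def adjust_exposure(avg_y: float, exposure: float,
                    target: float = 128.0, step: float = 0.05) -> float:
    """Simple proportional adjustment toward a target average luma."""
    if avg_y < target:
        return exposure * (1.0 + step)   # under-exposed: lengthen exposure / raise gain
    if avg_y > target:
        return exposure * (1.0 - step)   # over-exposed: shorten exposure / lower gain
    return exposure
```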
The Y conversion circuit also sends the Y value for each pixel to an auto tracking white (ATW) coefficient circuit 70 to exclude pixels with too large or too small a Y value and thereby improve the ATW performance. ATW is used to make color corrections to ensure that white is rendered as the correct color. The ATW coefficient circuit 70 calculates an average red pixel value for the whole image, along with the same for the blue and green pixels. It then calculates correction factors (gain changes) for the red pixels and the blue pixels so that the red, green, and blue pixel averages for the whole frame are the same. As such, the whole frame includes equal red, green, and blue averages. These correction factors are fed to the multiplier 62 that connects the second multiplexer 60 and the data reduction circuit 72. The red correction factor is multiplied onto all red pixels and the blue correction factor is multiplied onto all blue pixels. The data reduction circuit 72 is used to reduce the number of bits for each pixel back to nine bits. The CDS circuit 22 outputs ten bits for each pixel and the multiplier 62 increases that number by several more bits; the data reduction circuit 72 truncates this amount back to nine bits.
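One plausible way to realize the correction described above, equalizing the frame-wide red, green, and blue averages by scaling the red and blue pixels toward the green average, is sketched below. The gain formula and the Q8.8 fixed-point scaling are assumptions, since the patent does not give the exact arithmetic of the ATW coefficient circuit 70 or the multiplier 62.

#include <stdint.h>

typedef struct { uint16_t r_gain_q8; uint16_t b_gain_q8; } atw_gains_t;

/* Compute red and blue gains (Q8.8 fixed point, 256 = unity) that pull
 * the frame-wide red and blue averages to the green average. Pixels
 * whose Y value was too large or too small are assumed to have been
 * excluded from the sums by the pixel mask. */
static atw_gains_t atw_compute_gains(uint64_t sum_r, uint64_t sum_g,
                                     uint64_t sum_b)
{
    atw_gains_t g = { 256, 256 };
    if (sum_r && sum_b) {
        g.r_gain_q8 = (uint16_t)((sum_g * 256u) / sum_r);
        g.b_gain_q8 = (uint16_t)((sum_g * 256u) / sum_b);
    }
    return g;
}

/* Apply a gain to one pixel; the multiplier widens the word, and the
 * data reduction circuit would then truncate the result back down. */
static uint16_t atw_apply_gain(uint16_t pixel, uint16_t gain_q8)
{
    return (uint16_t)(((uint32_t)pixel * gain_q8) >> 8);
}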
From the data reduction circuit 72, the pixels are sent either to a pixel mask 74, which is used to improve the performance of the ATW coefficient circuit 70, or to the gamma correct circuit 76, which puts each pixel value through a non-linear transfer function to enhance the values of pixels that are low. This action also corrects for the non-linear transfer function of the final display in the base unit 14. The CRC generator 78 calculates a CRC value for each line transmitted to the base unit 14. The CRC value is added to the end of the line transmission so that the receiving base unit 14 can make the same calculation for the line of data it received to verify that the data received is correct. The Ethernet physical data transmission module 80 outputs header data to the base unit 14 prior to transmitting the line of video data. The header data consists of a preamble (to alert the receiver that new data is coming) and secondary data (such as the line number of the data to be transmitted) that the digital camera 12 needs to send to the base unit 14. The line of video data and the CRC data are transmitted as one continuous stream of data until the CRC is complete. In a preferred embodiment, the data is input into the Ethernet physical data transmission module 80 in groups of 4 bits at a time, wherein two groups of 4 bits contain one pixel of data. As such, a pixel is sent as an 8-bit word.
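The per-line framing, a preamble and secondary header data followed by the pixel data and the trailing CRC, might be modeled as follows. The CRC-16/CCITT polynomial, the preamble byte, and the header layout shown are assumptions chosen for illustration; the patent specifies only that a preamble and secondary data such as the line number precede each line and that the CRC is appended at the end.

#include <stdint.h>
#include <stddef.h>

/* Bitwise CRC-16/CCITT (polynomial 0x1021); the actual polynomial used
 * by the CRC generator 78 is not specified in the patent. */
static uint16_t crc16_update(uint16_t crc, uint8_t byte)
{
    crc ^= (uint16_t)byte << 8;
    for (int i = 0; i < 8; i++)
        crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                             : (uint16_t)(crc << 1);
    return crc;
}

/* Frame one video line: preamble, line number, pixel bytes, then the CRC
 * appended to the end of the line. 'out' must hold n_pixels + 5 bytes.
 * Each 8-bit pixel word would be handed to the physical layer as two
 * groups of 4 bits. Returns the number of bytes written. */
static size_t frame_line(uint8_t *out, uint16_t line_number,
                         const uint8_t *pixels, size_t n_pixels)
{
    size_t n = 0;
    uint16_t crc = 0xFFFF;

    out[n++] = 0x55;                         /* illustrative preamble byte */
    out[n++] = (uint8_t)(line_number >> 8);  /* secondary data: line number */
    out[n++] = (uint8_t)line_number;
    for (size_t i = 0; i < n_pixels; i++) {
        out[n++] = pixels[i];
        crc = crc16_update(crc, pixels[i]);
    }
    out[n++] = (uint8_t)(crc >> 8);          /* CRC added at end of line */
    out[n++] = (uint8_t)crc;
    return n;
}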
The Ethernet physical data receiver module 82, which is used to receive data from the base unit 14, removes the header data and outputs 4-bit words to a CRC checker module 84. It should be understood that the data may be input into the Ethernet physical data transmission module 80 and output to the CRC checker module 84 in groups larger and/or smaller than the aforementioned amount. The CRC checker module 84 calculates the CRC value as the data is received and sends the data to a data flow controller 86, which holds all of the data until all of it is received and the CRC is checked. If the CRC check shows the data is correct, the data flow controller 86 sends the data to the main time base 88. The data sent informs the digital camera 12 whether it is time to start a new frame of data or whether the last line was received, and further includes any control signals that the base unit 14 needs to send the camera to control the operation of the camera. The main time base 88 controls all of the timing functions of the camera 12 (but is preferably slaved to the base unit 14 time base), including exposure control and video processing synchronization. The main time base 88 can also send signals to a remote control and call controller 90 which is connected to a connector (not shown) on the camera 12, enabling signals to be input into the camera or output from the camera to control external devices (not shown) connected to the camera. External devices connected to the camera 12 are thus enabled to communicate with the camera and the base unit 14 through the main time base 88 and the Ethernet physical data transmission and receiver modules 80, 82. The main time base 88 also controls all of the timing for the CCD image sensor 20 through a CCD clock generator 90 and by sending signals directly to the CCD image sensor. The main time base further sends signals to a CDS controller 94, which sets all of the configurations for the CDS circuit 22 and controls the synchronization of the CDS samples of the analog RGB data.
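On the receive side, the flow-control behavior, holding the received block until it is complete and the CRC checks out before releasing it to the main time base, could be sketched as below; the polynomial is the same assumption used in the transmit-side sketch above.

#include <stdint.h>
#include <stddef.h>

/* Bitwise CRC-16/CCITT over a buffer (same assumed polynomial as the
 * transmit-side sketch). */
static uint16_t crc16(const uint8_t *p, size_t n)
{
    uint16_t crc = 0xFFFF;
    while (n--) {
        crc ^= (uint16_t)*p++ << 8;
        for (int i = 0; i < 8; i++)
            crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                 : (uint16_t)(crc << 1);
    }
    return crc;
}

/* Model of the data flow controller 86: release the payload only after
 * the whole block has arrived and its trailing 16-bit CRC matches.
 * Returns the payload length, or -1 when the CRC check fails. */
static int release_if_valid(const uint8_t *block, size_t n_bytes,
                            const uint8_t **payload)
{
    if (n_bytes < 2)
        return -1;
    uint16_t received = (uint16_t)(((uint16_t)block[n_bytes - 2] << 8) |
                                   block[n_bytes - 1]);
    if (crc16(block, n_bytes - 2) != received)
        return -1;
    *payload = block;
    return (int)(n_bytes - 2);
}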
In a preferred embodiment, the digital camera PLD 28 is adapted to control timing signals to the CCD image sensor 20 to transmit analog RGB data to a CDS circuit 22 where the analog RGB data is converted to digital RGB data, and delay one line of the digital RGB data transmission (by a two video line delay memory) when a transmission error correction occurs, wherein the digital RGB data transmission comprises the reduced operational overhead and the increased operational functionality.
In an alternate embodiment, the digital camera PLD 28 is adapted to receive digital RGB data from the CDS circuit 22, convert the received digital RGB data into digital YUV data, and transmit the digital YUV data to the CDS circuit 22, wherein the digital camera PLD interfaces to the CDS circuit.
Referring now to FIG. 4, the base unit 14 is presented. The base unit 14 comprises a camera interface module (or base unit physical interface) 100 which interfaces to the digital camera 12 via RJ45 connectors 102. The camera interface module 100 is operably coupled to a base unit FPGA/PLD 104 which is further operably coupled to a display 106 which is adapted to display the received video data.
The base unit 14 further comprises a microprocessor including a USB interface 108 that receives (or is adapted to receive) video data from a first source, such as, for example, a personal computer (not shown) via a physical USB interface or data port 110, and a video decoder 112 that receives other video data from a second source, such as another video source, via a connection plug 114 and/or an RJ11 connector 116. The microprocessor 108 manages and controls the operational functions of the base unit 14, including managing the display 106, and also controls a user interface. The microprocessor 108 further controls the USB interface 110 and the data associated with it, including data flow management, data transfer and reception. The video decoder 112 is used to digitize the incoming analog video signal, and the output of the decoder 112 may be, for example, YUV digital video data with a 4:2:2 (intensity:reddishness:blueishness) sampling format. There are a plurality of bits of data for each pixel, and horizontal and vertical synchronization signals are output from the decoder 112 in addition to a data valid signal. The microprocessor 108 can transmit video data processed by the PLD 104 to the first source via the USB port 110, which may further receive audio data from the first source.
An audio circuit 118 takes audio data and amplifies it for output to a speaker 120 and/or to the RJ11 connector 116. Audio information can also be received via the RJ11 connector 116 and/or the microphone 122, and can also be output via the USB port 110 provided a D/A converter is present in the audio circuit. A real-time clock 124 transmits and receives time and date information between the video decoder 112, the microprocessor 108 and a second memory 126, and further stores configuration registers and timer functions. The second memory 126, which is operably coupled to the microprocessor 108 and the video decoder 112, maintains the operation code of the microprocessor. The base unit 14 preferably operates from an external 24V (or alternatively a 12V) DC wall mount power supply 128 that supplies all the power necessary for the display 106 to operate. A power supply 130 is designed to protect the base unit 14 from excess voltage inputs and to filter any noise from entering or exiting the base unit. The power supply 130 further creates multiple DC voltages (such as 1.8V, 3.3V, and 5V) to supply various portions of the base unit 14.
In an exemplary embodiment, the display 106 is a video display monitor utilizing an LCD active matrix display with a VGA resolution of 640 pixels by 480 lines (although the resolution could be higher or lower). The interface to the display 106 comprises a plurality of logic level clock signals that are used for clocking, synchronization, and data transfer. The power supply module 130, which receives power from an external adapter (not shown), creates a plurality of voltages to supply the display 106 and a backlight 132. The backlight 132 applies a voltage to tubes (not shown) that illuminate the display 106, where the tubes are operably coupled to the monitor. The base unit 14 will have enough memory, such as the memories 134 and 136 (which are preferably synchronous dynamic random access memories), to store a number of images so that the rate for switching display images is not affected by the transfer time of the data sent by the camera 12 and/or a first source over the USB connection 110. The base unit 14 may further be controlled by a keyboard 138 which is operably coupled to the PLD 104.
The PLD 104 is the primary controller for the functions of the various portions of the base unit 14. One of these functions includes managing (which includes reading, writing, and refreshing) the memories 134, 136 that are utilized to store the images that are to be displayed on the display 106. The data input to the memories 134, 136 is received from the camera interface module 100, the video decoder 112, and/or the USB microprocessor 108. The size of the memories 134, 136 is dependent on the screen resolution of the display 106, and the memories can contain multiple images for display as well as buffer memory that is utilized as a receiving buffer for new images. The output of the memory data is sent to a scaler (not shown) located in the PLD 104 to convert the data to the appropriate data size for the display 106.
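The way the memories 134, 136 decouple image switching from transfer time can be pictured as a small pool of stored images plus a separate receiving buffer that is promoted only when a complete new image has arrived. The sketch below is illustrative only; the buffer count, sizes, and structure names are assumptions, and in the actual base unit this storage lives in the SDRAMs under control of the PLD 104.

#include <stdint.h>
#include <string.h>

#define NUM_IMAGES  4                    /* assumed number of stored images */
#define IMAGE_BYTES (640 * 480 * 2)      /* e.g. 16 bits per pixel at VGA */

typedef struct {
    uint8_t image[NUM_IMAGES][IMAGE_BYTES];  /* images ready for display */
    uint8_t rx[IMAGE_BYTES];                 /* receiving buffer for new data */
    int     display_index;                   /* image currently being shown */
} frame_store_t;

/* Called once a complete image has been received: copy it into the next
 * slot and switch the display to it, so the switching rate never waits
 * on the transfer time of the incoming data. */
static void frame_store_commit(frame_store_t *fs)
{
    int slot = (fs->display_index + 1) % NUM_IMAGES;
    memcpy(fs->image[slot], fs->rx, IMAGE_BYTES);
    fs->display_index = slot;
}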
Other functions of the PLD 104 include controlling the data flow from the USB data port 110, the video decoder 112, the memories 134, 136, and the display 106, interfacing to the display, developing all the necessary signals for a time-base of the display, direct memory access controlling of the data from the microprocessor 108 to the memories 134, 136, managing the user interfaces, transmitting the data to the microprocessor, generating an “on screen display” thereby enabling a user to program and adjust display parameters, buffering video data as it transfers from different circuit areas that operate at different data rates, scaling the video data to be displayed to the appropriate resolution for the display, and controlling various first-in-first-out (FIFO) controllers (such as spot memory controllers and VCR memory controllers). Further functions include performing video processing such as enhancing the video by controlling the contrast, brightness, color saturation, sharpness, and color space conversion of the video data that is received.
In a preferred embodiment, the base unit 14 comprises the base unit physical interface 100 adapted to receive the digital RGB data with the reduced operational overhead and the increased operational functionality from the digital camera physical interface 24, the base unit PLD operably coupled to the base unit physical interface, and the display 106 adapted to display the digital RGB data with reduced operational overhead and increased operational functionality, wherein the display is operably coupled to the PLD. The base unit PLD is further adapted to control timing signals to display the digital RGB data with the reduced operational overhead and the increased operational functionality.
In a further embodiment, the base unit 14 comprises a first circuit (for example, the base unit interface 100) adapted to receive transmitted data in a second representation of a first format (for example, digital RGB data), a second circuit (for example, the display), and a third circuit (for example, the base unit PLD) adapted to control timing signals to the first circuit to transmit the data in the second representation of the first format for display via the second circuit, wherein the third circuit interfaces to the first circuit and to the second circuit.
In an alternate embodiment, the base unit 14 comprises a physical interface adapted to receive digital YUV data with reduced operational overhead and increased operational functionality from a digital camera physical interface, a PLD operably coupled to the base unit physical interface, and a display adapted to display the digital YUV data with reduced operational overhead and increased operational functionality, wherein the display is operably coupled to the PLD.
Referring now to FIG. 5, the base unit PLD 104 is presented. The base unit PLD 104 comprises a plurality of base unit FIFO modules 150 which manipulate the digital camera data received over, for example, an Ethernet transmission into a form that can be used by the base unit 14. The FIFO modules 150 synchronize data from an asynchronous clock (which may be generated by the physical interface 24) to a system clock (such as a master time base 174, described further below), perform a CRC check and signal a valid or non-valid data reception, calculate a CRC and transmit control data to the digital camera 12, convert Bayer CFA data from 8-bit to 24-bit RGB output, and output CIF resolution data (for example, 352×288) and VGA resolution data (for example, 640×480).
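Converting Bayer CFA data from 8 bits per photosite to 24-bit RGB requires supplying the two missing color components at every photosite. The patent does not describe which interpolation the FIFO modules 150 use, so the nearest-neighbor sketch below, which also assumes an RGGB pattern, is illustrative only.

#include <stdint.h>

/* Nearest-neighbour demosaic of an 8-bit Bayer (RGGB) frame into packed
 * 24-bit RGB: each 2x2 cell contributes one R, two G, and one B sample,
 * which are replicated across the cell's four output pixels. Width and
 * height are assumed to be even. */
static void bayer_rggb_to_rgb24(const uint8_t *bayer, uint8_t *rgb,
                                int width, int height)
{
    for (int y = 0; y < height; y += 2) {
        for (int x = 0; x < width; x += 2) {
            uint8_t r  = bayer[y * width + x];
            uint8_t g1 = bayer[y * width + x + 1];
            uint8_t g2 = bayer[(y + 1) * width + x];
            uint8_t b  = bayer[(y + 1) * width + x + 1];
            uint8_t g  = (uint8_t)(((int)g1 + (int)g2) / 2);

            for (int dy = 0; dy < 2; dy++)
                for (int dx = 0; dx < 2; dx++) {
                    uint8_t *p = &rgb[3 * ((y + dy) * width + (x + dx))];
                    p[0] = r; p[1] = g; p[2] = b;
                }
        }
    }
}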
A pre-video processor multiplexer 152 receives the 24-bit RGB data from the FIFO modules 150 and creates a multiplexed high-speed data stream for processing to reduce the logic required for processing a plurality of video inputs. The 24-bit RGB data is preferably multiplexed into one 24-bit stream at 62.5 MHz (for both VGA and CIF resolutions), and the multiplexer 152 performs a color space conversion to YCrCb 4:4:4 (a color space similar to YUV).
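The color space conversion to YCrCb 4:4:4 amounts to a 3x3 matrix applied to every pixel of the multiplexed stream. The integer BT.601 coefficients below are assumed; the patent does not specify which matrix the pre-video processor multiplexer 152 implements.

#include <stdint.h>

static uint8_t clamp_u8(int v) { return (uint8_t)(v < 0 ? 0 : v > 255 ? 255 : v); }

/* RGB to YCrCb 4:4:4 using integer BT.601 coefficients (assumed). The
 * 32768 offset keeps the chroma terms non-negative before shifting and
 * corresponds to the conventional +128 chroma midpoint. */
static void rgb_to_ycrcb(uint8_t r, uint8_t g, uint8_t b,
                         uint8_t *y, uint8_t *cr, uint8_t *cb)
{
    *y  = clamp_u8((77 * r + 150 * g + 29 * b) >> 8);
    *cr = clamp_u8((32768 + 128 * r - 107 * g - 21 * b) >> 8);
    *cb = clamp_u8((32768 - 43 * r - 85 * g + 128 * b) >> 8);
}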
A decoder interface 154 receives data from the video decoder 112 and synchronizes the asynchronous decoder data to the system clock, converts the 4:2:2 data to 4:4:4 data for processing, scales the VGA resolution to CIF resolution, and outputs VGA resolution data.
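Converting 4:2:2 data to 4:4:4 means supplying a Cr and a Cb value for every luma sample instead of for every pair of samples. The chroma replication and the Cb Y Cr Y byte order assumed below are illustrative; the decoder interface 154 could equally use an interpolating filter.

#include <stdint.h>
#include <stddef.h>

/* Expand one line of YCrCb 4:2:2 (packed Cb Y0 Cr Y1 per pixel pair)
 * into 4:4:4 (Y Cb Cr per pixel) by replicating each chroma pair across
 * both luma samples. n_pixels is assumed to be even. */
static void ycrcb_422_to_444(const uint8_t *in422, uint8_t *out444,
                             size_t n_pixels)
{
    for (size_t i = 0; i < n_pixels; i += 2) {
        uint8_t cb = in422[2 * i + 0];
        uint8_t y0 = in422[2 * i + 1];
        uint8_t cr = in422[2 * i + 2];
        uint8_t y1 = in422[2 * i + 3];

        out444[3 * i + 0] = y0;
        out444[3 * i + 1] = cb;
        out444[3 * i + 2] = cr;
        out444[3 * (i + 1) + 0] = y1;
        out444[3 * (i + 1) + 1] = cb;
        out444[3 * (i + 1) + 2] = cr;
    }
}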
A video processing module 156 receives the YCrCb 4:4:4 multiplexed 24-bit stream at 62.5 MHz from the pre-video processor multiplexer 152 and also receives the YCrCb 4:4:4 data as well as the VGA resolution data from the decoder interface 154. This received data is processed by multiplexing the data from the decoder interface 154 into a high-speed serial digital camera stream, performing a 4:4:4 to 4:2:2 conversion, parsing the data stream into three potential outputs (to the display 106, the USB interface 110, and/or the NTSC or video decoder 112), and stacking the display and USB bits into 32-bit wide words.
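The 4:4:4 to 4:2:2 conversion halves the chroma sample rate, reducing the amount of data to carry downstream; averaging each horizontal pair of chroma samples, as assumed below, is one common way to perform it.

#include <stdint.h>
#include <stddef.h>

/* Reduce one line of YCrCb 4:4:4 (Y Cb Cr per pixel) to 4:2:2 (packed
 * Cb Y0 Cr Y1 per pixel pair) by averaging each horizontal pair of
 * chroma samples. n_pixels is assumed to be even. */
static void ycrcb_444_to_422(const uint8_t *in444, uint8_t *out422,
                             size_t n_pixels)
{
    for (size_t i = 0; i < n_pixels; i += 2) {
        uint8_t y0 = in444[3 * i + 0];
        uint8_t y1 = in444[3 * (i + 1) + 0];
        uint8_t cb = (uint8_t)(((int)in444[3 * i + 1] + in444[3 * (i + 1) + 1]) / 2);
        uint8_t cr = (uint8_t)(((int)in444[3 * i + 2] + in444[3 * (i + 1) + 2]) / 2);

        out422[2 * i + 0] = cb;
        out422[2 * i + 1] = y0;
        out422[2 * i + 2] = cr;
        out422[2 * i + 3] = y1;
    }
}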
The video processing module 156 outputs reformatted data to an LCD top module 158 and to a VCR top module 160. The LCD top module 158 performs SDRAM memory control functions and formats the output data to the required output format for display via the display 106. The VCR top module 160 performs SDRAM memory control functions and formats the output data to the required output format for the microprocessor 108 (for USB data) or for the LCD 106 for display. The LCD top module 158 outputs a 16-bit YUV data stream to an LCD processing module 162 and to a microprocessor interface 164. The LCD processing module 162 interfaces to the display 106, formats the data for the display and provides the following functions: YCrCb 4:2:2 to 4:4:4 conversion; color, brightness, contrast, and sharpness adjustment; YCrCb to RGB conversion; RGB 18-bit formatting; and on screen display (such as a menu) insertion. The microprocessor interface 164 stores the system wide control registers 166 and interfaces to the microprocessor 108. The VCR top module 160 outputs a 16-bit YUV data stream to the plurality of encoders 168-172 that format the data to the necessary configuration for the NTSC encoder interfaces such as interfaces or connection plugs 116, 144.
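The color, brightness, and contrast adjustments performed by the LCD processing module 162 act on the YCrCb samples before the YCrCb to RGB conversion; a minimal sketch of brightness, contrast, and saturation control is shown below with assumed Q8.8 fixed-point gains (sharpness, which requires neighboring pixels, is omitted).

#include <stdint.h>

static uint8_t clamp_byte(int v) { return (uint8_t)(v < 0 ? 0 : v > 255 ? 255 : v); }

/* Apply brightness (offset on Y), contrast (gain on Y) and color
 * saturation (gain on Cr/Cb about the 128 midpoint). Gains are Q8.8
 * fixed point, so 256 means unity; the exact controls implemented in
 * the base unit PLD are not specified in the patent. */
static void adjust_ycrcb(uint8_t *y, uint8_t *cr, uint8_t *cb,
                         int brightness, int contrast_q8, int saturation_q8)
{
    *y  = clamp_byte(((int)*y * contrast_q8) / 256 + brightness);
    *cr = clamp_byte(((int)(*cr - 128) * saturation_q8) / 256 + 128);
    *cb = clamp_byte(((int)(*cb - 128) * saturation_q8) / 256 + 128);
}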
The master time base 174 generates the timing for the entire system 10. It has multiple counters on different clocks to control the different system inputs and outputs. The time bases controlled by the master time base 174 include: vertical synchronization to the camera 12 (or to a plurality of cameras), NTSC encoder output, LCD master timing, LCD SDRAM frame synchronization, and NTSC SDRAM frame synchronization. A clock generator 176 generates various clock frequencies for the master time base 174 to operate, while a VCR alarm module 178 instructs a VCR to record under alarm conditions. The VCR alarm module 178 may be connected with the keyboard 138.
In an alternate embodiment, the digital observation system 10 comprises a digital camera and a base unit. The digital camera includes a CCD image sensor memory adapted to store video data, a CCD image sensor adapted to transmit the stored video data in analog RGB color space format native to the CCD image sensor (analog RGB data), wherein the CCD image sensor comprises the CCD image sensor memory, a CDS circuit adapted to receive the analog RGB data from the CCD image sensor and convert the analog RGB data into digital RGB data, wherein the CDS circuit is operably coupled to the CCD image sensor, a PLD adapted to: receive the transmitted digital RGB data from the CDS circuit, convert the received digital RGB data into YUV color space format (digital YUV data), and transmit the digital YUV data to the CDS circuit, wherein the digital camera PLD interfaces to the CDS circuit, and a physical interface adapted to receive the digital YUV data and transmit the digital YUV data with reduced operational overhead and increased operational functionality, wherein the physical interface is operably coupled to the CDS circuit and wherein the digital camera PLD interfaces to the physical interface. The base unit includes a base unit physical interface adapted to receive the digital YUV data with the reduced operational overhead and the increased operational functionality from the digital camera physical interface, a base unit PLD operably coupled to the base unit physical interface, and a display adapted to display the digital YUV data with reduced operational overhead and increased operational functionality, wherein the display is operably coupled to the PLD.
The digital camera PLD is further adapted to control timing signals to the CCD image sensor to transmit the digital YUV data to the CDS circuit, and to delay one line of the digital YUV data with the reduced operational overhead and the increased operational functionality transmission when a transmission error correction occurs. The digital YUV data with the reduced operational overhead and the increased operational functionality is further transmitted from the digital camera physical interface and received by the base unit physical interface via a physical layer device.
In a further alternate embodiment, a digital observation system comprises a first module (for example, a digital camera) and a second module (for example, a base unit). The first module includes a memory adapted to store video data, a first circuit (for example, a CCD image sensor) adapted to transmit the stored video data in a first representation of a first format (for example, analog RGB data), wherein the first circuit comprises the memory, a second circuit (for example, a CDS) adapted to receive the data in the first representation of the first format from the first circuit and convert the data in the first representation of the first format into a second representation of the first format (for example, digital RGB data), wherein the first circuit is operably coupled to the second circuit, a third circuit (for example, a PLD) adapted to control timing signals to the first circuit to transmit the data in the first representation of the first format to the second circuit, wherein the third circuit interfaces to the first circuit and to the second circuit, and a fourth circuit (for example, an interface) adapted to receive the data in the second representation of the first format from the second circuit and transmit the data in the second representation of the first format, wherein the fourth circuit is operably coupled to the second circuit.
The second module includes a first circuit (for example, a base unit interface) adapted to receive the transmitted data in the second representation of the first format, a second circuit (for example, a display), and a third circuit (for example, a base unit PLD) adapted to control timing signals to the first circuit of the second module to transmit the data in the second representation of the first format for display via the second circuit, wherein the third circuit interfaces to the first circuit of the second module and interfaces to the second circuit of the second module.
Referring now to FIG. 6, a method for data processing is presented. The method begins at steps 180 and 182, respectively, with receiving digital RGB data at a first module (for example, a first multiplexer of a two video line delay memory), and storing the data in a second module (for example, a first line delay). At step 184, receiving the data at a third module (for example, a second multiplexer) from the second module occurs (the second module is operably coupled to the first module and the third module). The method proceeds to steps 186 and 188, respectively, where reducing a number of bits per pixel of the data in a fourth module (for example, a data reduction module), and increasing a number of bits per pixel of a first set of the data in a fifth module (for example, a gamma correct module) occur. Determining a cyclic redundancy check (CRC) for a line of the data in a sixth module (for example, a CRC generator) occurs in step 190. The method may further comprise adding the CRC, by the sixth module, to the end of the data line, and transmitting, by a seventh module (for example, a physical interface) to a destination (for example, a base unit): header data and the data line comprising the CRC.
Referring now to FIG. 7, another method for data processing is presented. The method begins at steps 200 and 202, respectively, with a receiving of data (such as image sensor data) by a first module (such as one or more of the FIFO modules 150) from an origination (such as the digital camera 12), and converting, by the first module, the received data to RGB data (such as 24-bit RGB output data). At step 204, multiplexing the RGB data into one bit stream (such as a 24-bit stream) by a second module (such as the pre-video processor multiplexer 152) occurs, and, at step 206, performing a color space conversion from the RGB data to YCrCb data (such as YCrCb 4:4:4 data) by the second module occurs. At step 208 performing a color space conversion from the YCrCb data to another YCrCb data format (such as a conversion from YCrCb 4:4:4 to YCrCb 4:2:2) by a third module (such as the video processing module 156) occurs. Such a conversion reduces the amount of data to be transmitted with insignificant image degradation.
The method proceeds with formatting the other YCrCb data to a required output format by a fourth module (such as the LCD top 158) at step 210, and converting the formatted data to RGB data (such as RGB 4:4:4 data) by a fifth module (such as the LCD processing module 162) at step 212. The fifth module may additionally perform color, sharpness, contrast, and brightness alterations.
The method may further include steps for multiplexing locally received data (such as data received from the decoder 154) and remotely received data (such as the YCrCb 4:4:4 data) into a high speed digital stream by the third module, and formatting the output data to various memory locations (such as memories 134).
Although an exemplary embodiment of the system and method of the present invention has been illustrated in the accompanying drawings and described in the foregoing detailed description, it will be understood that the invention is not limited to the embodiments disclosed, but is capable of numerous rearrangements, modifications, and substitutions without departing from the spirit of the invention as set forth and defined by the following claims. For example, a plurality of cameras 12 and base units 14 may be utilized with the present invention. Further, the camera physical transceiver 24 and the base unit physical transceiver 26 may be operably coupled to each other via other connections, including copper, fiber and wireless, if the transceivers were modified to accommodate such other connections. Additionally, the connection 25 may comprise an entity with dynamic characteristics, thus altering maximum travel time, data transmission travel time, acknowledgement time, line time, and processing time. Also, the system 10 may operate at a varying distance, which will alter the travel time and the time for the system to start and process any transmissions. Further, various data transmission protocols comprising different information and/or sizes of information may be used with the system 10. Additionally, a lesser or greater number of modules may comprise the system 10, the digital camera 12, and/or the base unit 14.

Claims (35)

What we claim is:
1. A digital observation system comprising:
a digital camera including:
a charge coupled device (CCD) image sensor memory adapted to store video data;
a CCD image sensor adapted to transmit the stored video data in analog RGB color space format native to the CCD image sensor (analog RGB data), wherein the CCD image sensor comprises the CCD image sensor memory;
a correlated double sampling (CDS) circuit adapted to receive the analog RGB data from the CCD image sensor and convert the analog RGB data into digital RGB data, wherein the CDS circuit is operably coupled to the CCD image sensor; and
a physical interface adapted to receive the digital RGB data and transmit the digital RGB data with reduced operational overhead and increased operational functionality, wherein the physical interface is operably coupled to the CDS circuit;
a programmable logic device (PLD) adapted to delay one line of the digital RGB data when a transmission error correction occurs; and
a base unit including:
a base unit physical interface adapted to receive the digital RGB data with the reduced operational overhead and the increased operational functionality from the digital camera physical interface;
a base unit programmable logic device (PLD) operably coupled to the base unit physical interface; and
a display adapted to display the digital RGB data with reduced operational overhead and increased operational functionality, wherein the display is operably coupled to the PLD.
2. The digital observation system of claim 1, wherein the base unit PLD is adapted to control timing signals to display the digital RGB data with the reduced operational overhead and the increased operational functionality.
3. The digital observation system of claim 1, wherein the digital camera PLD is adapted to control timing signals to the CCD image sensor to transmit the analog RGB data to the CDS circuit, wherein the digital camera PLD interfaces to the CDS circuit.
4. The digital observation system of claim 1, wherein the digital RGB data with the reduced operational overhead and the increased operational functionality is transmitted from the digital camera physical interface and received by the base unit physical interface via a physical layer device.
5. A digital observation system comprising:
a digital camera including:
a charge coupled device (CCD) image sensor memory adapted to store video data;
a CCD image sensor adapted to transmit the stored video data in analog RGB color space format native to the CCD image sensor (analog RGB data), wherein the CCD image sensor comprises the CCD image sensor memory;
a correlated double sampling (CDS) circuit adapted to receive the analog RGB data from the CCD image sensor and convert the analog RGB data into digital RGB data, wherein the CDS circuit is operably coupled to the CCD image sensor; and
a physical interface adapted to receive the digital RGB data and transmit the digital RGB data with reduced operational overhead and increased operational functionality, wherein the physical interface is operably coupled to the CDS circuit; and
a base unit including:
a base unit physical interface adapted to receive the digital RGB data with the reduced operational overhead and the increased operational functionality from the digital camera physical interface;
a base unit programmable logic device (PLD) operably coupled to the base unit physical interface; and
a display adapted to display the digital RGB data with reduced operational overhead and increased operational functionality, wherein the display is operably coupled to the PLD; and
operational overhead reduction by elimination of at least one of:
error detection;
source addressing; and
destination addressing.
6. A digital observation system comprising:
a digital camera including:
a charge coupled device (CCD) image sensor memory adapted to store video data;
a CCD image sensor adapted to transmit the stored video data in analog RGB color space format native to the CCD image sensor (analog RGB data), wherein the CCD image sensor comprises the CCD image sensor memory;
a correlated double sampling (CDS) circuit adapted to receive the analog RGB data from the CCD image sensor and convert the analog RGB data into digital RGB data, wherein the CDS circuit is operably coupled to the CCD image sensor;
a programmable logic device (PLD) adapted to:
receive the transmitted digital RGB data from the CDS circuit;
convert the received digital RGB data into YUV color space format (digital YUV data);
transmit the digital YUV data to a Y average circuit which determines an average brightness;
transmit the average brightness to an exposure control circuit, which generates an automatic gain control signal; and
transmit the automatic gain control signal back to the CDS circuit; and
a physical interface adapted to receive the digital YUV data and transmit the digital YUV data with reduced operational overhead and increased operational functionality, wherein the physical interface is operably coupled to the CDS circuit and wherein the digital camera PLD interfaces to the physical interface; and
a base unit including:
a base unit physical interface adapted to receive the digital YUV data with the reduced operational overhead and the increased operational functionality from the digital camera physical interface;
a base unit PLD operably coupled to the base unit physical interface; and
a display adapted to display the digital YUV data with reduced operational overhead and increased operational functionality, wherein the display is operably coupled to the base unit PLD.
7. The digital observation system of claim 6, wherein the digital camera PLD is further adapted to control timing signals to the CCD image sensor to transmit the digital YUV data to the CDS circuit.
8. The digital observation system of claim 6, wherein the digital camera PLD is further adapted to delay one line of the digital YUV data with the reduced operational overhead and the increased operational functionality transmission when a transmission error correction occurs.
9. The digital observation system of claim 6, wherein the digital YUV data with reduced operational overhead and increased operational functionality is transmitted from the digital camera physical interface and received by the base unit physical interface via a physical layer device.
10. A digital camera programmable logic device (PLD) configured to:
control timing signals to a CCD image sensor to transmit analog RGB data to a CDS circuit where the analog RGB data is converted to digital RGB data; and
delay one line of the digital RGB data transmission when a transmission error correction occurs;
wherein the digital RGB data transmission comprises reduced operational overhead, with respect to a protocol standard;
wherein the operational overhead includes at least one of:
error detection;
source addressing;
destination addressing; and
multi-speed operation.
11. The digital camera PLD of claim 10 further comprising a two video line delay memory configured to delay the one line of the digital RGB data transmission when the transmission error correction occurs.
12. A digital camera programmable logic device (PLD) configured to:
receive digital RGB data from a correlated double sampling (CDS) circuit;
convert the received digital RGB data into digital YUV data;
transmit the digital YUV data to a Y average circuit which determines the average brightness;
transmit the average brightness to an exposure control circuit, which generates an automatic gain control (AGC) signal; and
transmit the AGC signal to the CDS circuit, wherein the digital camera PLD interfaces to the CDS circuit.
13. A digital observation system comprising:
a digital camera including:
an image sensor memory adapted to store video data;
an image sensor adapted to transmit the stored video data in an analog RGB color space format, wherein the image sensor comprises the image sensor memory;
a circuit adapted to receive the transmitted video data from the image sensor and convert the transmitted video data into digital RGB data, wherein the circuit is operably coupled to the image sensor; and
a physical interface adapted to receive the digital RGB data and transmit the digital RGB data with reduced operational overhead and increased operational functionality, wherein the physical interface is operably coupled to the circuit;
a programmable logic device (PLD) adapted to delay one line of the digital RGB data when a transmission error occurs; and
a base including:
a base physical interface adapted to receive the digital RGB data with the reduced operational overhead and the increased operational functionality from the digital camera physical interface;
a base programmable device operably coupled to the base physical interface; and
a display adapted to display the digital RGB data with reduced operational overhead and increased operational functionality, wherein the display is operably coupled to the base programmable device.
14. The digital observation system of claim 13, wherein the base programmable device is adapted to control timing signals to display the digital RGB data with the reduced operational overhead and the increased operational functionality.
15. The digital observation system of claim 13, wherein the digital camera PLD is adapted to control timing signals to the image sensor to transmit the analog RGB data to the circuit, wherein the digital camera PLD interfaces to the circuit.
16. The digital observation system of claim 13, wherein the digital RGB data with the reduced operational overhead and the increased operational functionality is transmitted from the digital camera physical interface and received by the base physical interface via a physical layer device.
17. The digital observation system of claim 13, wherein the transmitted video data comprises analog RGB data.
18. The digital observation system of claim 13, wherein the image sensor transforms the stored video data into the analog RGB color space format.
19. The digital observation system of claim 13, wherein the image sensor transforms the stored video data into the analog RGB color space format native to the image sensor.
20. The digital observation system of claim 13, wherein the circuit comprises a correlated double sampling circuit.
21. The digital observation system of claim 13, wherein the base includes a processor that is configured to process color filter array data from the digital camera, wherein the processed color filter array data corresponds to a color image.
22. The digital observation system of claim 13, wherein the image sensor is adapted to collect analog color filter array data including values for light intensity of a plurality of different colors.
23. A digital observation system comprising:
a digital camera including:
an image sensor memory adapted to store video data;
an image sensor adapted to transmit the stored video data in analog RGB color space format native to the image sensor, wherein the image sensor comprises the image sensor memory;
a first circuit adapted to receive the transmitted video data from the image sensor and convert the transmitted video data into digital RGB data, wherein the first circuit is operably coupled to the image sensor;
a programmable logic device (PLD) adapted to:
receive the transmitted digital RGB data from the first circuit;
convert the received digital RGB data into digital YUV data;
transmit the digital YUV data to a second circuit which determines an average brightness;
transmit the average brightness to an exposure control circuit, which generates an automatic gain control signal; and
transmit the automatic gain control signal back to the first circuit; and
a physical interface adapted to receive the digital YUV data and transmit the digital YUV data with reduced operational overhead and increased operational functionality, wherein the physical interface is operably coupled to the first circuit and wherein the digital camera PLD interfaces to the physical interface; and
a base including:
a base physical interface adapted to receive the digital YUV data with the reduced operational overhead and the increased operational functionality from the digital camera physical interface;
a base programmable device operably coupled to the base physical interface; and
a display adapted to display the digital YUV data with reduced operational overhead and increased operational functionality, wherein the display is operably coupled to the base programmable device.
24. The digital observation system of claim 23, wherein the digital camera PLD is further adapted to control timing signals to the image sensor to transmit the digital YUV data to the first circuit.
25. The digital observation system of claim 23, wherein the digital camera PLD is further adapted to delay one line of the digital YUV data with the reduced operational overhead and the increased operational functionality transmission when a transmission error correction occurs.
26. The digital observation system of claim 23, wherein the digital YUV data with reduced operational overhead and increased operational functionality is transmitted from the digital camera physical interface and received by the base physical interface via a physical layer device.
27. The digital observation system of claim 23, wherein the first circuit comprises a correlated double sampling circuit.
28. The digital observation system of claim 23, wherein the transmitted video data comprises analog RGB data.
29. The digital observation system of claim 23, wherein the image sensor transforms the stored video data into the analog RGB color space format.
30. The digital observation system of claim 23, wherein the image sensor transforms the stored video data into the analog RGB color space format native to the image sensor.
31. A digital camera programmable logic device (PLD) configured to:
control timing signals to an image sensor to transmit analog RGB data to a circuit, wherein the circuit converts the analog RGB data to digital RGB data; and
delay one line of the digital RGB data transmission when a transmission error correction occurs;
wherein the digital RGB data transmission comprises reduced operational overhead, with respect to a protocol standard;
wherein the operational overhead includes at least one of:
error detection;
source addressing;
destination addressing; and
multi-speed operation.
32. The digital camera PLD of claim 31, wherein the circuit comprises a correlated double sampling circuit.
33. The digital camera PLD of claim 31 comprising a two video line delay memory configured to delay the one line of the digital RGB data transmission when the transmission error correction occurs.
34. A digital camera programmable logic device (PLD) configured to:
receive digital RGB data;
convert the received digital RGB data into digital YUV data;
transmit the digital YUV data to a first circuit which determines the average brightness;
transmit the average brightness to an exposure control circuit, which generates an automatic gain control (AGC) signal; and
transmit the AGC signal to a second circuit that provided the digital RGB data, wherein the digital camera PLD interfaces to the second circuit.
35. The digital camera PLD of claim 34, wherein the second circuit comprises a correlated double sampling circuit.
US13/669,969 2002-07-24 2012-11-06 Digital observation system Expired - Fee Related USRE44924E1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/669,969 USRE44924E1 (en) 2002-07-24 2012-11-06 Digital observation system

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US10/202,283 US7312816B2 (en) 2002-07-24 2002-07-24 Digital observation system
US12/642,698 USRE43786E1 (en) 2002-07-24 2009-12-18 Digital observation system
US13/669,969 USRE44924E1 (en) 2002-07-24 2012-11-06 Digital observation system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/202,283 Reissue US7312816B2 (en) 2002-07-24 2002-07-24 Digital observation system

Publications (1)

Publication Number Publication Date
USRE44924E1 true USRE44924E1 (en) 2014-06-03

Family

ID=30769788

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/202,283 Ceased US7312816B2 (en) 2002-07-24 2002-07-24 Digital observation system
US12/642,698 Expired - Fee Related USRE43786E1 (en) 2002-07-24 2009-12-18 Digital observation system
US13/669,969 Expired - Fee Related USRE44924E1 (en) 2002-07-24 2012-11-06 Digital observation system

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US10/202,283 Ceased US7312816B2 (en) 2002-07-24 2002-07-24 Digital observation system
US12/642,698 Expired - Fee Related USRE43786E1 (en) 2002-07-24 2009-12-18 Digital observation system

Country Status (1)

Country Link
US (3) US7312816B2 (en)

Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10530997B2 (en) 2017-07-13 2020-01-07 Zillow Group, Inc. Connecting and using building interior data acquired from mobile devices
US11632516B2 (en) 2017-07-13 2023-04-18 MFTB Holdco, Inc. Capture, analysis and use of building data from mobile devices
US11165959B2 (en) 2017-07-13 2021-11-02 Zillow, Inc. Connecting and using building data acquired from mobile devices
US10834317B2 (en) 2017-07-13 2020-11-10 Zillow Group, Inc. Connecting and using building data acquired from mobile devices
US11057561B2 (en) 2017-07-13 2021-07-06 Zillow, Inc. Capture, analysis and use of building data from mobile devices
US10643386B2 (en) 2018-04-11 2020-05-05 Zillow Group, Inc. Presenting image transition sequences between viewing locations
US11217019B2 (en) 2018-04-11 2022-01-04 Zillow, Inc. Presenting image transition sequences between viewing locations
US10809066B2 (en) 2018-10-11 2020-10-20 Zillow Group, Inc. Automated mapping information generation from inter-connected images
US11480433B2 (en) 2018-10-11 2022-10-25 Zillow, Inc. Use of automated mapping information from inter-connected images
US11408738B2 (en) 2018-10-11 2022-08-09 Zillow, Inc. Automated mapping information generation from inter-connected images
US11627387B2 (en) 2018-10-11 2023-04-11 MFTB Holdco, Inc. Automated control of image acquisition via use of mobile device interface
US11638069B2 (en) 2018-10-11 2023-04-25 MFTB Holdco, Inc. Automated control of image acquisition via use of mobile device user interface
US11405558B2 (en) 2018-10-11 2022-08-02 Zillow, Inc. Automated control of image acquisition via use of hardware sensors and camera content
US10708507B1 (en) 2018-10-11 2020-07-07 Zillow Group, Inc. Automated control of image acquisition via use of acquisition device sensors
US11284006B2 (en) 2018-10-11 2022-03-22 Zillow, Inc. Automated control of image acquisition via acquisition location determination
US12014120B2 (en) 2019-08-28 2024-06-18 MFTB Holdco, Inc. Automated tools for generating mapping information for buildings
US11243656B2 (en) 2019-08-28 2022-02-08 Zillow, Inc. Automated tools for generating mapping information for buildings
US11164368B2 (en) 2019-10-07 2021-11-02 Zillow, Inc. Providing simulated lighting information for three-dimensional building models
US11823325B2 (en) 2019-10-07 2023-11-21 MFTB Holdco, Inc. Providing simulated lighting information for building models
US11494973B2 (en) 2019-10-28 2022-11-08 Zillow, Inc. Generating floor maps for buildings from automated analysis of visual data of the buildings' interiors
US11164361B2 (en) 2019-10-28 2021-11-02 Zillow, Inc. Generating floor maps for buildings from automated analysis of visual data of the buildings' interiors
US11238652B2 (en) 2019-11-12 2022-02-01 Zillow, Inc. Presenting integrated building information using building models
US10825247B1 (en) 2019-11-12 2020-11-03 Zillow Group, Inc. Presenting integrated building information using three-dimensional building models
US11935196B2 (en) 2019-11-12 2024-03-19 MFTB Holdco, Inc. Presenting building information using building models
US11676344B2 (en) 2019-11-12 2023-06-13 MFTB Holdco, Inc. Presenting building information using building models
US11405549B2 (en) 2020-06-05 2022-08-02 Zillow, Inc. Automated generation on mobile devices of panorama images for building locations and subsequent use
US11514674B2 (en) 2020-09-04 2022-11-29 Zillow, Inc. Automated analysis of image contents to determine the acquisition location of the image
US11592969B2 (en) 2020-10-13 2023-02-28 MFTB Holdco, Inc. Automated tools for generating building mapping information
US11797159B2 (en) 2020-10-13 2023-10-24 MFTB Holdco, Inc. Automated tools for generating building mapping information
US11645781B2 (en) 2020-11-23 2023-05-09 MFTB Holdco, Inc. Automated determination of acquisition locations of acquired building images based on determined surrounding room data
US11481925B1 (en) 2020-11-23 2022-10-25 Zillow, Inc. Automated determination of image acquisition locations in building interiors using determined room shapes
US11632602B2 (en) 2021-01-08 2023-04-18 MFTB Holdco, Inc. Automated determination of image acquisition locations in building interiors using multiple data capture devices
US11252329B1 (en) 2021-01-08 2022-02-15 Zillow, Inc. Automated determination of image acquisition locations in building interiors using multiple data capture devices
US11790648B2 (en) 2021-02-25 2023-10-17 MFTB Holdco, Inc. Automated usability assessment of buildings using visual data of captured in-room images
US11836973B2 (en) 2021-02-25 2023-12-05 MFTB Holdco, Inc. Automated direction of capturing in-room information for use in usability assessment of buildings
US11501492B1 (en) 2021-07-27 2022-11-15 Zillow, Inc. Automated room shape determination using visual data of multiple captured in-room images
US12056900B2 (en) 2021-08-27 2024-08-06 MFTB Holdco, Inc. Automated mapping information generation from analysis of building photos
US11842464B2 (en) 2021-09-22 2023-12-12 MFTB Holdco, Inc. Automated exchange and use of attribute information between building images of multiple types
US12045951B2 (en) 2021-12-28 2024-07-23 MFTB Holdco, Inc. Automated building information determination using inter-image analysis of multiple building images
US11830135B1 (en) 2022-07-13 2023-11-28 MFTB Holdco, Inc. Automated building identification using floor plans and acquired building images

Also Published As

Publication number Publication date
US20040017477A1 (en) 2004-01-29
USRE43786E1 (en) 2012-11-06
US7312816B2 (en) 2007-12-25

Similar Documents

Publication Publication Date Title
USRE44924E1 (en) Digital observation system
US10582151B2 (en) Camera system, video processing apparatus, and camera apparatus
US6895256B2 (en) Optimized camera sensor architecture for a mobile telephone
US7012635B2 (en) Solid state image sensor and video system using the same
US5841471A (en) Timing control for a digitally interfaced camera using variable line readout intervals
EP1359745A4 (en) Screen correcting method and imaging device
US6438587B2 (en) Imaging apparatus and network system using the same
KR20140110194A (en) Image sensor and observing system having the same
US20070236582A1 (en) Video camera with multiple independent outputs
WO2009102174A1 (en) Method for performing digital processing on an image signal output from ccd image sensors
US7463286B2 (en) Image data processing apparatus
JP2009105687A (en) Imaging system
JPH10105696A (en) Film scanner system provided with data interface
JPH08294033A (en) Digital camera
JP3033204B2 (en) TV intercom
US8854490B2 (en) Method and apparatus for compensating a black level of an image signal
US20120257106A1 (en) Video processing apparatus and video processing method
KR20230072220A (en) Integrated board control system
JPH0969980A (en) Camera device
JPH0496582A (en) Picture input device
JP2001359089A (en) Photographing processing unit and digital camera
KR20050004586A (en) Image Processor usable for wireless or wire sender/receiver of arbitrary size of image

Legal Events

Date Code Title Description
AS Assignment

Owner name: STAR CO, TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IMMERSIVE MEDIA OF TEXAS;REEL/FRAME:032424/0070

Effective date: 20140121

FPAY Fee payment

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY